Classify and generate: Using classification latent space representations for image generations

Title:
Classify and generate: Using classification latent space representations for image generations
Journal Title:
Neurocomputing
Publication Date:
02 November 2021
Citation:
Gopalakrishnan, S., Singh, P. R., Yazici, Y., Foo, C.-S., Chandrasekhar, V., & Ambikapathi, A. (2022). Classify and generate: Using classification latent space representations for image generations. Neurocomputing, 471, 296–334. doi:10.1016/j.neucom.2021.10.090
Abstract:
Utilization of classification latent space information for downstream reconstruction and generation is an intriguing and relatively unexplored area. In general, discriminative representations are rich in class-specific features but are too sparse for reconstruction, whereas autoencoder representations are dense but carry limited, largely indistinguishable class-specific features, making them less suitable for classification. In this work, we propose a discriminative modelling framework that employs manipulated supervised latent representations to reconstruct and generate new samples belonging to a given class. Unlike generative modelling approaches such as GANs and VAEs, which aim to model the data manifold distribution, Representation based Generations (ReGene) directly represents the given data manifold in the classification space. Such supervised representations, under certain constraints, allow for reconstructions and controlled generations using an appropriate decoder, without enforcing any prior distribution. Theoretically, we show that, for a given class, these representations retain the same class label when smartly manipulated using convex combinations. Furthermore, they also lead to the generation of novel, visually realistic images. Extensive experiments on datasets of varying resolutions demonstrate that ReGene achieves higher classification accuracy than existing conditional generative models while being competitive in terms of FID.
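
The abstract describes generating same-class samples by taking convex combinations of a classifier's latent representations and decoding them. Below is a minimal sketch of that idea, not the authors' implementation: the module names (ClassifierEncoder, Decoder), architectures, and dimensions are hypothetical placeholders, standing in for a trained classifier's feature extractor and a decoder trained to reconstruct images from those features.

```python
# Sketch of the ReGene-style generation step described in the abstract.
# Assumptions: ClassifierEncoder stands in for the penultimate layer of a
# trained classifier; Decoder stands in for a decoder trained to map those
# latents back to images. Neither is the paper's actual architecture.
import torch
import torch.nn as nn

class ClassifierEncoder(nn.Module):
    """Placeholder for a classifier's feature extractor (penultimate layer)."""
    def __init__(self, latent_dim=128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),   # 32x32 -> 16x16
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # 16x16 -> 8x8
            nn.Flatten(),
            nn.Linear(32 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.features(x)

class Decoder(nn.Module):
    """Placeholder for a decoder that maps classification latents to images."""
    def __init__(self, latent_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 32 * 8 * 8), nn.ReLU(),
            nn.Unflatten(1, (32, 8, 8)),
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),  # 8x8 -> 16x16
            nn.ConvTranspose2d(16, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 16x16 -> 32x32
        )

    def forward(self, z):
        return self.net(z)

encoder, decoder = ClassifierEncoder(), Decoder()

# Two images assumed to belong to the same class (random tensors here).
x1, x2 = torch.rand(1, 3, 32, 32), torch.rand(1, 3, 32, 32)
z1, z2 = encoder(x1), encoder(x2)

# Convex combination of same-class latents: per the paper, under certain
# constraints such combinations stay within the class region of the
# classification space, so decoding yields a novel sample of that class.
alpha = 0.6
z_new = alpha * z1 + (1 - alpha) * z2

x_new = decoder(z_new)  # candidate generated image for the shared class
print(x_new.shape)      # torch.Size([1, 3, 32, 32])
```

Note that, unlike a VAE, no prior distribution is imposed on the latents here; the generation mechanism is purely the convex combination of supervised representations, which is the distinction the abstract draws against GANs and VAEs.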
License type:
Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0)
Funding Info:
There was no specific funding for this research.
ISSN:
0925-2312