Attribute Manipulation Generative Adversarial Networks for Fashion Images

Title:
Attribute Manipulation Generative Adversarial Networks for Fashion Images
Conference Name:
International Conference on Computer Vision (ICCV)
Publication Date:
27 October 2019
Citation:
Abstract:
Recent advances in Generative Adversarial Networks (GANs) have made it possible to conduct multi-domain image-to-image translation with a single generative network. While recent methods such as GANimation and SaGAN can restrict translations to attribute-relevant regions using attention, they do not perform well as the number of attributes grows, since the training of their attention masks relies mostly on classification losses. To address this and other limitations, we introduce Attribute Manipulation Generative Adversarial Networks (AMGAN) for fashion images. AMGAN's generator network uses class activation maps (CAMs) to empower its attention mechanism, and it also exploits perceptual losses by assigning reference (target) images based on attribute similarities. AMGAN incorporates an additional discriminator network that focuses on attribute-relevant regions to detect unrealistic translations. Additionally, AMGAN can be directed to perform attribute manipulations on specific regions, such as the sleeve or torso. Experiments show that AMGAN outperforms state-of-the-art methods on traditional evaluation metrics as well as an alternative metric based on image retrieval.
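To make the CAM-based attention idea in the abstract concrete, the following is a minimal PyTorch sketch of how a class activation map from an auxiliary attribute classifier can be turned into a spatial mask that gates an image translation. All class names, shapes, and the min-max normalization scheme here are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CAMAttention(nn.Module):
    """Hypothetical sketch: derive a spatial attention mask from the
    class activation map (CAM) of an auxiliary attribute classifier.
    Names and shapes are illustrative, not AMGAN's actual code."""

    def __init__(self, in_channels: int, num_attributes: int):
        super().__init__()
        # A 1x1 conv classifier head; its per-attribute weights also
        # serve as the CAM projection (Zhou et al., CVPR 2016).
        self.classifier = nn.Conv2d(in_channels, num_attributes, kernel_size=1)

    def forward(self, features: torch.Tensor, attr_idx: torch.Tensor) -> torch.Tensor:
        # features: (B, C, H, W) encoder features; attr_idx: (B,) target attribute.
        logits_map = self.classifier(features)                      # (B, A, H, W)
        cam = logits_map[torch.arange(features.size(0)), attr_idx]  # (B, H, W)
        # Min-max normalize each CAM to [0, 1] so it can act as a soft mask.
        cam = cam - cam.amin(dim=(1, 2), keepdim=True)
        cam = cam / (cam.amax(dim=(1, 2), keepdim=True) + 1e-8)
        return cam.unsqueeze(1)                                     # (B, 1, H, W)

# Usage sketch: blend the generator's translated image back into the
# input so that only attribute-relevant regions change.
#   mask = F.interpolate(cam_mask, size=image.shape[-2:], mode="bilinear")
#   output = mask * translated + (1 - mask) * image
```

A mask of this form can also be used to crop or weight the regions fed to the attribute-focused discriminator described in the abstract; that pairing is again an assumption about the general technique, not a reconstruction of AMGAN itself.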
License type:
http://creativecommons.org/licenses/by-nc-nd/4.0/
Funding Info:
There is no specific funding for this work.
Description:
“© 2020 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.”
ISSN:
2380-7504
ISBN:
978-1-7281-4803-8
Files uploaded:

iccv-2019-final.pdf (1.08 MB, PDF)