Generative adversarial networks (GANs) are designed to learn a single data distribution; multiple distributions can be modeled by training a separate GAN for each, but this naive approach ignores any overlap between the distributions. We propose Mixed Membership Generative Adversarial Networks (MMGAN), analogous to mixed-membership models, which model multiple distributions jointly and discover their commonalities and particularities. Each data distribution is modeled as a mixture over a common set of generator distributions, and the mixture weights are learned automatically from the data. These weights give insight into the common and unique features of each data distribution. We evaluate the proposed MMGAN and demonstrate its effectiveness on MNIST and Fashion-MNIST under various settings.
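To make the mixture structure concrete, the following is a minimal sketch (not the authors' implementation) of the idea described above: each data distribution d draws samples from K shared generators according to learnable mixture weights; the names K, D, pi_logits, and sample_mixture are illustrative assumptions, and the adversarial training loop is omitted.

```python
import torch
import torch.nn as nn

K = 3        # number of shared generators (assumed)
D = 2        # number of data distributions (assumed)
LATENT = 64  # latent dimension (assumed)

class Generator(nn.Module):
    """A small MLP generator; each of the K generators is shared across distributions."""
    def __init__(self, latent=LATENT, out_dim=784):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent, 256), nn.ReLU(),
            nn.Linear(256, out_dim), nn.Tanh(),
        )

    def forward(self, z):
        return self.net(z)

generators = nn.ModuleList(Generator() for _ in range(K))
# One row of mixture logits per data distribution; a softmax yields the mixture weights.
pi_logits = nn.Parameter(torch.zeros(D, K))

def sample_mixture(dist_idx, batch_size):
    """Draw a batch for data distribution `dist_idx` from the generator mixture.
    Hard component sampling is used here for clarity; it is not differentiable
    w.r.t. the mixture logits, so this is a forward-sampling sketch only."""
    weights = torch.softmax(pi_logits[dist_idx], dim=0)              # (K,)
    comp = torch.multinomial(weights, batch_size, replacement=True)  # (B,) component ids
    z = torch.randn(batch_size, LATENT)
    samples = torch.stack([generators[int(k)](z[i]) for i, k in enumerate(comp)])
    return samples, weights

fake, weights = sample_mixture(dist_idx=0, batch_size=8)
print(fake.shape, weights.detach())  # torch.Size([8, 784]) and the current mixture weights
```

In the sketch, inspecting the learned rows of `pi_logits` after training would indicate which shared generators (i.e., which features) each data distribution relies on, which is the source of the interpretability claim in the abstract.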
Funding: This research/project is supported by the A*STAR AME Programmatic Funds (Grant Reference No. A20H6b0151).