Latent Dirichlet Allocation in Generative Adversarial Networks
We study the problem of multimodal generative modelling of images based on
generative adversarial networks (GANs). Despite the success of existing
methods, they often ignore the underlying structure of visual data and its
multimodal generation characteristics. To address this, we introduce
the Dirichlet prior for multimodal image generation, which leads to a new
Latent Dirichlet Allocation-based GAN (LDAGAN). Specifically, to model the
generative process, LDAGAN assigns each sample a generative mode that
determines which generative sub-process it belongs to. For the adversarial
training, LDAGAN derives a variational expectation-maximization (VEM) algorithm
to estimate the model parameters. Experiments on real-world datasets
demonstrate that LDAGAN outperforms existing GANs.
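
Below is a minimal sketch of the mode-conditioned generative process the abstract describes, assuming a fixed number of sub-generators and a symmetric Dirichlet prior. The mode count, network sizes, and all names are illustrative rather than taken from the paper, and the VEM-based parameter estimation is omitted:

```python
import torch
import torch.nn as nn

# Illustrative sizes; not taken from the paper.
NUM_MODES, LATENT_DIM, DATA_DIM = 4, 64, 784

class ModeConditionedGenerator(nn.Module):
    """One sub-generator per mode; a sample's mode selects its sub-process."""
    def __init__(self):
        super().__init__()
        self.subnets = nn.ModuleList(
            nn.Sequential(nn.Linear(LATENT_DIM, 256), nn.ReLU(),
                          nn.Linear(256, DATA_DIM), nn.Tanh())
            for _ in range(NUM_MODES)
        )

    def forward(self, noise, modes):
        # Route each sample through the sub-generator for its assigned mode.
        out = torch.empty(noise.size(0), DATA_DIM)
        for k in range(NUM_MODES):
            mask = modes == k
            if mask.any():
                out[mask] = self.subnets[k](noise[mask])
        return out

# Generative process: mixing proportions from a Dirichlet prior, then a
# per-sample mode drawn from those proportions (cf. LDA's topic assignment).
alpha = torch.ones(NUM_MODES)                        # symmetric Dirichlet
pi = torch.distributions.Dirichlet(alpha).sample()   # mode proportions
modes = torch.distributions.Categorical(pi).sample((32,))
fake = ModeConditionedGenerator()(torch.randn(32, LATENT_DIM), modes)
```

In this reading, the Dirichlet-Categorical hierarchy plays the role that LDA's topic model plays for documents: mixing proportions are drawn once, and each sample's mode picks the sub-process that generates it.
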
A Prior of a Googol Gaussians: a Tensor Ring Induced Prior for Generative Models
Generative models produce realistic objects in many domains, including text,
image, video, and audio synthesis. The most popular models, Generative
Adversarial Networks (GANs) and Variational Autoencoders (VAEs), usually
employ a standard Gaussian distribution as a prior. Previous work shows that
a richer family of prior distributions can help avoid mode collapse in GANs
and improve the evidence lower bound in VAEs. We propose a new family of
prior distributions, the Tensor Ring Induced Prior (TRIP), which packs an
exponential
number of Gaussians into a high-dimensional lattice with a relatively small
number of parameters. We show that these priors improve the Fréchet Inception
Distance for GANs and the evidence lower bound for VAEs. We also study generative
models with TRIP in the conditional generation setup with missing conditions.
Altogether, we propose a plug-and-play framework for generative models
that can be used in any GAN- or VAE-like architecture.

Comment: NeurIPS 2019; GitHub: https://github.com/insilicomedicine/TRI
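
To make "an exponential number of Gaussians with few parameters" concrete, here is a small numpy sketch of a tensor-ring-parameterized mixture weight over a lattice of Gaussian means. The ranks, lattice sizes, squared-trace non-negativity trick, and all names are assumptions for illustration, not the paper's exact parameterization:

```python
import numpy as np

# Illustrative sizes; the paper's ranks and lattice are not specified here.
D, M, R = 8, 10, 4   # latent dims, lattice points per dim, TR rank

rng = np.random.default_rng(0)
cores = [rng.normal(size=(M, R, R)) for _ in range(D)]  # TR cores G_j[s_j]
means = np.linspace(-3.0, 3.0, M)   # shared per-dimension lattice of means

def tr_weight(indices):
    """Unnormalized mixture weight w(s_1,...,s_D) = Tr(G_1[s_1] ... G_D[s_D])**2.

    Squaring the trace is one simple way to keep weights non-negative;
    the paper's exact parameterization may differ.
    """
    mat = np.eye(R)
    for core, s in zip(cores, indices):
        mat = mat @ core[s]
    return np.trace(mat) ** 2

def log_component(z, indices, sigma=0.5):
    """Log-density (up to constants) of one mixture component: an isotropic
    Gaussian whose mean is the lattice point picked out by `indices`."""
    mu = means[list(indices)]
    return -0.5 * np.sum((z - mu) ** 2) / sigma ** 2

# The full prior mixes M**D Gaussians (10**8 here; far more for larger M, D)
# while storing only D * M * R * R core parameters.
z = rng.normal(size=D)
print(tr_weight((0,) * D), log_component(z, (0,) * D))
```

Because the weights factor through a chain of small matrices, the normalizer and per-dimension marginals can be computed by contracting one core at a time instead of enumerating all M**D lattice points, which is what makes likelihood evaluation and sampling over such a large mixture tractable.
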