Flow-GAN: Combining Maximum Likelihood and Adversarial Learning in Generative Models
Adversarial learning of probabilistic models has recently emerged as a
promising alternative to maximum likelihood. Implicit models such as generative
adversarial networks (GAN) often generate better samples compared to explicit
models trained by maximum likelihood. Yet, GANs sidestep the characterization
of an explicit density which makes quantitative evaluations challenging. To
bridge this gap, we propose Flow-GANs, a generative adversarial network for
which we can perform exact likelihood evaluation, thus supporting both
adversarial and maximum likelihood training. When trained adversarially,
Flow-GANs generate high-quality samples but attain extremely poor
log-likelihood scores, inferior even to a mixture model memorizing the training
data; the opposite is true when trained by maximum likelihood. Results on MNIST
and CIFAR-10 demonstrate that hybrid training can attain high held-out
likelihoods while retaining visual fidelity in the generated samples.

Comment: AAAI 201
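The abstract's key point is that an invertible (flow-based) generator admits exact likelihood evaluation via the change-of-variables formula, which is what lets a single model be trained adversarially, by maximum likelihood, or by a hybrid of the two. The following is a minimal sketch of that idea, not the paper's implementation: a one-dimensional affine "flow" g(z) = a*z + b with a standard-normal prior, whose exact log-likelihood is combined with a placeholder adversarial term (`adv_loss` and the weight `lam` are hypothetical names introduced here for illustration).

```python
import numpy as np

def flow_log_likelihood(x, a, b):
    """Exact log p(x) for the invertible map g(z) = a*z + b
    with a standard-normal prior, via change of variables."""
    z = (x - b) / a                                  # inverse map g^{-1}(x)
    log_prior = -0.5 * (z ** 2 + np.log(2 * np.pi))  # log N(z; 0, 1)
    log_det = -np.log(np.abs(a))                     # log |d g^{-1}/dx|
    return log_prior + log_det

def hybrid_loss(x, a, b, adv_loss, lam=1.0):
    """Hybrid objective: an adversarial term plus a lambda-weighted
    negative log-likelihood, as in the hybrid training the abstract
    describes. `adv_loss` stands in for a discriminator-based loss."""
    nll = -flow_log_likelihood(x, a, b).mean()
    return adv_loss + lam * nll
```

Because the generator is invertible, no variational bound or density estimator is needed: held-out log-likelihood can be computed exactly, which is how the paper compares adversarial, maximum-likelihood, and hybrid training quantitatively.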