Despite their success, generative adversarial networks (GANs) still suffer from
mode collapse, i.e., the generator maps latent variables to only a partial
set of modes of the target distribution. In this paper, we analyze and seek to
regularize this issue from an independent and identically distributed (IID)
sampling perspective, emphasizing that generation which preserves the IID
property with respect to the target distribution naturally avoids mode collapse.
This view rests on the basic IID assumption for real data in machine learning.
However, although the source samples {z} are IID, the generations {G(z)} are not
necessarily IID samples from the target distribution. Based on this
observation, and exploiting a necessary condition of IID generation, namely
that the inverse samples of the target data should also be IID under the source
distribution, we propose a new loss that encourages closeness between the
inverse samples of real data and the Gaussian source in latent space, thereby
regularizing the generation to be IID from the target distribution. Experiments
on both synthetic and
real-world data show the effectiveness of our model.
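To make the latent-space closeness penalty concrete, below is a minimal
PyTorch-style sketch. The inverse network e_net, the moment-matching form of
the penalty, and all names here are illustrative assumptions for exposition,
not the paper's exact loss.

```python
import torch

def iid_closeness_loss(e_net, x_real):
    """Illustrative penalty: push inverse samples E(x) of real data toward
    the standard Gaussian source via simple moment matching.
    (An assumed stand-in; the paper's actual closeness loss may differ.)"""
    z_inv = e_net(x_real)                    # inverse samples in latent space
    mu = z_inv.mean(dim=0)                   # empirical per-dimension mean
    var = z_inv.var(dim=0, unbiased=False)   # empirical per-dimension variance
    # N(0, I) target: zero mean and unit variance in every latent dimension
    return (mu ** 2).sum() + ((var - 1.0) ** 2).sum()

# Usage sketch: the penalty is added to the usual generator objective,
# weighted by a hypothetical coefficient lam:
#   g_loss = adversarial_loss(d_net(g_net(z))) \
#            + lam * iid_closeness_loss(e_net, x_real)
```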