A Unified GAN Framework Regarding Manifold Alignment for Remote Sensing Images Generation

Abstract

Generative Adversarial Networks (GANs) and their variants have achieved remarkable success on natural images. However, their performance degrades when applied to remote sensing (RS) images, and the discriminator often suffers from overfitting. In this paper, we examine the differences between natural and RS images and find that the intrinsic dimensions of RS images are much lower than those of natural images. Because the discriminator is more susceptible to overfitting on data with lower intrinsic dimension, it focuses excessively on local characteristics of the RS training data and disregards the overall structure of the distribution, leading to a faulty generative model. In response, we propose a novel approach that leverages the real data manifold to constrain the discriminator and enhance model performance. Specifically, we introduce a learnable information-theoretic measure to capture the real data manifold. Building upon this measure, we propose manifold alignment regularization, which mitigates the discriminator's overfitting and improves the quality of generated samples. Moreover, we establish a unified GAN framework for manifold alignment, applicable to both supervised and unsupervised RS image generation tasks.
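The abstract does not give the exact form of the manifold alignment regularizer, so the following is only a minimal sketch of the general idea: adding a penalty to the discriminator loss that aligns statistics of real and generated feature batches, discouraging the discriminator from overfitting to local details. The network architecture, the `manifold_alignment_penalty` function, and the weight `lam` are illustrative assumptions, not the paper's method.

```python
# Hypothetical sketch: a GAN discriminator loss with an added
# manifold-alignment style regularizer (assumed form, not the paper's).
import torch
import torch.nn as nn


class Discriminator(nn.Module):
    def __init__(self, img_channels=3, feat_dim=128):
        super().__init__()
        # Convolutional feature extractor followed by a real/fake head.
        self.features = nn.Sequential(
            nn.Conv2d(img_channels, 64, 4, stride=2, padding=1),
            nn.LeakyReLU(0.2),
            nn.Conv2d(64, feat_dim, 4, stride=2, padding=1),
            nn.LeakyReLU(0.2),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
        )
        self.head = nn.Linear(feat_dim, 1)

    def forward(self, x):
        f = self.features(x)  # intermediate features used for the alignment term
        return self.head(f), f


def manifold_alignment_penalty(feat_real, feat_fake):
    # Assumed regularizer: penalize the gap between first- and second-order
    # statistics of real vs. generated feature batches.
    mu_r, mu_f = feat_real.mean(0), feat_fake.mean(0)
    cov_r, cov_f = torch.cov(feat_real.T), torch.cov(feat_fake.T)
    return (mu_r - mu_f).pow(2).sum() + (cov_r - cov_f).pow(2).sum()


def discriminator_loss(disc, real, fake, lam=0.1):
    # Standard non-saturating adversarial loss plus the alignment penalty.
    bce = nn.BCEWithLogitsLoss()
    logits_r, feat_r = disc(real)
    logits_f, feat_f = disc(fake.detach())
    adv = bce(logits_r, torch.ones_like(logits_r)) + \
          bce(logits_f, torch.zeros_like(logits_f))
    return adv + lam * manifold_alignment_penalty(feat_r, feat_f)
```

In this sketch the penalty is computed on intermediate discriminator features rather than logits, reflecting the abstract's idea of constraining the discriminator with the structure of the real data manifold rather than with individual samples.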
