Quantum machine learning (QML) has received increasing attention due to its
potential to outperform classical machine learning methods in problems
pertaining to classification and identification tasks. One subclass of QML
methods is quantum generative adversarial networks (QGANs), which have been studied as a
quantum counterpart of classical GANs widely used in image manipulation and
generation tasks. Existing work on QGANs, however, remains limited to small-scale
proof-of-concept examples based on heavily downscaled images. Here we
integrate classical and quantum techniques to propose a new hybrid
quantum-classical GAN framework. We demonstrate its superior learning
capabilities by generating 28×28-pixel grey-scale images, without
dimensionality reduction or classical pre/post-processing, for multiple classes
of the standard MNIST and Fashion MNIST datasets, achieving results comparable
to those of classical frameworks with three orders of magnitude fewer trainable
generator parameters. To gain further insight into the working of our hybrid
approach, we systematically explore the impact of its parameter space by
varying the number of qubits, the size and shape of the image patches, the
number of generator layers, and the choice of prior distribution. Our results
show that increasing the quantum generator size
generally improves the learning capability of the network. The developed
framework provides a foundation for the future design of QGANs with optimal
parameter sets tailored to complex image generation tasks.