In this work, we present and study Continuous Generative Neural Networks
(CGNNs), namely, generative models in the continuous setting: the output of a
CGNN belongs to an infinite-dimensional function space. The architecture is
inspired by DCGAN, with one fully connected layer, several convolutional layers
and nonlinear activation functions. In the continuous L2 setting, the
dimensions of the spaces of each layer are replaced by the scales of a
multiresolution analysis of a compactly supported wavelet. We present
conditions on the convolutional filters and on the nonlinearity that guarantee
that a CGNN is injective. This theory finds applications to inverse problems
and allows us to derive Lipschitz stability estimates for (possibly nonlinear)
infinite-dimensional inverse problems with unknowns belonging to the manifold
generated by a CGNN. Several numerical simulations, including signal
deblurring, illustrate and validate this approach.
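
To make the architecture concrete, the following is a minimal sketch of the discrete, DCGAN-like generator that a CGNN generalizes: one fully connected layer followed by convolutional upsampling layers with nonlinearities, where each doubling of the resolution plays the role of passing to a finer scale of the multiresolution analysis. All names, layer sizes, and the use of PyTorch are illustrative assumptions, not the paper's implementation.

```python
# Illustrative discrete analogue of a CGNN generator (assumed PyTorch API).
import torch
import torch.nn as nn

class DiscreteGenerator(nn.Module):
    def __init__(self, latent_dim=64, base_len=16, channels=32, n_scales=3):
        super().__init__()
        # One fully connected layer maps the latent code to a coarse signal.
        self.fc = nn.Linear(latent_dim, channels * base_len)
        self.channels, self.base_len = channels, base_len
        # Each transposed convolution doubles the resolution, mimicking the
        # passage to a finer scale of a multiresolution analysis.
        layers = []
        for _ in range(n_scales):
            layers += [
                nn.ConvTranspose1d(channels, channels, kernel_size=4,
                                   stride=2, padding=1),
                nn.LeakyReLU(0.2),
            ]
        layers.append(nn.Conv1d(channels, 1, kernel_size=3, padding=1))
        self.net = nn.Sequential(*layers)

    def forward(self, z):
        x = self.fc(z).view(-1, self.channels, self.base_len)
        return self.net(x)

g = DiscreteGenerator()
signal = g(torch.randn(8, 64))  # shape (8, 1, 128): 16 * 2**3 samples
```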
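The Lipschitz stability estimates mentioned above can be written schematically as follows; this is an illustrative formulation under generic assumptions, not the paper's exact statement, and the constant and norms depend on the forward map and the network:
\[
\|u_1 - u_2\|_{L^2} \le C\, \|F(u_1) - F(u_2)\|_{Y}
\qquad \text{for all } u_1, u_2 \in \mathcal{M} = \{G(z) : z \in \mathbb{R}^S\},
\]
where $F$ is the (possibly nonlinear) forward operator of the inverse problem, $Y$ is the data space, and $\mathcal{M}$ is the manifold generated by an injective CGNN $G$ with latent space $\mathbb{R}^S$.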