In this paper we study algorithms to find a Gaussian approximation to a
target measure defined on a Hilbert space of functions; the target measure
itself is defined via its density with respect to a reference Gaussian measure.
We employ the Kullback-Leibler divergence as the measure of discrepancy and
find the best Gaussian approximation by minimizing this divergence. It then
follows that the approximating Gaussian must be equivalent to the Gaussian
reference measure, defining a natural function space setting for the
underlying calculus of variations problem.
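To fix ideas we record the objective in a standard form (the notation $\Phi$, $Z$, $\nu$ is ours and is assumed here, not quoted from the paper): writing $\frac{d\mu}{d\mu_0}(u) = Z^{-1}\exp(-\Phi(u))$ for the target $\mu$ relative to the Gaussian reference $\mu_0$, any Gaussian $\nu$ equivalent to $\mu_0$ satisfies, by the chain rule for the Kullback-Leibler divergence,
\[
D_{\mathrm{KL}}(\nu\,\|\,\mu) \;=\; \mathbb{E}^{\nu}\big[\Phi(u)\big] + D_{\mathrm{KL}}(\nu\,\|\,\mu_0) + \log Z,
\]
so the minimization balances the expected potential against deviation from the reference, and the additive constant $\log Z$ does not affect the minimizer.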
We introduce a computational algorithm well-adapted to the required
minimization, seeking the mean as a
function and parameterizing the covariance in two different ways: through
low-rank perturbations of the reference covariance; and through Schr\"odinger
potential perturbations of the inverse reference covariance.
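As a purely illustrative, finite-dimensional sketch of the first parameterization (all names here, including the toy potential \texttt{Phi}, the fixed directions \texttt{V}, and the \texttt{exp(s)} positivity device, are our own assumptions, not the paper's implementation), one can discretize and minimize a Monte Carlo estimate of the objective above:
\begin{verbatim}
import numpy as np
from scipy.optimize import minimize

def gaussian_kl(m, C, m0, C0):
    # Closed-form KL( N(m, C) || N(m0, C0) ) in R^n.
    n = len(m)
    C0inv = np.linalg.inv(C0)
    dm = m0 - m
    _, logdet_C = np.linalg.slogdet(C)
    _, logdet_C0 = np.linalg.slogdet(C0)
    return 0.5 * (np.trace(C0inv @ C) + dm @ C0inv @ dm
                  - n + logdet_C0 - logdet_C)

n, r = 4, 2
m0, C0 = np.zeros(n), np.diag(1.0 / np.arange(1, n + 1) ** 2)
V = np.eye(n)[:, :r]                            # fixed low-rank directions (a choice)
Phi = lambda u: 0.5 * np.sum(np.tanh(u) ** 2)   # toy potential (hypothetical)

def objective(theta, n_samples=500):
    # KL(nu || mu) up to log Z: E^nu[Phi] (Monte Carlo, fixed seed so the
    # objective is deterministic) plus KL(nu || mu0) (closed form).
    m, s = theta[:n], theta[n:]
    C = C0 + V @ np.diag(np.exp(s)) @ V.T       # low-rank PD perturbation of C0
    rng = np.random.default_rng(0)
    samples = rng.multivariate_normal(m, C, size=n_samples)
    return np.mean([Phi(u) for u in samples]) + gaussian_kl(m, C, m0, C0)

res = minimize(objective, np.zeros(n + r), method="Nelder-Mead")
m_opt = res.x[:n]
C_opt = C0 + V @ np.diag(np.exp(res.x[n:])) @ V.T
\end{verbatim}
The fixed seed makes the noisy objective deterministic so that a derivative-free optimizer can be applied; the paper's actual algorithm operates on function space rather than on a fixed discretization.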
Two applications are presented: a nonlinear inverse problem arising from an
elliptic PDE, and a conditioned diffusion process. We also show how the Gaussian approximations we
obtain may be used to produce improved pCN-MCMC methods which not only are
well-adapted to the high-dimensional setting but also behave well with respect
to small observational noise (resp. small temperatures) in the inverse problem
(resp. the conditioned diffusion).
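Finally, a finite-dimensional sketch of how such a Gaussian approximation might precondition pCN (again our own reconstruction, not the paper's code: the correction \texttt{phi\_nu} and all names are assumptions). A pCN proposal reversible with respect to $N(m, C)$ is accepted with a ratio that accounts for the change of reference measure:
\begin{verbatim}
import numpy as np
from scipy.stats import multivariate_normal as mvn

def pcn_step(u, beta, m, C, m0, C0, Phi, rng):
    # Potential of the target relative to N(m, C): absorbs the density
    # ratio between the approximation and the original reference N(m0, C0).
    def phi_nu(x):
        return Phi(x) + mvn.logpdf(x, m, C) - mvn.logpdf(x, m0, C0)
    # pCN proposal, reversible with respect to N(m, C).
    xi = rng.multivariate_normal(np.zeros(len(m)), C)
    v = m + np.sqrt(1.0 - beta ** 2) * (u - m) + beta * xi
    # Metropolis-Hastings acceptance for the target.
    if np.log(rng.uniform()) < phi_nu(u) - phi_nu(v):
        return v
    return u

# Toy usage (all values illustrative; m, C stand in for the optimized pair).
n = 4
m0, C0 = np.zeros(n), np.diag(1.0 / np.arange(1, n + 1) ** 2)
m, C = 0.1 * np.ones(n), C0
Phi = lambda x: 0.5 * np.sum(np.tanh(x) ** 2)
rng = np.random.default_rng(1)
u = m0.copy()
for _ in range(1000):
    u = pcn_step(u, 0.3, m, C, m0, C0, Phi, rng)
\end{verbatim}
The intuition is that when $N(m, C)$ is close to the target, the relative potential is nearly constant, so acceptance rates remain high even as the observational noise (resp. temperature) becomes small.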