Gaussian processes (GPs), implemented in practice as multivariate Gaussian distributions over a finite collection of data points, are the most popular approach in
small-area spatial statistical modelling. In this context they are used to
encode correlation structures over space and can generalise well in
interpolation tasks. Despite their flexibility, off-the-shelf GPs present serious computational challenges: exact inference requires factorising a dense covariance matrix at a cost that grows cubically with the number of observations, limiting their scalability and practical usefulness in applied settings. Here, we propose a novel deep generative
modelling approach to tackle this challenge, termed PriorVAE: for a particular
spatial setting, we approximate a class of GP priors through prior sampling and
subsequent fitting of a variational autoencoder (VAE). Given a trained VAE, the resulting decoder makes spatial inference highly efficient, because the VAE represents the spatial field through a low-dimensional latent space with independent Gaussian coordinates. Once the VAE is trained, its decoder replaces the GP prior within a Bayesian sampling framework. This approach provides
tractable and easy-to-implement means of approximately encoding spatial priors
and facilitates efficient statistical inference. We demonstrate the utility of our two-stage VAE approach on Bayesian small-area estimation tasks.
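As an illustration of the two-stage idea described above (a minimal sketch, not the authors' implementation), the following JAX code draws samples from a GP prior on a fixed one-dimensional grid, trains a small VAE on those draws, and indicates where the trained decoder would replace the GP inside a Bayesian sampler. All architecture choices, dimensions, and hyperparameters here are hypothetical values chosen for the example.

```python
import jax
import jax.numpy as jnp

key = jax.random.PRNGKey(0)

# Stage 1a: draws from a GP prior on a fixed 1-D grid
# (squared-exponential kernel; lengthscale is an assumed value).
n, lengthscale = 50, 0.2
x = jnp.linspace(0.0, 1.0, n)
K = jnp.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / lengthscale**2)
L = jnp.linalg.cholesky(K + 1e-6 * jnp.eye(n))

def gp_draws(key, n_draws):
    # Each row is one realisation of the GP prior evaluated on the grid.
    return jax.random.normal(key, (n_draws, n)) @ L.T

train = gp_draws(key, 5000)

# Stage 1b: a tiny VAE with one hidden layer on each side, latent dim d.
d, h = 10, 64

def init(key):
    ks = jax.random.split(key, 5)
    w = lambda k, a, b: 0.1 * jax.random.normal(k, (a, b))
    return dict(enc=w(ks[0], n, h), mu=w(ks[1], h, d), logsig=w(ks[2], h, d),
                dec1=w(ks[3], d, h), dec2=w(ks[4], h, n))

def decode(p, z):
    return jnp.tanh(z @ p["dec1"]) @ p["dec2"]

def loss(p, key, batch):
    # Negative ELBO: Gaussian reconstruction term plus KL(q(z|f) || N(0, I)).
    hid = jnp.tanh(batch @ p["enc"])
    mu, logsig = hid @ p["mu"], hid @ p["logsig"]
    z = mu + jnp.exp(logsig) * jax.random.normal(key, mu.shape)
    recon = jnp.sum((batch - decode(p, z)) ** 2)
    kl = 0.5 * jnp.sum(mu**2 + jnp.exp(2 * logsig) - 2 * logsig - 1.0)
    return (recon + kl) / batch.shape[0]

@jax.jit
def step(p, key, batch):
    # Plain gradient descent for brevity; a real run would use an optimiser.
    g = jax.grad(loss)(p, key, batch)
    return jax.tree_util.tree_map(lambda wi, gi: wi - 1e-3 * gi, p, g)

params = init(jax.random.PRNGKey(1))
for _ in range(2000):
    key, sub = jax.random.split(key)
    params = step(params, sub, train)

# Stage 2 (schematic): in the Bayesian model, the spatial field receives the
# prior f = decode(params, z) with z ~ N(0, I_d), so the sampler works in the
# d latent dimensions and never touches the O(n^3) GP covariance algebra.
```

The decoder here is deliberately small; the point of the sketch is only that, after training, the expensive matrix algebra of the GP prior has been amortised into fixed decoder weights, leaving the sampler a low-dimensional, independent Gaussian latent space to explore.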