A Tutorial on Sparse Gaussian Processes and Variational Inference
Gaussian processes (GPs) provide a framework for Bayesian inference that can
offer principled uncertainty estimates for a wide range of problems. For
example, if we consider regression problems with Gaussian likelihoods, a GP
model enjoys a posterior in closed form. However, identifying the posterior GP
scales cubically with the number of training examples and requires storing
all examples in memory. To overcome these obstacles, sparse GPs have been
proposed that approximate the true posterior GP with pseudo-training examples.
Importantly, the number of pseudo-training examples is user-defined and enables
control over computational and memory complexity. In the general case, sparse
GPs do not enjoy closed-form solutions and one has to resort to approximate
inference. In this context, a convenient choice for approximate inference is
variational inference (VI), where the problem of Bayesian inference is cast as
an optimization problem -- namely, to maximize a lower bound of the log
marginal likelihood. This paves the way for a powerful and versatile framework,
where pseudo-training examples are treated as optimization arguments of the
approximate posterior, identified jointly with the hyperparameters of the
generative model (i.e., prior and likelihood). The framework can
naturally handle a broad class of supervised learning problems, ranging from
regression with heteroscedastic and non-Gaussian likelihoods to classification
with discrete labels, as well as multilabel problems. The purpose of
this tutorial is to provide access to the basic material for readers without
prior knowledge of either GPs or VI. A proper exposition of the subject also
enables access to more recent advances (such as importance-weighted VI as well
as interdomain, multioutput, and deep GPs) that can serve as inspiration for
new research ideas.
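
For orientation, the lower bound referred to above can be written down
concretely; a standard inducing-point form (a sketch in our notation, not
necessarily the tutorial's) is

    \mathcal{L} = \sum_{n=1}^{N} \mathbb{E}_{q(f_n)}\!\left[\log p(y_n \mid f_n)\right]
                  - \mathrm{KL}\!\left[q(\mathbf{u}) \,\|\, p(\mathbf{u})\right]
                  \;\le\; \log p(\mathbf{y}),

where \mathbf{u} collects the GP values at the M pseudo-training inputs and
q(\mathbf{u}) is the variational posterior over them. Maximizing \mathcal{L}
jointly over q(\mathbf{u}), the pseudo-training inputs, and the hyperparameters
of prior and likelihood is exactly the optimization problem described above.
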
Surrogate modeling with sequential design for design and analysis of electronic systems
The growing computational demands of modern engineering simulations, as used in fields ranging from computational fluid dynamics to electromagnetics, require methodologies that can handle evaluation-intensive tasks. Popular analyses include design space exploration, visualization, optimization, and sensitivity analysis. This work provides an overview of advancements in surrogate modeling, a data-driven approximation technique. Both sequential design and adaptive modeling are covered, and an integrated platform for surrogate modeling is presented. Finally, a recent technique known as deep Gaussian processes is highlighted as a promising alternative for surrogate modeling of non-stationary response surfaces.
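
To make the sequential-design loop concrete, here is a minimal sketch (our
own illustration under assumed choices, not the paper's platform): a GP
surrogate is refitted after each simulator run, and the next design point is
chosen where the surrogate's predictive uncertainty is largest. The toy
simulator f and the one-dimensional design space are placeholder assumptions.

    # Minimal sketch of sequential design with a GP surrogate (illustration only).
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    def f(x):
        # Placeholder for an expensive simulation (e.g., a full-wave EM solve).
        return np.sin(3.0 * x) + 0.5 * x

    rng = np.random.default_rng(0)
    candidates = np.linspace(0.0, 2.0, 200).reshape(-1, 1)  # discretized design space
    X = rng.uniform(0.0, 2.0, size=(4, 1))                  # small initial design
    y = f(X).ravel()

    for _ in range(10):                                     # evaluation budget
        surrogate = GaussianProcessRegressor(kernel=RBF(0.5), normalize_y=True)
        surrogate.fit(X, y)
        _, std = surrogate.predict(candidates, return_std=True)
        x_next = candidates[[np.argmax(std)]]               # most uncertain point
        X = np.vstack([X, x_next])
        y = np.append(y, f(x_next).ravel())

Variance-based selection is only one acquisition choice; sequential design
schemes differ mainly in this criterion, e.g. error-driven or
optimization-driven sampling.
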
Geometric Neural Diffusion Processes
Denoising diffusion models have proven to be a flexible and effective
paradigm for generative modelling. Their recent extension to
infinite-dimensional Euclidean spaces has allowed for the modelling of stochastic
processes. However, many problems in the natural sciences incorporate
symmetries and involve data living in non-Euclidean spaces. In this work, we
extend the framework of diffusion models to incorporate a series of geometric
priors in infinite-dimensional modelling. We do so by a) constructing a
noising process which admits, as its limiting distribution, a geometric
Gaussian process
that transforms under the symmetry group of interest, and b) approximating the
score with a neural network that is equivariant w.r.t. this group. We show that
with these conditions, the generative functional model admits the same
symmetry. We demonstrate the scalability and capacity of the model, using a
novel Langevin-based conditional sampler, to fit complex scalar and vector
fields, with Euclidean and spherical codomain, on synthetic and real-world
weather data.
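
As a toy illustration of ingredient a) above (our own sketch, not the paper's
construction; the RBF kernel, grid, and step sizes are assumptions): an
Ornstein-Uhlenbeck noising process driven by kernel-correlated noise has the
corresponding Gaussian process as its limiting distribution.

    # Toy sketch: an OU noising process whose limiting law is a GP, N(0, K).
    # dX_t = -X_t dt + sqrt(2) L dW_t with K = L L^T has stationary law N(0, K).
    import numpy as np

    grid = np.linspace(0.0, 1.0, 50)                # discretized input locations
    sq = (grid[:, None] - grid[None, :]) ** 2
    K = np.exp(-0.5 * sq / 0.1 ** 2)                # RBF covariance on the grid
    L = np.linalg.cholesky(K + 1e-8 * np.eye(len(grid)))

    rng = np.random.default_rng(0)
    x = np.sin(2.0 * np.pi * grid)                  # function sample being noised
    dt = 1e-2
    for _ in range(2000):                           # Euler-Maruyama steps
        dw = np.sqrt(dt) * rng.standard_normal(len(grid))
        x = x - x * dt + np.sqrt(2.0) * (L @ dw)
    # x is now approximately a draw from the GP with covariance K.

Ingredient b), an equivariant score network, is what the paper adds on top of
such a noising process so that the generative model inherits the symmetry.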