Proposals which speed up function-space MCMC
Inverse problems lend themselves naturally to a Bayesian formulation, in
which the quantity of interest is a posterior distribution of state and/or
parameters given some uncertain observations. In the common case in which the
forward operator is smoothing, the inverse problem is ill-posed.
Well-posedness is imposed via regularisation in the form of a prior, which is
often Gaussian. Under quite general conditions, it can be shown that the
posterior is absolutely continuous with respect to the prior and it may be
well-defined on function space in terms of its density with respect to the
prior. In this case, by constructing a proposal for which the prior is
invariant, one can define Metropolis-Hastings schemes for MCMC which are
well-defined on function space, and hence do not degenerate as the dimension of
the underlying quantity of interest increases to infinity, e.g. under mesh
refinement when approximating PDEs in finite dimensions. However, in practice,
despite the attractive theoretical properties of the currently available
schemes, they may still suffer from long correlation times, particularly if the
data is very informative about some of the unknown parameters. In fact, in this
case it may be the directions of the posterior which coincide with the (already
known) prior that decorrelate most slowly. The information incorporated into
the posterior through the data is often contained within some
finite-dimensional subspace, in an appropriate basis, perhaps even one defined
by eigenfunctions of the prior. We aim to exploit this fact and improve the
mixing time of function-space MCMC by careful rescaling of the proposal. To
this end, we introduce two new basic methods of increasing complexity,
involving (i) characteristic function truncation of high frequencies and (ii)
Hessian information to interpolate between low and high frequencies.
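To make the rescaling idea concrete, here is a minimal Python sketch of a mode-wise rescaled pCN step, assuming a Gaussian prior diagonalised in a known eigenbasis; the function name, the toy likelihood, and the particular cutoff rule are ours for illustration and are not the paper's exact schemes (i) or (ii).

```python
import numpy as np

def pcn_rescaled_step(u, log_like, lam, beta, rng):
    """One Metropolis-Hastings step with a mode-wise rescaled pCN proposal.

    u    : current state, coefficients in the prior eigenbasis
    lam  : prior eigenvalues (per-mode variances)
    beta : per-mode step sizes in (0, 1]"""
    xi = np.sqrt(lam) * rng.standard_normal(u.shape)   # exact prior draw
    v = np.sqrt(1.0 - beta ** 2) * u + beta * xi       # prior-invariant proposal
    # Prior invariance means the acceptance ratio involves only the
    # likelihood, so the scheme stays well defined on function space.
    return v if np.log(rng.uniform()) < log_like(v) - log_like(u) else u

# Illustration in the spirit of method (i): small steps on the data-informed
# low frequencies, independent prior resampling (beta = 1) above a cutoff.
rng = np.random.default_rng(0)
lam = 1.0 / np.arange(1, 129) ** 2
beta = np.where(np.arange(128) < 16, 0.2, 1.0)
toy_loglike = lambda w: -50.0 * np.sum((w[:16] - 0.5) ** 2)  # informs 16 modes
u = pcn_rescaled_step(np.zeros(128), toy_loglike, lam, beta, rng)
```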
Deterministic Mean-field Ensemble Kalman Filtering
The proof of convergence of the standard ensemble Kalman filter (EnKF) from
Le Gland et al. (2011) is extended to non-Gaussian state-space models. A
density-based deterministic approximation of the mean-field limit EnKF
(DMFEnKF) is proposed, consisting of a PDE solver and a quadrature rule. Given
a certain minimal order of convergence $\kappa$ between the two, this extends
to the deterministic filter approximation, which is therefore asymptotically
superior to standard EnKF when the dimension $d < 2\kappa$. The fidelity of
approximation of the true distribution is also established using an extension
of the total variation metric to random measures. This is limited by a Gaussian
bias term arising from non-linearity/non-Gaussianity of the model, which exists
for both DMFEnKF and standard EnKF. Numerical results support and extend the
theory.
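As a rough illustration of what one step of a density-based mean-field EnKF involves, the 1d Python sketch below combines quadrature-based moments with the mean-field analysis transport; the forecast (PDE-solver) step is omitted, the observation model is assumed linear and scalar, and all names are ours rather than the paper's.

```python
import numpy as np

def dmf_enkf_analysis_1d(x, p, y, H, R):
    """Analysis step of a density-based mean-field EnKF sketch on a uniform
    1d grid: moments by quadrature, then the law of the transported state
    X_a = (1 - K H) X_f + K y + K eta,  eta ~ N(0, R)."""
    dx = x[1] - x[0]
    m = np.sum(p * x) * dx                      # forecast mean (quadrature)
    c = np.sum(p * (x - m) ** 2) * dx           # forecast variance (quadrature)
    K = c * H / (H * c * H + R)                 # mean-field Kalman gain
    a = 1.0 - K * H
    # Density of the affine part a * X_f + K * y, by change of variables:
    p_affine = np.interp((x - K * y) / a, x, p, left=0.0, right=0.0) / abs(a)
    s = abs(K) * np.sqrt(R)                     # std of the additive K * eta
    if s < dx:                                  # noise unresolved by the grid
        return p_affine
    half = int(np.ceil(4 * s / dx))
    kern = np.exp(-0.5 * (np.arange(-half, half + 1) * dx / s) ** 2)
    return np.convolve(p_affine, kern / kern.sum(), mode="same")
```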
Multilevel Particle Filters for Lévy-driven stochastic differential equations
We develop algorithms for computing expectations of the laws of models
associated with stochastic differential equations (SDEs) driven by pure Lévy
processes. We consider filtering of such processes as well as pricing of
path-dependent options. We propose a multilevel particle filter (MLPF) to address
the computational issues involved in solving these continuum problems. We show
via numerical simulations and theoretical results that under suitable
assumptions on the discretization of the underlying driving Lévy process,
our proposed method achieves optimal convergence rates. The cost to obtain an
MSE of $\mathcal{O}(\epsilon^2)$ scales like $\mathcal{O}(\epsilon^{-2.5})$ for
our method, as compared with $\mathcal{O}(\epsilon^{-3})$ for the standard
particle filter.
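The multilevel structure can be sketched as a telescoping sum over discretization levels; in the sketch below, run_coupled_pf is a hypothetical routine standing in for a pair of particle filters coupled through common driving Lévy increments, which the paper's actual algorithm constructs in detail.

```python
import numpy as np

def mlpf_estimate(phi, run_coupled_pf, L, N):
    """Telescoping-sum sketch of a multilevel particle filter estimate of
    E[phi(X)] under the filtering distribution at the finest level L.

    run_coupled_pf(l, n) : hypothetical; returns (xf, xc, w), weighted samples
        from particle filters at discretization levels l and l - 1 coupled by
        common driving noise (xc is ignored at l = 0).
    N : particle counts per level, typically decreasing in l."""
    xf, _, w = run_coupled_pf(0, N[0])           # plain filter at level 0
    est = np.sum(w * phi(xf))
    for l in range(1, L + 1):                    # coupled increments
        xf, xc, w = run_coupled_pf(l, N[l])
        est += np.sum(w * (phi(xf) - phi(xc)))   # estimates E[phi^l - phi^{l-1}]
    return est
```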
Sparse online variational Bayesian regression
This work considers variational Bayesian inference as an inexpensive and
scalable alternative to a fully Bayesian approach in the context of
sparsity-promoting priors. In particular, the priors considered arise from
scale mixtures of Normal distributions with a generalized inverse Gaussian
mixing distribution. This includes the variational Bayesian LASSO as an
inexpensive and scalable alternative to the Bayesian LASSO introduced in [65].
It also includes a family of priors which more strongly promote sparsity. For
linear models the method requires only the iterative solution of deterministic
least squares problems. Furthermore, for p unknown covariates the method can be
implemented exactly online with a cost of in computation and
in memory per iteration -- in other words, the cost per iteration is
independent of n, and in principle infinite data can be considered. For large
an approximation is able to achieve promising results for a cost of
per iteration, in both computation and memory. Strategies for hyper-parameter
tuning are also considered. The method is implemented for real and simulated
data. It is shown that the performance in terms of variable selection and
uncertainty quantification of the variational Bayesian LASSO can be comparable
to the Bayesian LASSO for problems which are tractable with that method, and
for a fraction of the cost. The present method comfortably handles
$n = p = 131{,}072$ on a laptop in less than 30 minutes, and $n = 10^5$,
$p = 10^6$ overnight.
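A minimal sketch of the iterative-least-squares structure is given below, assuming unit noise variance and an illustrative GIG-type scale update; the exact update formulas of the paper's variational scheme differ, but the linear algebra shown is the plausible source of the cubic-in-p computation and quadratic-in-p memory costs noted above.

```python
import numpy as np

def vb_lasso(X, y, lam=1.0, iters=50):
    """Sketch of a variational Bayesian LASSO via iterated least squares,
    assuming unit noise variance; d plays the role of expected local
    precisions under the GIG mixing distribution (update is illustrative)."""
    n, p = X.shape
    d = np.ones(p)
    for _ in range(iters):
        A = X.T @ X + np.diag(d)                 # O(p^2) memory
        b = np.linalg.solve(A, X.T @ y)          # O(p^3) deterministic solve
        var = np.diag(np.linalg.inv(A))          # posterior variances of q
        d = lam / np.sqrt(b ** 2 + var)          # illustrative GIG-type update
    return b, var                                # variational mean and variances
```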
Determining White Noise Forcing From Eulerian Observations in the Navier-Stokes Equation
The Bayesian approach to inverse problems is of paramount importance in
quantifying uncertainty about the input to and the state of a system of
interest given noisy observations. Herein we consider the forward problem of
the forced 2D Navier-Stokes equation. The inverse problem is inference of the
forcing, and possibly the initial condition, given noisy observations of the
velocity field. We place a prior on the forcing which is in the form of a
spatially correlated temporally white Gaussian process, and formulate the
inverse problem for the posterior distribution. Given appropriate spatial
regularity conditions, we show that the solution is a continuous function of
the forcing. Hence, for appropriately chosen spatial regularity in the prior,
the posterior distribution on the forcing is absolutely continuous with respect
to the prior and is hence well-defined. Furthermore, the posterior distribution
is a continuous function of the data. We complement this theoretical result
with numerical simulation of the posterior distribution.
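For intuition, the sketch below draws a sample from a prior of this type on a 1d periodic grid (a stand-in for the 2d case): a spatial covariance diagonal in Fourier space gives the spatial correlation, while independent increments at each time step give the temporal whiteness; the spectral decay rate and the normalisation are illustrative assumptions, not the paper's choices.

```python
import numpy as np

def sample_forcing(nx, nt, dt, decay=2.0, rng=None):
    """Sample spatially correlated, temporally white Gaussian forcing on a
    periodic 1d grid: covariance diagonal in Fourier space with eigenvalues
    ~ |k|^(-2*decay) (larger decay = smoother in space), and independent
    1/sqrt(dt)-scaled draws at each time step (white in time)."""
    rng = rng or np.random.default_rng()
    k = np.fft.rfftfreq(nx, d=1.0 / nx)                  # integer wavenumbers
    amp = np.zeros_like(k)
    amp[1:] = k[1:] ** (-decay)                          # sqrt eigenvalues; zero mean mode
    f = np.empty((nt, nx))
    for t in range(nt):
        xi = rng.standard_normal(len(k)) + 1j * rng.standard_normal(len(k))
        f[t] = np.fft.irfft(amp * xi, n=nx) / np.sqrt(dt)
    return f
```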
Multilevel ensemble Kalman filtering for spatio-temporal processes
We design and analyse the performance of a multilevel ensemble Kalman filter
method (MLEnKF) for filtering settings where the underlying state-space model
is an infinite-dimensional spatio-temporal process. We consider underlying
models that need to be simulated by numerical methods, with discretization in
both space and time. The multilevel Monte Carlo (MLMC) sampling strategy,
achieving variance reduction through pairwise coupling of ensemble particles on
neighboring resolutions, is used in the sample-moment step of MLEnKF to produce
an efficient hierarchical filtering method for spatio-temporal models. Under
sufficient regularity, MLEnKF is proven to be more efficient for weak
approximations than EnKF, asymptotically in the large-ensemble and
fine-numerical-resolution limit. Numerical examples support our theoretical
findings.
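The sample-moment step can be sketched as follows, with forecast_pair a hypothetical routine returning pairwise-coupled fine/coarse ensembles already projected to a common space; the paper's MLEnKF handles the coupling, projection, and covariance estimation in full.

```python
import numpy as np

def mlenkf_forecast_mean(forecast_pair, L, M):
    """Telescoping sketch of the MLEnKF sample mean: contributions from
    pairwise-coupled ensembles on neighbouring resolutions.

    forecast_pair(l, m) : hypothetical; returns (Xf, Xc), ensembles of m
        particles simulated on resolutions l and l - 1 with shared noise and
        projected to a common space (Xc is None at l = 0).
    M : ensemble sizes per level, typically decreasing in l."""
    mean = 0.0
    for l in range(L + 1):
        Xf, Xc = forecast_pair(l, M[l])
        inc = Xf.mean(axis=0) - (Xc.mean(axis=0) if Xc is not None else 0.0)
        mean = mean + inc
    # Sample covariances are assembled level-by-level in the same way
    # before entering the Kalman update.
    return mean
```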
A Bayesian analysis of classical shadows
The method of classical shadows heralds unprecedented opportunities for
quantum estimation with limited measurements [H.-Y. Huang, R. Kueng, and J.
Preskill, Nat. Phys. 16, 1050 (2020)]. Yet its relationship to established
quantum tomographic approaches, particularly those based on likelihood models,
remains unclear. In this article, we investigate classical shadows through the
lens of Bayesian mean estimation (BME). In direct tests on numerical data, BME
is found to attain significantly lower error on average, but classical shadows
prove remarkably more accurate in specific situations -- such as high-fidelity
ground truth states -- which are improbable in a fully uniform Hilbert space.
We then introduce an observable-oriented pseudo-likelihood that successfully
emulates the dimension-independence and state-specific optimality of classical
shadows, but within a Bayesian framework that ensures only physical states. Our
research reveals how classical shadows effect important departures from
conventional thinking in quantum state estimation, as well as the utility of
Bayesian methods for uncovering and formalizing statistical assumptions.
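For reference, the single-qubit classical-shadow snapshot under random Pauli measurements takes the standard form rho_hat = 3 U†|b⟩⟨b|U − I, whose average reproduces the state [H.-Y. Huang, R. Kueng, and J. Preskill, Nat. Phys. 16, 1050 (2020)]. The short sketch below (our own illustration, not the paper's code) also shows why individual snapshots need not be physical states, the issue the Bayesian framework avoids.

```python
import numpy as np

PAULIS = [np.array([[0, 1], [1, 0]], dtype=complex),    # X
          np.array([[0, -1j], [1j, 0]]),                # Y
          np.array([[1, 0], [0, -1]], dtype=complex)]   # Z

def shadow_snapshot(rho, rng):
    """Measure rho in a random Pauli eigenbasis and invert the measurement
    channel: snapshot = 3|v><v| - I, so that E[snapshot] = rho."""
    evals, evecs = np.linalg.eigh(PAULIS[rng.integers(3)])
    probs = np.clip([np.real(evecs[:, i].conj() @ rho @ evecs[:, i])
                     for i in range(2)], 0.0, None)
    i = rng.choice(2, p=probs / probs.sum())             # Born-rule outcome
    v = evecs[:, i:i + 1]
    return 3.0 * (v @ v.conj().T) - np.eye(2)            # may be non-physical

rng = np.random.default_rng(1)
rho = np.array([[0.9, 0.3], [0.3, 0.1]], dtype=complex)  # a pure test state
est = np.mean([shadow_snapshot(rho, rng) for _ in range(20000)], axis=0)
# est converges to rho, yet every single snapshot has a negative eigenvalue,
# i.e. it is not itself a density matrix, unlike a Bayesian mean estimate.
```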