Sequential Monte Carlo for Graphical Models
We propose a new framework for how to use sequential Monte Carlo (SMC)
algorithms for inference in probabilistic graphical models (PGMs). Via a
sequential decomposition of the PGM we find a sequence of auxiliary
distributions defined on a monotonically increasing sequence of probability
spaces. By targeting these auxiliary distributions using SMC we are able to
approximate the full joint distribution defined by the PGM. One of the key
merits of the SMC sampler is that it provides an unbiased estimate of the
partition function of the model. We also show how it can be used within a
particle Markov chain Monte Carlo framework in order to construct
high-dimensional block-sampling algorithms for general PGMs.
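The core construction can be sketched on a toy chain-structured PGM (the binary-spin model, coupling strength J, and particle count are illustrative choices, not taken from the paper): each SMC step extends the particles by one variable of the sequential decomposition, and the running product of average importance weights is an unbiased estimate of the partition function.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)

# Toy chain PGM: x_1..x_n in {-1, +1} with pairwise potentials
# psi(x_k, x_{k+1}) = exp(J * x_k * x_{k+1}).
n, J, N = 5, 0.5, 5000  # number of variables, coupling strength, particles

def smc_partition(n, J, N):
    """SMC over the sequential decomposition (one variable per step).
    Returns an unbiased estimate of the partition function Z."""
    x = rng.choice([-1, 1], size=N)   # x_1 from a uniform proposal
    log_Z = np.log(2.0)               # gamma_1 = 1, q = 1/2: every weight is 2
    for _ in range(n - 1):
        x_new = rng.choice([-1, 1], size=N)     # uniform proposal for x_k
        w = 2.0 * np.exp(J * x * x_new)         # psi(x_{k-1}, x_k) / (1/2)
        log_Z += np.log(w.mean())
        idx = rng.choice(N, size=N, p=w / w.sum())  # multinomial resampling
        x = x_new[idx]    # for a chain, only the newest variable matters next
    return np.exp(log_Z)

Z_hat = smc_partition(n, J, N)

# Exact partition function by brute-force enumeration, for reference.
Z_exact = sum(np.exp(J * sum(c[k] * c[k + 1] for k in range(n - 1)))
              for c in product([-1, 1], repeat=n))
```

For this five-variable chain the exact sum over all 32 configurations is cheap, so the unbiasedness of the SMC estimate can be checked directly.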
Nested Sequential Monte Carlo Methods
We propose nested sequential Monte Carlo (NSMC), a methodology to sample from
sequences of probability distributions, even where the random variables are
high-dimensional. NSMC generalises the SMC framework by requiring only
approximate, properly weighted, samples from the SMC proposal distribution,
while still resulting in a correct SMC algorithm. Furthermore, NSMC can in
itself be used to produce such properly weighted samples. Consequently, one
NSMC sampler can be used to construct an efficient high-dimensional proposal
distribution for another NSMC sampler, and this nesting of the algorithm can be
done to an arbitrary degree. This allows us to consider complex and
high-dimensional models using SMC. We show results that demonstrate the
efficacy of our approach on several filtering problems with dimensions on the
order of 100 to 1,000.
Comment: Extended version of a paper published in Proceedings of the 32nd
International Conference on Machine Learning (ICML), Lille, France, 2015.
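The "properly weighted samples" requirement can be illustrated with a toy inner sampler (the Gaussian target, the wide Gaussian proposal, and the inner sample size M are illustrative assumptions, not the paper's construction): an inner importance sampler returns a single draw plus a weight whose expectation against any test function matches the unnormalized target, so an outer sampler can consume the pair as if it came from an exact weighted draw.

```python
import numpy as np

rng = np.random.default_rng(1)

SQRT_2PI = np.sqrt(2.0 * np.pi)

def properly_weighted_sample(mu, M=16):
    """Inner importance sampler for the unnormalized target
    gamma(x) = exp(-(x - mu)^2 / 2), whose normalizer is Z = sqrt(2*pi).
    Returns (x, w) with E[w * h(x)] = Z * E_target[h(x)] for any h,
    i.e. a properly weighted pair as required of NSMC inner samplers."""
    xs = rng.normal(0.0, 2.0, size=M)             # wide proposal N(0, 2^2)
    q = np.exp(-xs ** 2 / 8.0) / (2.0 * SQRT_2PI)  # proposal density
    gamma = np.exp(-(xs - mu) ** 2 / 2.0)          # unnormalized target
    w = gamma / q                                  # importance weights
    x = xs[rng.choice(M, p=w / w.sum())]           # resample one particle
    return x, w.mean()                             # carry the average weight

# Averaging the returned weights estimates Z = sqrt(2*pi) without bias.
weights = [properly_weighted_sample(0.0)[1] for _ in range(2000)]
Z_hat = np.mean(weights)
```

The resample-then-average-weight step is what lets one sampler feed another: the pair behaves, in expectation, exactly like an exact sample carrying the true normalizing constant.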
Capacity estimation of two-dimensional channels using Sequential Monte Carlo
We derive a new Sequential-Monte-Carlo-based algorithm to estimate the
capacity of two-dimensional channel models. The focus is on computing the
noiseless capacity of the two-dimensional (1, ∞) run-length-limited constrained
channel, but the underlying idea is generally applicable. The proposed
algorithm is profiled against a state-of-the-art method, yielding more than an
order of magnitude improvement in estimation accuracy for a given computation
time.
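A toy sketch of why sequential Monte Carlo fits this problem (the grid size, run count, and raster-scan proposal below are illustrative simplifications, not the paper's algorithm): the noiseless (1, ∞) capacity is governed by the number of binary arrays with no two adjacent ones, and sequential importance sampling yields an unbiased estimate of that count.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(2)

def sis_count(m, n, runs=20000):
    """Sequential-importance-sampling estimate of the number of m-by-n
    binary arrays with no two horizontally or vertically adjacent ones
    (the hard (1, inf) constraint). Cells are filled in raster order,
    sampling uniformly among allowed values; accumulating |allowed| as
    the weight makes the average weight an unbiased count estimate."""
    total = 0.0
    for _ in range(runs):
        grid = np.zeros((m, n), dtype=int)
        w = 1.0
        for i in range(m):
            for j in range(n):
                ok = (j == 0 or grid[i, j - 1] == 0) and \
                     (i == 0 or grid[i - 1, j] == 0)
                k = 2 if ok else 1            # size of the allowed set
                grid[i, j] = rng.integers(k)  # uniform over allowed values
                w *= k
        total += w
    return total / runs

def brute_count(m, n):
    """Exact count by enumeration, feasible only for tiny grids."""
    cnt = 0
    for bits in product([0, 1], repeat=m * n):
        g = np.array(bits).reshape(m, n)
        if not (g[:, :-1] & g[:, 1:]).any() and not (g[:-1] & g[1:]).any():
            cnt += 1
    return cnt

N_hat = sis_count(4, 4)
capacity_est = np.log2(N_hat) / 16.0  # bits per cell on the 4x4 grid
```

On a 4-by-4 grid the count can still be verified by brute force; for large grids only the sampling estimate remains tractable, which is where the SMC machinery (resampling, better proposals) takes over.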
Variational Sequential Monte Carlo
Many recent advances in large scale probabilistic inference rely on
variational methods. The success of variational approaches depends on (i)
formulating a flexible parametric family of distributions, and (ii) optimizing
the parameters to find the member of this family that most closely approximates
the exact posterior. In this paper we present a new approximating family of
distributions, the variational sequential Monte Carlo (VSMC) family, and show
how to optimize it in variational inference. VSMC melds variational inference
(VI) and sequential Monte Carlo (SMC), providing practitioners with flexible,
accurate, and powerful Bayesian inference. The VSMC family can approximate the
posterior arbitrarily well, while still allowing for efficient optimization of
its parameters. We demonstrate its
utility on state space models, stochastic volatility models for financial data,
and deep Markov models of brain neural circuits.
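The VSMC objective can be sketched on a toy linear-Gaussian state space model, where the exact log-likelihood is available from a Kalman filter for comparison (the model, particle count, and bootstrap proposal are illustrative assumptions, not the paper's setup): the log of the SMC normalizing-constant estimate is, in expectation, a lower bound on the exact log-likelihood, and maximizing that surrogate over proposal parameters is the VSMC idea.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy LGSSM: x_t = a*x_{t-1} + N(0,1), y_t = x_t + N(0,1), x_1 ~ N(0,1).
a, T = 0.9, 25
x = rng.standard_normal()
y = np.empty(T)
for t in range(T):
    y[t] = x + rng.standard_normal()
    x = a * x + rng.standard_normal()

def kalman_loglik(y, a):
    """Exact log p(y_{1:T}) via the Kalman filter."""
    m, P, ll = 0.0, 1.0, 0.0
    for t, yt in enumerate(y):
        if t > 0:
            m, P = a * m, a * a * P + 1.0        # predict
        S = P + 1.0                              # innovation variance
        ll += -0.5 * (np.log(2 * np.pi * S) + (yt - m) ** 2 / S)
        K = P / S
        m, P = m + K * (yt - m), (1 - K) * P     # update
    return ll

def smc_loglik(y, a, N=4):
    """Bootstrap particle filter: log of the unbiased estimate of p(y_{1:T}).
    Its expectation lower-bounds the exact log-likelihood (Jensen), which
    is the quantity VSMC maximizes over proposal parameters."""
    x = rng.standard_normal(N)
    ll = 0.0
    for t, yt in enumerate(y):
        if t > 0:
            x = a * x + rng.standard_normal(N)   # propagate
        w = np.exp(-0.5 * (yt - x) ** 2) / np.sqrt(2 * np.pi)
        ll += np.log(w.mean())
        x = x[rng.choice(N, size=N, p=w / w.sum())]  # resample
    return ll

exact = kalman_loglik(y, a)
surrogate = np.mean([smc_loglik(y, a) for _ in range(50)])
```

With only four particles the Jensen gap between the surrogate and the exact log-likelihood is visible; richer proposals (the parameters VSMC optimizes) shrink it.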
Neural Diffusion Models
Diffusion models have shown remarkable performance on many generative tasks.
Despite recent success, most diffusion models are restricted in that they only
allow linear transformations of the data distribution. In contrast, a broader
family of transformations can potentially help train generative distributions
more efficiently, simplifying the reverse process and closing the gap between
the true negative log-likelihood and the variational approximation. In this
paper, we present Neural Diffusion Models (NDMs), a generalization of
conventional diffusion models that enables defining and learning time-dependent
non-linear transformations of data. We show how to optimize NDMs using a
variational bound in a simulation-free setting. Moreover, we derive a
time-continuous formulation of NDMs, which allows fast and reliable inference
using off-the-shelf numerical ODE and SDE solvers. Finally, we demonstrate the
utility of NDMs with learnable transformations through experiments on standard
image generation benchmarks, including CIFAR-10, downsampled versions of
ImageNet and CelebA-HQ. NDMs outperform conventional diffusion models in terms
of likelihood and produce high-quality samples.
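A small sketch of the generalization (the cosine schedule and the particular transform below are illustrative stand-ins, not the paper's learned architecture): a conventional diffusion model corrupts data through a linear map alpha_t * x_0, while an NDM-style forward process applies a time-dependent, possibly non-linear, transform before noising. In both cases x_t can be sampled directly for any t, without simulating intermediate steps, which is what makes training simulation-free.

```python
import numpy as np

rng = np.random.default_rng(4)

def alpha_sigma(t):
    """Cosine-style noise schedule on t in [0, 1] (illustrative choice)."""
    return np.cos(np.pi * t / 2), np.sin(np.pi * t / 2)

def diffusion_marginal(x0, t):
    """Conventional diffusion: q(x_t | x_0) = N(alpha_t * x_0, sigma_t^2)."""
    alpha, sigma = alpha_sigma(t)
    return alpha * x0 + sigma * rng.standard_normal(x0.shape)

def ndm_marginal(x0, t, transform):
    """NDM-style forward marginal: the linear map of x_0 is replaced by a
    time-dependent transform F(x_0, t); here F is a fixed toy non-linearity
    standing in for a learned neural network."""
    alpha, sigma = alpha_sigma(t)
    return alpha * transform(x0, t) + sigma * rng.standard_normal(x0.shape)

toy_transform = lambda x, t: np.tanh((1 + t) * x)  # illustrative stand-in
x0 = rng.standard_normal(10000)
x_end = ndm_marginal(x0, 1.0, toy_transform)       # at t = 1: pure noise
```

Whatever the transform, the schedule drives the marginal to a standard normal at t = 1, so the reverse process can start from noise as usual.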
Sequential Monte Carlo Methods for System Identification
One of the key challenges in identifying nonlinear and possibly non-Gaussian
state space models (SSMs) is the intractability of estimating the system state.
Sequential Monte Carlo (SMC) methods, such as the particle filter (introduced
more than two decades ago), provide numerical solutions to the nonlinear state
estimation problems arising in SSMs. When combined with additional
identification techniques, these algorithms provide solid solutions to the
nonlinear system identification problem. We describe two general strategies for
creating such combinations and discuss why SMC is a natural tool for
implementing these strategies.
Comment: In proceedings of the 17th IFAC Symposium on System Identification
(SYSID). Added cover page.
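One such strategy, maximum-likelihood identification with an SMC likelihood estimator, can be sketched on a toy linear-Gaussian model (the model, grid search, and particle count are illustrative simplifications, not the paper's methods): a bootstrap particle filter supplies a log-likelihood estimate for each candidate parameter, and the estimate peaks near the data-generating value.

```python
import numpy as np

# Toy SSM: x_t = a*x_{t-1} + N(0,1), y_t = x_t + N(0,1); identify a.
rng = np.random.default_rng(5)
a_true, T = 0.8, 300
x = rng.standard_normal()
y = np.empty(T)
for t in range(T):
    y[t] = x + rng.standard_normal()
    x = a_true * x + rng.standard_normal()

def pf_loglik(a, y, N=200, seed=0):
    """Bootstrap particle filter estimate of log p(y_{1:T} | a).
    Fixing the seed gives common random numbers across candidate values
    of a, which smooths the estimated likelihood surface."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(N)                       # x_1 ~ N(0, 1)
    ll = 0.0
    for t, yt in enumerate(y):
        if t > 0:
            x = a * x + rng.standard_normal(N)       # propagate
        w = np.exp(-0.5 * (yt - x) ** 2) / np.sqrt(2 * np.pi)
        ll += np.log(w.mean())
        x = x[rng.choice(N, size=N, p=w / w.sum())]  # resample
    return ll

grid = np.linspace(0.5, 0.95, 10)
a_hat = grid[np.argmax([pf_loglik(a, y) for a in grid])]
```

The grid search stands in for the gradient-based or EM-type optimizers usually combined with SMC; the point is only that the SMC likelihood estimate is informative about the unknown parameter.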