Elliptical slice sampling
Many probabilistic models introduce strong dependencies between variables
using a latent multivariate Gaussian distribution or a Gaussian process. We
present a new Markov chain Monte Carlo algorithm for performing inference in
models with multivariate Gaussian priors. Its key properties are: 1) it has
simple, generic code applicable to many models, 2) it has no free parameters,
3) it works well for a variety of Gaussian process based models. These
properties make our method ideal for use while model building, removing the
need to spend time deriving and tuning updates for more complex algorithms.
Comment: 8 pages, 6 figures, appearing in AISTATS 2010 (JMLR: W&CP volume 6). Differences from first submission: some minor edits in response to feedback.
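The update this abstract describes can be sketched in a few lines. Below is a minimal NumPy illustration of one elliptical slice sampling transition for a zero-mean Gaussian prior; the function and variable names are our own choices, not from the paper, and the toy posterior at the end is an assumed example.

```python
import numpy as np

def elliptical_slice(f, prior_sample, log_lik, rng):
    """One elliptical slice sampling transition (zero-mean Gaussian prior)."""
    nu = prior_sample()                          # auxiliary draw defines the ellipse
    log_y = log_lik(f) + np.log(rng.uniform())   # log of the slice threshold
    theta = rng.uniform(0.0, 2.0 * np.pi)        # initial angle on the ellipse
    theta_min, theta_max = theta - 2.0 * np.pi, theta
    while True:
        f_prop = f * np.cos(theta) + nu * np.sin(theta)
        if log_lik(f_prop) > log_y:              # proposal is on the slice: accept
            return f_prop
        if theta < 0.0:                          # otherwise shrink the bracket
            theta_min = theta                    # towards the current state (theta = 0)
        else:
            theta_max = theta
        theta = rng.uniform(theta_min, theta_max)

# Toy posterior: prior N(0, 1), likelihood N(f; 1, 1)  =>  posterior N(0.5, 0.5).
rng = np.random.default_rng(0)
f, samples = 0.0, []
for _ in range(20000):
    f = elliptical_slice(f, lambda: rng.normal(),
                         lambda f: -0.5 * (f - 1.0) ** 2, rng)
    samples.append(f)
```

Note how the loop has no step-size parameter: the bracket-shrinking guarantees termination, since the current state always satisfies the slice condition as the angle shrinks towards zero.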
Transport Elliptical Slice Sampling
We propose a new framework for efficiently sampling from complex probability
distributions using a combination of normalizing flows and elliptical slice
sampling (Murray et al., 2010). The central idea is to learn a diffeomorphism,
through normalizing flows, that maps the non-Gaussian structure of the target
distribution to an approximately Gaussian distribution. We then use the
elliptical slice sampler, an efficient and tuning-free Markov chain Monte Carlo
(MCMC) algorithm, to sample from the transformed distribution. The samples are
then pulled back using the inverse normalizing flow, yielding samples that
approximate the stationary target distribution of interest. Our transport
elliptical slice sampler (TESS) is optimized for modern computer architectures,
where its adaptation mechanism utilizes parallel cores to rapidly run multiple
Markov chains for a few iterations. Numerical demonstrations show that TESS
produces Monte Carlo samples from the target distribution with lower
autocorrelation compared to non-transformed samplers, and demonstrates
significant improvements in efficiency when compared to gradient-based
proposals designed for parallel computer architectures, given a flexible enough
diffeomorphism.
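The transport idea can be illustrated with a much-simplified sketch: run an elliptical slice sampler in the reference space of a diffeomorphism T, correcting with the residual between the transformed target and the standard normal, then pull samples back through T. Everything below is an assumed toy setup, not the paper's implementation: the banana-shaped target is our own example, and the "flow" is deliberately just the identity map standing in for a learned normalizing flow.

```python
import numpy as np

def ess_step(z, log_resid, rng):
    """Elliptical slice step with a standard-normal reference distribution."""
    nu = rng.normal(size=z.shape)
    log_y = log_resid(z) + np.log(rng.uniform())
    theta = rng.uniform(0.0, 2.0 * np.pi)
    lo, hi = theta - 2.0 * np.pi, theta
    while True:
        zp = z * np.cos(theta) + nu * np.sin(theta)
        if log_resid(zp) > log_y:
            return zp
        if theta < 0.0:
            lo = theta
        else:
            hi = theta
        theta = rng.uniform(lo, hi)

# Toy banana-shaped target (not from the paper): the pushforward of N(0, I)
# under (z1, z2) -> (z1, z2 + z1^2 - 1), whose Jacobian determinant is 1.
def log_pi(x):
    return -0.5 * (x[0] ** 2 + (x[1] - x[0] ** 2 + 1.0) ** 2)

# Stand-in "flow": the identity map, a deliberately crude transport. A learned
# flow T would make log_resid below nearly constant, so the elliptical slice
# sampler in reference space would accept almost every proposal.
T = lambda z: z                   # forward map, log|det J| = 0
def log_resid(z):                 # log pi(T(z)) + log|det J| - log N(z; 0, I)
    return log_pi(T(z)) + 0.5 * np.dot(z, z)   # constants dropped consistently

rng = np.random.default_rng(0)
z, xs = np.zeros(2), []
for i in range(41000):
    z = ess_step(z, log_resid, rng)
    if i >= 1000:                 # discard burn-in; pull back through the flow
        xs.append(T(z))
X = np.array(xs)
```

The chain in reference space leaves N(z; 0, I) x exp(log_resid(z)), i.e. the pulled-back target, invariant; mapping each state through T then yields draws from the original target.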
Parallel MCMC with Generalized Elliptical Slice Sampling
Probabilistic models are conceptually powerful tools for finding structure in
data, but their practical effectiveness is often limited by our ability to
perform inference in them. Exact inference is frequently intractable, so
approximate inference is often performed using Markov chain Monte Carlo (MCMC).
To achieve the best possible results from MCMC, we want to efficiently simulate
many steps of a rapidly mixing Markov chain which leaves the target
distribution invariant. Of particular interest in this regard is how to take
advantage of multi-core computing to speed up MCMC-based inference, both to
improve mixing and to distribute the computational load. In this paper, we
present a parallelizable Markov chain Monte Carlo algorithm for efficiently
sampling from continuous probability distributions that can take advantage of
hundreds of cores. This method shares information between parallel Markov
chains to build a scale-mixture of Gaussians approximation to the density
function of the target distribution. We combine this approximation with a
recent method known as elliptical slice sampling to create a Markov chain with
no step-size parameters that can mix rapidly without requiring gradient or
curvature computations.
Comment: 19 pages, 8 figures, 3 algorithms.
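A heavily simplified sketch of the information-sharing idea follows. For brevity it fits a single multivariate Gaussian to pooled chain states rather than the paper's scale-mixture of Gaussians, runs the "parallel" chains serially, and freezes the approximation after a warm-up phase so the sampling phase is a valid Markov chain; the target density and all names are our own illustrative choices.

```python
import numpy as np

def ess_step(x, mean, chol, log_resid, rng):
    """Elliptical slice step w.r.t. the Gaussian N(mean, chol @ chol.T)."""
    nu = mean + chol @ rng.normal(size=x.shape)
    log_y = log_resid(x) + np.log(rng.uniform())
    theta = rng.uniform(0.0, 2.0 * np.pi)
    lo, hi = theta - 2.0 * np.pi, theta
    while True:
        xp = (x - mean) * np.cos(theta) + (nu - mean) * np.sin(theta) + mean
        if log_resid(xp) > log_y:
            return xp
        if theta < 0.0:
            lo = theta
        else:
            hi = theta
        theta = rng.uniform(lo, hi)

# Toy target: a correlated 2-D Gaussian (illustrative choice, not from the paper).
mu = np.array([1.0, -1.0])
Sigma = np.array([[2.0, 1.2], [1.2, 1.0]])
Sigma_inv = np.linalg.inv(Sigma)
log_pi = lambda x: -0.5 * (x - mu) @ Sigma_inv @ (x - mu)

rng = np.random.default_rng(1)
K = 8                                     # parallel chains (run serially here)
chains = [rng.normal(size=2) for _ in range(K)]

# Phase 1: crude exploration with a standard-normal ellipse.
warm = []
for _ in range(500):
    for k in range(K):
        chains[k] = ess_step(chains[k], np.zeros(2), np.eye(2),
                             lambda x: log_pi(x) + 0.5 * x @ x, rng)
        warm.append(chains[k].copy())

# Share information across chains: fit a Gaussian to the pooled states.
W = np.array(warm[len(warm) // 2:])       # discard the first half as burn-in
m_hat, S_hat = W.mean(axis=0), np.cov(W.T)
L_hat = np.linalg.cholesky(S_hat)
S_inv = np.linalg.inv(S_hat)
log_resid = lambda x: log_pi(x) + 0.5 * (x - m_hat) @ S_inv @ (x - m_hat)

# Phase 2: ESS on the fitted ellipse. The approximation is frozen, so each
# chain leaves the target pi invariant.
draws = []
for _ in range(2000):
    for k in range(K):
        chains[k] = ess_step(chains[k], m_hat, L_hat, log_resid, rng)
        draws.append(chains[k].copy())
D = np.array(draws)
```

In the real algorithm each chain would run on its own core and the fitted approximation would be a scale-mixture of Gaussians, which handles heavy-tailed targets that a single Gaussian fits poorly.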
Action classification using a discriminative non-parametric hidden Markov model
We classify human actions occurring in videos, using the skeletal joint positions extracted from a depth image sequence as features. Each action class is represented by a non-parametric hidden Markov model (NP-HMM) and the model parameters are learnt in a discriminative way. Specifically, we use a Bayesian framework based on the Hierarchical Dirichlet Process (HDP) to automatically infer the cardinality of hidden states, and formulate a discriminative function based on the distance between Gaussian distributions to improve classification performance. We use elliptical slice sampling to efficiently sample parameters from the complex posterior distribution induced by our discriminative likelihood function. We illustrate our classification results for action class models trained using this technique.
Geometric convergence of slice sampling
In Bayesian statistics sampling w.r.t. a posterior distribution, which is given through a prior and a likelihood function, is a challenging task. The generation of exact samples is in general quite difficult, since the posterior distribution is often known only up to a normalizing constant. A standard way to approach this problem is a Markov chain Monte Carlo (MCMC) algorithm for approximate sampling w.r.t. the target distribution. In this cumulative dissertation geometric convergence guarantees are given for two different MCMC methods: simple slice sampling and elliptical
slice sampling.