    Kernel Sequential Monte Carlo

    We propose kernel sequential Monte Carlo (KSMC), a framework for sampling from static target densities. KSMC is a family of sequential Monte Carlo algorithms that are based on building emulator models of the current particle system in a reproducing kernel Hilbert space. Here we focus on modelling nonlinear covariance structure and gradients of the target. The emulator’s geometry is adaptively updated and subsequently used to inform local proposals. Unlike in adaptive Markov chain Monte Carlo, continuous adaptation does not compromise convergence of the sampler. KSMC combines the strengths of sequential Monte Carlo and kernel methods: superior performance for multimodal targets and the ability to estimate model evidence, as compared to Markov chain Monte Carlo, and the emulator’s ability to represent targets that exhibit high degrees of nonlinearity. As KSMC does not require access to target gradients, it is particularly applicable to targets whose gradients are unknown or prohibitively expensive. We describe the necessary tuning details and demonstrate the benefits of the proposed methodology on a series of challenging synthetic and real-world examples.
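    As a rough illustration of the ingredients in this abstract, the sketch below shows a tempered SMC sampler whose move step is random-walk Metropolis with a covariance estimated from the current particle cloud. This is a simplified, gradient-free stand-in for the kernel-emulator proposal, not the authors' KSMC algorithm; the function names (`log_target`, `prior_sample`, `log_prior`) and the linear tempering schedule are illustrative assumptions.

```python
import numpy as np
from scipy.special import logsumexp

def smc_sampler(log_target, prior_sample, log_prior,
                n_particles=500, n_temps=20, n_mcmc=5, seed=0):
    """Minimal tempered SMC sampler (prior -> target).

    The move step is adaptive random-walk Metropolis whose covariance is taken
    from the current particle cloud: a simplified, gradient-free stand-in for
    the kernel-emulator proposal described in the abstract.
    """
    rng = np.random.default_rng(seed)
    temps = np.linspace(0.0, 1.0, n_temps)
    x = prior_sample(n_particles, rng)            # (N, d) particles from the prior
    log_evidence = 0.0

    for t_prev, t in zip(temps[:-1], temps[1:]):
        # Reweight for the new temperature; pi_t is prior^(1-t) * target^t.
        inc = (t - t_prev) * (log_target(x) - log_prior(x))
        log_evidence += logsumexp(inc) - np.log(n_particles)

        # Resample proportionally to the incremental weights.
        w = np.exp(inc - logsumexp(inc))
        idx = rng.choice(n_particles, size=n_particles, p=w)
        x = x[idx]

        # Move: a few Metropolis steps targeting pi_t with the adapted covariance.
        cov = np.atleast_2d(np.cov(x, rowvar=False)) + 1e-6 * np.eye(x.shape[1])
        chol = np.linalg.cholesky(cov)
        log_pi = lambda z, t=t: (1.0 - t) * log_prior(z) + t * log_target(z)
        lp = log_pi(x)
        for _ in range(n_mcmc):
            prop = x + rng.standard_normal(x.shape) @ chol.T
            lp_prop = log_pi(prop)
            accept = np.log(rng.uniform(size=n_particles)) < lp_prop - lp
            x[accept], lp[accept] = prop[accept], lp_prop[accept]

    return x, log_evidence
```

    Because the proposal is adapted from the whole particle population rather than from a single chain's history, this kind of continuous adaptation does not threaten the validity of the sampler, which is the contrast with adaptive MCMC that the abstract draws.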

    Particle Gibbs with Ancestor Sampling

    Particle Markov chain Monte Carlo (PMCMC) is a systematic way of combining the two main tools used for Monte Carlo statistical inference: sequential Monte Carlo (SMC) and Markov chain Monte Carlo (MCMC). We present a novel PMCMC algorithm that we refer to as particle Gibbs with ancestor sampling (PGAS). PGAS provides the data analyst with an off-the-shelf class of Markov kernels that can be used to simulate the typically high-dimensional and highly autocorrelated state trajectory in a state-space model. The ancestor sampling procedure enables fast mixing of the PGAS kernel even when using seemingly few particles in the underlying SMC sampler. This is important as it can significantly reduce the computational burden that is typically associated with using SMC. PGAS is conceptually similar to the existing PG with backward simulation (PGBS) procedure. Instead of using separate forward and backward sweeps as in PGBS, however, we achieve the same effect in a single forward sweep. This makes PGAS well suited for addressing inference problems not only in state-space models, but also in models with more complex dependencies, such as non-Markovian, Bayesian nonparametric, and general probabilistic graphical models.
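    The core of PGAS is the ancestor-sampling step inside a conditional SMC sweep. The sketch below spells this out for a Markovian state-space model with a bootstrap proposal; the interfaces (`log_f`, `log_g`, `sample_f`, `sample_x0`) are hypothetical placeholders for the model's transition density, observation density, and samplers, not an API from the paper.

```python
import numpy as np

def pgas_sweep(y, x_ref, n_particles, log_f, log_g, sample_f, sample_x0, seed=0):
    """One conditional SMC sweep with ancestor sampling (Markovian SSM sketch).

    y      : observations, length T
    x_ref  : reference trajectory from the previous sweep, shape (T, d)
    log_f  : log transition density log f(x_t | x_{t-1}), vectorised over particles
    log_g  : log observation density log g(y_t | x_t), vectorised over particles
    sample_f, sample_x0 : samplers for the transition and initial distributions
    Returns a new trajectory drawn from the conditional particle system.
    """
    rng = np.random.default_rng(seed)
    T, N, d = len(y), n_particles, x_ref.shape[1]
    x = np.zeros((T, N, d))
    anc = np.zeros((T, N), dtype=int)

    # t = 0: N-1 fresh particles; slot N-1 always carries the reference path.
    x[0] = sample_x0(N, rng)
    x[0, -1] = x_ref[0]
    logw = log_g(y[0], x[0])

    for t in range(1, T):
        w = np.exp(logw - logw.max())
        w /= w.sum()

        # Standard multinomial resampling for the N-1 free particles.
        anc[t, :-1] = rng.choice(N, size=N - 1, p=w)

        # Ancestor sampling for the reference particle: each candidate ancestor
        # is weighted by its filter weight times the transition density to x_ref[t].
        logw_as = logw + log_f(x_ref[t], x[t - 1])
        w_as = np.exp(logw_as - logw_as.max())
        w_as /= w_as.sum()
        anc[t, -1] = rng.choice(N, p=w_as)

        # Propagate; slot N-1 is overwritten with the conditioned state.
        x[t] = sample_f(x[t - 1, anc[t]], rng)
        x[t, -1] = x_ref[t]
        logw = log_g(y[t], x[t])

    # Sample one trajectory by tracing ancestor indices backwards.
    w = np.exp(logw - logw.max())
    w /= w.sum()
    k = rng.choice(N, p=w)
    traj = np.empty((T, d))
    for t in reversed(range(T)):
        traj[t] = x[t, k]
        if t > 0:
            k = anc[t, k]
    return traj
```

    Iterating `pgas_sweep`, each time conditioning on the trajectory returned by the previous sweep, yields the PGAS Markov kernel on trajectories; the ancestor-sampling line is what renews the reference path in a single forward pass instead of a separate backward sweep.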

    Approximating multivariate posterior distribution functions from Monte Carlo samples for sequential Bayesian inference

    An important feature of Bayesian statistics is the opportunity to do sequential inference: the posterior distribution obtained after seeing a dataset can be used as prior for a second inference. However, when Monte Carlo sampling methods are used for inference, we only have a set of samples from the posterior distribution. To do sequential inference, we then either have to evaluate the second posterior at only these locations and reweight the samples accordingly, or we can estimate a functional description of the posterior probability distribution from the samples and use that as prior for the second inference. Here, we investigated to what extent we can obtain an accurate joint posterior from two datasets if the inference is done sequentially rather than jointly, under the condition that each inference step is done using Monte Carlo sampling. To test this, we evaluated the accuracy of kernel density estimates, Gaussian mixtures, vine copulas and Gaussian processes in approximating posterior distributions, and then tested whether these approximations can be used in sequential inference. In low dimensionality, Gaussian processes are more accurate, whereas in higher dimensionality Gaussian mixtures or vine copulas perform better. In our test cases, posterior approximations are preferable over direct sample reweighting, although joint inference is still preferable over sequential inference. Since the performance is case-specific, we provide the R package mvdens, which offers a unified interface to the density approximation methods.
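    The paper's mvdens package is in R; as a language-agnostic illustration of the idea, the snippet below (in Python, with a Gaussian kernel density estimate standing in for the approximation families compared in the abstract) turns stage-1 posterior samples into a functional prior for a stage-2 inference. Names such as `samples1` and `log_likelihood2` are illustrative assumptions.

```python
import numpy as np
from scipy.stats import gaussian_kde

def make_sequential_log_prior(samples1):
    """Approximate the stage-1 posterior with a kernel density estimate and
    return it as a log-prior for the stage-2 inference.

    samples1 : Monte Carlo samples from the first posterior, shape (n_samples, n_params).
    """
    kde = gaussian_kde(samples1.T)          # scipy expects (n_params, n_samples)
    return lambda theta: kde.logpdf(np.atleast_2d(theta).T)[0]

def log_posterior_stage2(theta, log_likelihood2, log_prior_from_stage1):
    """Unnormalised stage-2 log-posterior: the stage-1 posterior acts as the prior."""
    return log_prior_from_stage1(theta) + log_likelihood2(theta)
```

    Any sampler can then be run on `log_posterior_stage2`; this is the functional-approximation route the abstract contrasts with reweighting the fixed stage-1 samples.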

    A Box Regularized Particle Filter for state estimation with severely ambiguous and non-linear measurements

    The first stage in any control system is to accurately estimate the system's state. However, some types of measurements are ambiguous (non-injective) in terms of state. Existing algorithms for such problems, such as Monte Carlo methods, are computationally expensive or not robust to such ambiguity. We propose the Box Regularized Particle Filter (BRPF) to resolve these problems. Based on previous work on box particle filters, we present a more generic and accurate formulation of the algorithm, with two innovations: a generalized box resampling step and a kernel smoothing method, which is shown to be optimal in terms of Mean Integrated Square Error. Monte Carlo simulations demonstrate the efficiency of BRPF on a severely ambiguous and non-linear estimation problem, that of Terrain Aided Navigation. BRPF is compared to the Sequential Importance Resampling Particle Filter (SIR-PF), Markov chain Monte Carlo (MCMC), and the original Box Particle Filter (BPF). The algorithm outperforms existing methods in terms of Root Mean Square Error (e.g., an improvement of up to 42% in geographical position estimation with respect to the BPF) for a large initial uncertainty. The BRPF also reduces the computational load by 73% and 90% relative to the SIR-PF and MCMC, respectively, for similar RMSE values. This work offers an accurate (in terms of RMSE) and robust (in terms of divergence rate) way to tackle state estimation from ambiguous measurements while requiring a significantly lower computational load than classic Monte Carlo and particle filtering methods.
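    The box-particle machinery of the BRPF is not reproduced here, but the kernel-smoothing idea it builds on can be illustrated with a generic regularised-resampling step: after resampling, particles are jittered by a kernel whose bandwidth follows a rule-of-thumb choice aimed at controlling the Mean Integrated Square Error. This is a sketch of the general regularised particle filter step under a Gaussian-kernel assumption, not the authors' algorithm.

```python
import numpy as np

def regularised_resample(particles, weights, seed=0):
    """Resample, then jitter with a Gaussian kernel (regularised particle filter step).

    particles : shape (n, d) particle positions
    weights   : shape (n,) importance weights (need not be normalised)
    """
    rng = np.random.default_rng(seed)
    n, d = particles.shape

    # Multinomial resampling according to the importance weights.
    idx = rng.choice(n, size=n, p=weights / weights.sum())
    resampled = particles[idx]

    # Rule-of-thumb (Silverman-type) bandwidth, scaled by the weighted sample
    # covariance; the Gaussian-kernel analogue of a MISE-motivated choice.
    cov = np.atleast_2d(np.cov(particles, rowvar=False, aweights=weights)) + 1e-12 * np.eye(d)
    h = (4.0 / (n * (d + 2.0))) ** (1.0 / (d + 4.0))
    jitter = rng.multivariate_normal(np.zeros(d), (h ** 2) * cov, size=n)

    return resampled + jitter
```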