107 research outputs found

    Particle Gibbs with Ancestor Sampling

    Particle Markov chain Monte Carlo (PMCMC) is a systematic way of combining the two main tools used for Monte Carlo statistical inference: sequential Monte Carlo (SMC) and Markov chain Monte Carlo (MCMC). We present a novel PMCMC algorithm that we refer to as particle Gibbs with ancestor sampling (PGAS). PGAS provides the data analyst with an off-the-shelf class of Markov kernels that can be used to simulate the typically high-dimensional and highly autocorrelated state trajectory in a state-space model. The ancestor sampling procedure enables fast mixing of the PGAS kernel even when using seemingly few particles in the underlying SMC sampler. This is important as it can significantly reduce the computational burden that is typically associated with using SMC. PGAS is conceptually similar to the existing PG with backward simulation (PGBS) procedure. Instead of using separate forward and backward sweeps as in PGBS, however, we achieve the same effect in a single forward sweep. This makes PGAS well suited for addressing inference problems not only in state-space models, but also in models with more complex dependencies, such as non-Markovian, Bayesian nonparametric, and general probabilistic graphical models.
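The conditional SMC sweep with ancestor sampling described in the abstract can be sketched for a toy linear-Gaussian state-space model. This is a minimal illustrative sketch, not the paper's implementation; the model, parameter values, and particle count are all assumptions chosen for brevity:

```python
import numpy as np

def pgas_kernel(y, x_ref, a=0.9, q=0.1, r=1.0, N=20, rng=None):
    """One PGAS sweep for the toy model
    x_t = a*x_{t-1} + N(0, q),  y_t = x_t + N(0, r).
    Given a reference trajectory x_ref, returns a new trajectory."""
    rng = np.random.default_rng() if rng is None else rng
    T = len(y)
    x = np.zeros((T, N))
    anc = np.zeros((T, N), dtype=int)

    # t = 0: sample from the prior, pin particle N-1 to the reference
    x[0] = rng.normal(0.0, np.sqrt(q), N)
    x[0, -1] = x_ref[0]
    logw = -0.5 * (y[0] - x[0])**2 / r
    for t in range(1, T):
        w = np.exp(logw - logw.max()); w /= w.sum()
        anc[t, :-1] = rng.choice(N, size=N - 1, p=w)
        # ancestor sampling for the reference particle: reweight by the
        # transition density f(x_ref[t] | x_{t-1}^i)
        logv = np.log(w) - 0.5 * (x_ref[t] - a * x[t-1])**2 / q
        v = np.exp(logv - logv.max()); v /= v.sum()
        anc[t, -1] = rng.choice(N, p=v)
        x[t] = a * x[t-1, anc[t]] + rng.normal(0.0, np.sqrt(q), N)
        x[t, -1] = x_ref[t]  # pin the reference state
        logw = -0.5 * (y[t] - x[t])**2 / r

    # draw one particle from the final weights and trace its ancestry back
    w = np.exp(logw - logw.max()); w /= w.sum()
    k = rng.choice(N, p=w)
    traj = np.zeros(T)
    for t in reversed(range(T)):
        traj[t] = x[t, k]
        if t > 0:
            k = anc[t, k]
    return traj
```

Iterating this kernel, each sweep conditioned on the previous output, yields the Markov chain on trajectories; the single forward sweep replaces the separate backward pass of PGBS.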

    Identification of a Duffing oscillator using particle Gibbs with ancestor sampling

    The Duffing oscillator remains a key benchmark in nonlinear systems analysis and poses interesting challenges in nonlinear structural identification. The use of particle methods, or sequential Monte Carlo (SMC), is becoming a more common approach for tackling these nonlinear dynamical systems, within structural dynamics and beyond. This paper demonstrates the use of a tailored SMC algorithm within a Markov chain Monte Carlo (MCMC) scheme to allow inference over the latent states and parameters of the Duffing oscillator in a Bayesian manner. This approach to system identification offers a statistically more rigorous treatment of the problem than the common state-augmentation methods, where the parameters of the model are included as additional latent states. It is shown how recent advances in particle MCMC methods, namely the particle Gibbs with ancestor sampling (PG-AS) algorithm, are capable of performing efficient Bayesian inference, even in cases where little is known about the system parameters a priori. The advantage of this Bayesian approach is the quantification of uncertainty, not only in the system parameters but also in the states of the model (displacement and velocity), even in the presence of measurement noise.
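For concreteness, the latent dynamics being identified can be written as a discretized state-space model over displacement and velocity. The explicit-Euler step and parameter values below are a hypothetical sketch, not the discretization or parameters used in the paper:

```python
import numpy as np

def duffing_step(state, u, dt=0.01, c=0.1, k=1.0, k3=5.0, m=1.0):
    """One explicit-Euler step of the forced Duffing oscillator
    m*x'' + c*x' + k*x + k3*x**3 = u, with state = (displacement, velocity).
    In the Bayesian identification setting these two states are latent and
    only a noisy displacement measurement y_t = x_t + e_t is observed."""
    x, v = state
    acc = (u - c * v - k * x - k3 * x**3) / m
    return np.array([x + dt * v, v + dt * acc])
```

A PG-AS sampler would then target the joint posterior over the trajectory of (x, v) and the unknown parameters (e.g. c, k, k3) given the noisy displacement record.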

    Infinite Factorial Finite State Machine for Blind Multiuser Channel Estimation

    New communication standards need to deal with machine-to-machine communications, in which users may start or stop transmitting at any time in an asynchronous manner. Thus, the number of users is an unknown and time-varying parameter that needs to be accurately estimated in order to properly recover the symbols transmitted by all users in the system. In this paper, we address the problem of joint channel parameter and data estimation in a multiuser communication channel in which the number of transmitters is not known. For that purpose, we develop the infinite factorial finite state machine model, a Bayesian nonparametric model based on the Markov Indian buffet process that allows for an unbounded number of transmitters with arbitrary channel length. We propose an inference algorithm that makes use of slice sampling and particle Gibbs with ancestor sampling. Our approach is fully blind, as it does not require a prior channel estimation step, prior knowledge of the number of transmitters, or any signaling information. Our experimental results, loosely based on the LTE random access channel, show that the proposed approach can effectively recover the data-generating process for a wide range of scenarios, with varying numbers of transmitters, numbers of receivers, constellation orders, channel lengths, and signal-to-noise ratios.

    Parameter elimination in particle Gibbs sampling

    Bayesian inference in state-space models is challenging due to high-dimensional state trajectories. A viable approach is particle Markov chain Monte Carlo, which combines MCMC and sequential Monte Carlo to form "exact approximations" of otherwise intractable MCMC methods. The performance of the approximation is limited by that of the exact method. We focus on particle Gibbs and particle Gibbs with ancestor sampling, improving their performance beyond that of the underlying Gibbs sampler (which they approximate) by marginalizing out one or more parameters. This is possible when the parameter prior is conjugate to the complete-data likelihood. Marginalization yields a non-Markovian model for inference, but we show that, in contrast to the general case, this method still scales linearly in time. While marginalization can be cumbersome to implement, recent advances in probabilistic programming have enabled its automation. We demonstrate that the marginalized methods are viable as efficient inference backends in probabilistic programming, with examples from ecology and epidemiology.
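The linear-in-time scaling after marginalization comes from conjugacy reducing the non-Markovian predictive to a few running sufficient statistics. A minimal sketch of this idea, assuming a random-walk state with unknown noise variance under an inverse-gamma prior (the model and prior are illustrative choices, not the paper's examples):

```python
import math

class MarginalizedRandomWalk:
    """Random walk x_t = x_{t-1} + N(0, q), with the variance q marginalized
    under an inverse-gamma(a0, b0) prior. Keeping only the running sufficient
    statistics (shape a, rate b) makes each predictive evaluation and update
    O(1), so a sweep over t steps of the non-Markovian model remains O(t)."""
    def __init__(self, a0=2.0, b0=1.0):
        self.a, self.b, self.n = a0, b0, 0

    def logpred(self, r):
        # Student-t predictive log-density of the next increment r,
        # with nu = 2a degrees of freedom and scale^2 = b/a
        nu = 2.0 * self.a
        scale2 = self.b / self.a
        return (math.lgamma((nu + 1) / 2) - math.lgamma(nu / 2)
                - 0.5 * math.log(nu * math.pi * scale2)
                - (nu + 1) / 2 * math.log1p(r * r / (nu * scale2)))

    def update(self, r):
        # conjugate posterior update after observing increment r
        self.a += 0.5
        self.b += 0.5 * r * r
        self.n += 1
```

Inside a particle Gibbs sweep, each particle would carry its own copy of these statistics, so the marginalized transition can be evaluated without revisiting the full history.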

    Particle Gibbs with Ancestor Sampling Methods for Unobserved Component Time Series Models with Heavy Tails, Serial Dependence and Structural Breaks

    Particle Gibbs with ancestor sampling (PG-AS) is a new tool in the family of sequential Monte Carlo methods. We apply PG-AS to the challenging class of unobserved component time series models and demonstrate its flexibility under different circumstances. We also incorporate discrete structural breaks within the unobserved component framework. We do this by modeling and forecasting time series characteristics of postwar US inflation using a long-memory autoregressive fractionally integrated moving average model with stochastic volatility, allowing for structural breaks in the level and in the long- and short-memory parameters, contemporaneously with breaks in the level, persistence, and conditional volatility of the volatility of inflation.