Particle Markov chain Monte Carlo (PMCMC) is a systematic way of combining
the two main tools used for Monte Carlo statistical inference: sequential Monte
Carlo (SMC) and Markov chain Monte Carlo (MCMC). We present a novel PMCMC
algorithm that we refer to as particle Gibbs with ancestor sampling (PGAS).
PGAS provides the data analyst with an off-the-shelf class of Markov kernels
that can be used to simulate the typically high-dimensional and highly
autocorrelated state trajectory in a state-space model. The ancestor sampling
procedure enables fast mixing of the PGAS kernel even when using seemingly few
particles in the underlying SMC sampler. This is important as it can
significantly reduce the computational burden that is typically associated with
using SMC. PGAS is conceptually similar to the existing particle Gibbs with
backward simulation (PGBS) procedure. Instead of using separate forward and backward
sweeps as in PGBS, however, we achieve the same effect in a single forward
sweep. This makes PGAS well suited for addressing inference problems not only
in state-space models, but also in models with more complex dependencies, such
as non-Markovian, Bayesian nonparametric, and general probabilistic graphical
models.
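To make the single-forward-sweep idea concrete, the following is a minimal sketch of one PGAS sweep for a scalar linear-Gaussian state-space model with a bootstrap proposal. The model, its parameters (a, q, r), and the helper name pgas_kernel are illustrative assumptions, not taken from the paper; the sketch only shows where the ancestor-sampling step sits inside the forward pass.

import numpy as np

def pgas_kernel(y, x_ref, N=20, a=0.9, q=1.0, r=1.0, rng=None):
    """One PGAS sweep for the (assumed) model
    x_t = a*x_{t-1} + N(0, q), y_t = x_t + N(0, r), bootstrap proposal."""
    rng = np.random.default_rng() if rng is None else rng
    T = len(y)
    x = np.zeros((T, N))                # particle states
    anc = np.zeros((T, N), dtype=int)   # ancestor indices

    # t = 0: sample from the prior, pin the last particle to the reference trajectory
    x[0] = rng.normal(0.0, np.sqrt(q), size=N)
    x[0, -1] = x_ref[0]
    logw = -0.5 * (y[0] - x[0]) ** 2 / r

    for t in range(1, T):
        w = np.exp(logw - logw.max())
        w /= w.sum()
        # resample ancestors for the N-1 "free" particles and propagate them
        anc[t, :-1] = rng.choice(N, size=N - 1, p=w)
        x[t, :-1] = a * x[t - 1, anc[t, :-1]] + rng.normal(0.0, np.sqrt(q), size=N - 1)
        # keep the reference state, but draw its ancestor (ancestor sampling):
        # probability proportional to w_{t-1}^i * f(x_ref[t] | x_{t-1}^i)
        x[t, -1] = x_ref[t]
        log_as = logw - 0.5 * (x_ref[t] - a * x[t - 1]) ** 2 / q
        w_as = np.exp(log_as - log_as.max())
        anc[t, -1] = rng.choice(N, p=w_as / w_as.sum())
        # reweight all particles against the observation
        logw = -0.5 * (y[t] - x[t]) ** 2 / r

    # draw an output trajectory by sampling an index and tracing its ancestry back
    w = np.exp(logw - logw.max())
    k = rng.choice(N, p=w / w.sum())
    x_new = np.zeros(T)
    for t in reversed(range(T)):
        x_new[t] = x[t, k]
        if t > 0:
            k = anc[t, k]
    return x_new

Iterating pgas_kernel, feeding each returned trajectory back in as the next reference, yields a Markov chain on state trajectories; the ancestor-sampling draw is what breaks the path degeneracy that would otherwise tie the output to the reference trajectory when N is small.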