Getting Started with Particle Metropolis-Hastings for Inference in Nonlinear Dynamical Models
This tutorial provides a gentle introduction to the particle
Metropolis-Hastings (PMH) algorithm for parameter inference in nonlinear
state-space models together with a software implementation in the statistical
programming language R. We employ a step-by-step approach to develop an
implementation of the PMH algorithm (and the particle filter within) together
with the reader. This final implementation is also available as the package
pmhtutorial in the CRAN repository. Throughout the tutorial, we provide some
intuition as to how the algorithm operates and discuss some solutions to
problems that might occur in practice. To illustrate the use of PMH, we
consider parameter inference in a linear Gaussian state-space model with
synthetic data and a nonlinear stochastic volatility model with real-world
data.
Comment: 41 pages, 7 figures. In press for Journal of Statistical Software. Source code for R, Python and MATLAB available at: https://github.com/compops/pmh-tutoria
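The core of the tutorial is the pairing of a bootstrap particle filter, which estimates the likelihood, with a random-walk Metropolis-Hastings sampler that uses this estimate in its acceptance ratio. The sketch below is a rough Python illustration of that pairing (not the pmhtutorial package itself) for a linear Gaussian model x_t = phi*x_{t-1} + sigma_v*v_t, y_t = x_t + sigma_e*e_t; the parameter values, particle count, initial distribution and flat prior on phi are assumptions made only for the example.

```python
import numpy as np

def bootstrap_pf(y, phi, sigma_v=1.0, sigma_e=0.5, n_particles=200, rng=None):
    """Bootstrap particle filter; returns the log of the particle likelihood estimate."""
    rng = np.random.default_rng() if rng is None else rng
    x = rng.normal(0.0, sigma_v, n_particles)        # initial particles (assumed N(0, sigma_v^2))
    log_lik = 0.0
    for t in range(len(y)):
        x = phi * x + sigma_v * rng.normal(size=n_particles)          # propagate through the dynamics
        log_w = -0.5 * ((y[t] - x) / sigma_e) ** 2 - np.log(sigma_e * np.sqrt(2.0 * np.pi))
        max_w = log_w.max()
        w = np.exp(log_w - max_w)
        log_lik += max_w + np.log(w.mean())           # accumulate log-likelihood contribution
        x = rng.choice(x, size=n_particles, p=w / w.sum())            # multinomial resampling
    return log_lik

def pmh(y, n_iter=1000, step=0.1, rng=None):
    """Random-walk PMH for phi, with a flat prior on (-1, 1) assumed for the example."""
    rng = np.random.default_rng() if rng is None else rng
    phi, ll = 0.5, bootstrap_pf(y, 0.5, rng=rng)
    trace = np.empty(n_iter)
    for i in range(n_iter):
        phi_prop = phi + step * rng.normal()          # symmetric random-walk proposal
        if abs(phi_prop) < 1.0:
            ll_prop = bootstrap_pf(y, phi_prop, rng=rng)
            if np.log(rng.uniform()) < ll_prop - ll:  # accept/reject with estimated likelihoods
                phi, ll = phi_prop, ll_prop
        trace[i] = phi
    return trace
```

Calling pmh(y) on a simulated series then returns a Markov chain over phi whose histogram approximates the posterior.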
Constructing Metropolis-Hastings proposals using damped BFGS updates
The computation of Bayesian estimates of system parameters and functions of
them on the basis of observed system performance data is a common problem
within system identification. This issue has been studied previously using stochastic simulation approaches based on the popular Metropolis-Hastings (MH) algorithm, and a recognised difficulty is tuning the proposal distribution so that the MH method provides realisations with sufficient mixing to deliver efficient convergence. This paper proposes and empirically examines a method of tuning
the proposal using ideas borrowed from the numerical optimisation literature
around efficient computation of Hessians so that gradient and curvature
information of the target posterior can be incorporated in the proposal.
Comment: 16 pages, 2 figures. Accepted for publication in the Proceedings of the 18th IFAC Symposium on System Identification (SYSID).
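The sketch below shows one common way of combining these ingredients: a Powell-damped BFGS update of a curvature matrix B (an approximation of the negative Hessian of the log posterior) and a Gaussian proposal whose mean and covariance are shaped by B and by the gradient of the log posterior. This is only an illustration of the general construction, not the paper's algorithm; access to grad_log_post, the damping constant 0.2 and the step size are placeholder assumptions.

```python
import numpy as np

def damped_bfgs_update(B, s, y, damping=0.2):
    """Powell-damped BFGS update of the curvature matrix B.

    s is the step between parameter iterates and y the corresponding change in the
    gradient of the negative log posterior; damping keeps B positive definite.
    """
    Bs = B @ s
    sBs = s @ Bs
    sy = s @ y
    if sy < damping * sBs:                              # curvature condition violated: damp y
        theta = (1.0 - damping) * sBs / (sBs - sy)
        y = theta * y + (1.0 - theta) * Bs
    return B - np.outer(Bs, Bs) / sBs + np.outer(y, y) / (y @ s)

def gaussian_proposal(theta, grad_log_post, B, step=1.0, rng=None):
    """Draw from N(theta + 0.5*step^2 * B^{-1} grad_log_post, step^2 * B^{-1})."""
    rng = np.random.default_rng() if rng is None else rng
    cov = step ** 2 * np.linalg.inv(B)                  # proposal covariance from the curvature estimate
    mean = theta + 0.5 * cov @ grad_log_post            # gradient-informed proposal mean
    return rng.multivariate_normal(mean, cov)
```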
Nudging the particle filter
We investigate a new sampling scheme aimed at improving the performance of
particle filters whenever (a) there is a significant mismatch between the
assumed model dynamics and the actual system, or (b) the posterior probability
tends to concentrate in relatively small regions of the state space. The
proposed scheme pushes some particles towards specific regions where the
likelihood is expected to be high, an operation known as nudging in the
geophysics literature. We re-interpret nudging in a form applicable to any
particle filtering scheme, as it does not involve any changes in the rest of
the algorithm. Since the particles are modified, but the importance weights do
not account for this modification, the use of nudging leads to additional bias
in the resulting estimators. However, we prove analytically that nudged
particle filters can still attain asymptotic convergence with the same error
rates as conventional particle methods. Simple analysis also yields an
alternative interpretation of the nudging operation that explains its
robustness to model errors. Finally, we show numerical results that illustrate
the improvements that can be attained using the proposed scheme. In particular,
we present nonlinear tracking examples with synthetic data and a model
inference example using real-world financial data.
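A minimal way to picture the nudging operation is a gradient step applied to a subset of the particles after propagation, with the importance weights still computed by the unmodified formula. The sketch below assumes a hypothetical grad_log_lik function, step size and nudged fraction; it illustrates the general idea rather than the specific nudging schemes analysed in the paper.

```python
import numpy as np

def nudge(particles, grad_log_lik, y_t, step=0.1, frac=0.5, rng=None):
    """Push a random subset of particles towards regions of higher likelihood."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(particles)
    idx = rng.choice(n, size=int(frac * n), replace=False)   # particles selected for nudging
    nudged = particles.copy()
    # gradient step on the log-likelihood of the current observation y_t
    nudged[idx] = particles[idx] + step * grad_log_lik(y_t, particles[idx])
    return nudged  # importance weights are subsequently computed as if no nudging had occurred
```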
Designing Proposal Distributions for Particle Filters using Integrated Nested Laplace Approximation
State-space models are used to describe and analyse dynamical systems. They
are ubiquitously used in many scientific fields such as signal processing,
finance and ecology, to name a few. Particle filters are popular inferential methods for state-space models. Integrated Nested Laplace Approximation (INLA), an approximate Bayesian inference method, can also be used for this kind of model when the transition distribution is Gaussian. We present a way to use this framework to approximate the particle filter's proposal distribution so that it incorporates information about the observations, the parameters and the previous latent variables. Further, we demonstrate the performance of this proposal on data simulated from a Poisson state-space model used for count data. We also show how INLA can be used to estimate the parameters of certain state-space models (a task that is often challenging), which can then be used within sequential Monte Carlo algorithms.
Comment: 13 pages, 4 figures.
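To illustrate the flavour of an observation-informed proposal for a Poisson model (without calling the INLA software itself), the sketch below builds a one-dimensional Laplace (Gaussian) approximation of p(x_t | x_{t-1}, y_t) with a few Newton steps and samples from it. The model form x_t = phi*x_{t-1} + sigma*v_t with y_t ~ Poisson(exp(x_t)), the initialisation at the prior mean and all names are assumptions made for the example; in a particle filter the importance weights would also have to account for the ratio of the transition and observation densities to this proposal density.

```python
import numpy as np

def laplace_proposal(x_prev, y_t, phi, sigma, n_newton=10):
    """Mean and std of a Gaussian approximation to p(x_t | x_prev, y_t), per particle."""
    m = phi * np.asarray(x_prev, dtype=float)    # prior mean of x_t given each particle
    x = m.copy()                                 # start the Newton iteration at the prior mean
    for _ in range(n_newton):
        grad = y_t - np.exp(x) - (x - m) / sigma ** 2   # d/dx of log[Poisson(y_t|e^x) N(x; m, sigma^2)]
        hess = -np.exp(x) - 1.0 / sigma ** 2            # second derivative (always negative)
        x = x - grad / hess                             # Newton step towards the mode
    return x, np.sqrt(-1.0 / hess)               # Laplace mean (mode) and standard deviation

def propose(x_prev, y_t, phi, sigma, rng=None):
    """Sample the next latent state from the Laplace-approximated proposal."""
    rng = np.random.default_rng() if rng is None else rng
    mean, sd = laplace_proposal(x_prev, y_t, phi, sigma)
    return rng.normal(mean, sd)
```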
Sequential Monte Carlo Methods in the nimble and nimbleSMC R Packages
nimble is an R package for constructing algorithms and conducting inference on hierarchical models. The nimble package provides a unique combination of flexible model specification and the ability to program model-generic algorithms. Specifically, the package allows users to code models in the BUGS language, and it allows users to write algorithms that can be applied to any appropriate model. In this paper, we introduce the nimbleSMC R package. nimbleSMC contains algorithms for state-space model analysis using sequential Monte Carlo (SMC) techniques that are built using nimble. We first provide an overview of state-space models and commonly used SMC algorithms. We then describe how to build a state-space model in nimble and conduct inference using existing SMC algorithms within nimbleSMC. SMC algorithms within nimbleSMC currently include the bootstrap filter, auxiliary particle filter, ensemble Kalman filter, IF2 method of iterated filtering, and a particle Markov chain Monte Carlo (MCMC) sampler. These algorithms can be run in R or compiled into C++ for more efficient execution. Examples of applying SMC algorithms to linear autoregressive models and a stochastic volatility model are provided. Finally, we give an overview of how model-generic algorithms are coded within nimble by providing code for a simple SMC algorithm. This illustrates how users can easily extend nimble's SMC methods in high-level code.
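As a language-agnostic illustration of one of the algorithms listed above, the sketch below shows the analysis step of a stochastic (perturbed-observation) ensemble Kalman filter in plain NumPy. The observation matrix H, observation-noise covariance R and all variable names are assumptions made for the example and are unrelated to nimbleSMC's actual interface.

```python
import numpy as np

def enkf_update(ensemble, y, H, R, rng=None):
    """One analysis step: update a forecast ensemble (n_members x d) with observation y."""
    rng = np.random.default_rng() if rng is None else rng
    n, d = ensemble.shape
    P = np.cov(ensemble, rowvar=False)                   # sample forecast covariance (d x d)
    S = H @ P @ H.T + R                                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)                       # Kalman gain
    y_pert = y + rng.multivariate_normal(np.zeros(len(y)), R, size=n)  # perturbed observations
    innov = y_pert - ensemble @ H.T                      # per-member innovations
    return ensemble + innov @ K.T                        # analysis ensemble
```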
Efficient Learning of the Parameters of Non-Linear Models using Differentiable Resampling in Particle Filters
It has been widely documented that the sampling and resampling steps in
particle filters cannot be differentiated. The reparameterisation trick was introduced to allow the sampling step to be reformulated into a differentiable function. We extend the reparameterisation trick to include the stochastic input to resampling, thereby limiting the
discontinuities in the gradient calculation after this step. Knowing the
gradients of the prior and likelihood allows us to run particle Markov Chain
Monte Carlo (p-MCMC) and use the No-U-Turn Sampler (NUTS) as the proposal when
estimating parameters.
We compare the Metropolis-adjusted Langevin algorithm (MALA), Hamiltonian Monte Carlo with different numbers of steps, and NUTS. We consider two
state-space models and show that NUTS improves the mixing of the Markov chain
and can produce more accurate results in less computational time.
Comment: 35 pages, 10 figures.
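As one concrete example of the gradient-based parameter moves compared in the paper, the sketch below shows a single MALA accept/reject step. It assumes a hypothetical estimate_log_post_and_grad function that returns the particle filter's log-posterior estimate and its gradient (e.g. obtained via differentiable resampling); the step size eps is a placeholder, and this is not the paper's implementation.

```python
import numpy as np

def mala_step(theta, log_post, grad, estimate_log_post_and_grad, eps=0.05, rng=None):
    """One Metropolis-adjusted Langevin step for the model parameters theta."""
    rng = np.random.default_rng() if rng is None else rng
    mean_fwd = theta + 0.5 * eps ** 2 * grad                 # Langevin drift towards higher posterior
    theta_prop = mean_fwd + eps * rng.normal(size=theta.shape)
    log_post_prop, grad_prop = estimate_log_post_and_grad(theta_prop)
    mean_rev = theta_prop + 0.5 * eps ** 2 * grad_prop
    # Gaussian proposal log-densities q(theta'|theta) and q(theta|theta'), up to a common constant
    log_q_fwd = -np.sum((theta_prop - mean_fwd) ** 2) / (2 * eps ** 2)
    log_q_rev = -np.sum((theta - mean_rev) ** 2) / (2 * eps ** 2)
    log_alpha = log_post_prop - log_post + log_q_rev - log_q_fwd
    if np.log(rng.uniform()) < log_alpha:                    # accept
        return theta_prop, log_post_prop, grad_prop, True
    return theta, log_post, grad, False                      # reject
```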
- …