SMCTC: Sequential Monte Carlo in C++
Sequential Monte Carlo methods are a very general class of Monte Carlo methods for sampling from sequences of distributions. Simple examples of these algorithms are used very widely in the tracking and signal processing literature. Recent developments illustrate that these techniques have much more general applicability, and can be applied very effectively to statistical inference problems. Unfortunately, these methods are often perceived as being computationally expensive and difficult to implement. This article seeks to address both of these problems. A C++ template class library for the efficient and convenient implementation of very general Sequential Monte Carlo algorithms is presented. Two example applications are provided: a simple particle filter for illustrative purposes and a state-of-the-art algorithm for rare event estimation.
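The "simple particle filter" the abstract mentions can be conveyed in a few lines. The following is a minimal Python sketch, not the SMCTC C++ library itself; the Gaussian random-walk state-space model, the parameter values and the particle count are illustrative assumptions only.

```python
import math
import random

random.seed(1)

def resample(xs, ws):
    """Multinomial resampling: draw len(xs) particles in proportion to ws."""
    cum, total = [], 0.0
    for w in ws:
        total += w
        cum.append(total)
    out = []
    for _ in range(len(xs)):
        u = random.random() * total
        i = 0
        while cum[i] < u:
            i += 1
        out.append(xs[i])
    return out

def bootstrap_filter(ys, n_particles=500, sigma_x=1.0, sigma_y=1.0):
    """Bootstrap particle filter for the toy random-walk model
       x_t = x_{t-1} + N(0, sigma_x^2),   y_t = x_t + N(0, sigma_y^2)."""
    xs = [random.gauss(0.0, sigma_x) for _ in range(n_particles)]
    means = []
    for y in ys:
        # propagate each particle through the prior dynamics
        xs = [x + random.gauss(0.0, sigma_x) for x in xs]
        # weight each particle by the likelihood of the new observation
        ws = [math.exp(-0.5 * ((y - x) / sigma_y) ** 2) for x in xs]
        means.append(sum(w * x for w, x in zip(ws, xs)) / sum(ws))
        xs = resample(xs, ws)
    return means

ys = [0.5, 1.0, 1.3, 2.1, 1.8]
est = bootstrap_filter(ys)
print(est)
```

The propagate-weight-resample loop above is exactly the structure that a template library such as SMCTC abstracts away, leaving the user to supply only the model-specific proposal and weight functions.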
Pointwise Convergence in Probability of General Smoothing Splines
Establishing the convergence of splines can be cast as a variational problem
which is amenable to a Γ-convergence approach. We consider the case in
which the regularization coefficient scales with the number of observations, n.
Using standard theorems from the Γ-convergence literature, we prove that the
general spline model is consistent, in that estimators converge in a sense
slightly weaker than weak convergence in probability. Without further
assumptions we show this rate is sharp. This differs from rates for strong
convergence using Hilbert scales, where the rate can often be chosen.
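The regularized estimators studied in the abstract have a simple discrete analogue. As a hedged illustration only (a first-difference penalty on gridded data, far simpler than the general spline model analysed in the paper), the following sketch minimizes sum_i (z_i - y_i)^2 + lam * sum_i (z_{i+1} - z_i)^2 by solving the tridiagonal normal equations (I + lam * D^T D) z = y with the Thomas algorithm:

```python
def smooth(y, lam):
    """Penalized least-squares smoother via the Thomas algorithm on the
    tridiagonal system (I + lam * D^T D) z = y, D the first-difference map."""
    n = len(y)
    if n == 1:
        return list(y)
    a = [-lam] * n               # sub-diagonal (a[0] unused)
    c = [-lam] * n               # super-diagonal (c[-1] unused)
    b = [1.0 + 2.0 * lam] * n    # main diagonal
    b[0] = b[-1] = 1.0 + lam
    # forward elimination
    cp = [0.0] * n
    dp = [0.0] * n
    cp[0] = c[0] / b[0]
    dp[0] = y[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (y[i] - a[i] * dp[i - 1]) / m
    # back substitution
    z = [0.0] * n
    z[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        z[i] = dp[i] - cp[i] * z[i + 1]
    return z

noisy = [0.0, 1.2, 0.8, 2.2, 1.9, 3.1]
z = smooth(noisy, lam=5.0)
print(z)
```

Because the rows of I + lam * D^T D each sum to one, the smoother preserves the total of the data exactly while shrinking its variation; how lam should scale with the number of observations is precisely the question the paper's Γ-convergence analysis addresses.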
Convergence and Rates for Fixed-Interval Multiple-Track Smoothing Using k-Means Type Optimization
We address the task of estimating multiple trajectories from unlabeled data.
This problem arises in many settings: one could think, for example, of the
construction of maps of transport networks from passive observation of
travellers, or the reconstruction of the behaviour of uncooperative vehicles
from external observations. There are two coupled problems. The first is a
data association problem: how to map data points onto individual trajectories.
The second is, given a solution to the data association problem, to estimate
those trajectories. We construct estimators as a solution to a regularized
variational problem (to which approximate solutions can be obtained via the
simple, efficient and widespread k-means method) and show that, as the number
of data points, n, increases, these estimators exhibit stable behaviour. More
precisely, we show that they converge in probability, with an explicit rate,
in an appropriate Sobolev space.
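The coupled association/estimation structure described above is what Lloyd's k-means iteration alternates over. The following sketch applies plain k-means to synthetic points scattered around two horizontal "tracks"; the data, the initialization and the cluster-centre representation are illustrative assumptions, not the paper's trajectory-valued construction.

```python
import random

random.seed(0)

def kmeans(points, k, iters=20):
    """Lloyd's algorithm: alternate the data-association step (assign each
    point to its nearest centre) with the estimation step (recompute each
    centre as the mean of its assigned points)."""
    # deterministic spread initialization keeps the sketch reproducible
    centres = [points[i * (len(points) - 1) // (k - 1)] for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            d = [(p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2 for c in centres]
            clusters[d.index(min(d))].append(p)
        for j, cl in enumerate(clusters):
            if cl:
                centres[j] = (sum(p[0] for p in cl) / len(cl),
                              sum(p[1] for p in cl) / len(cl))
    return centres

# unlabeled observations scattered around two horizontal "tracks"
pts = [(x / 10.0, 0.0 + random.gauss(0.0, 0.05)) for x in range(10)] + \
      [(x / 10.0, 2.0 + random.gauss(0.0, 0.05)) for x in range(10)]
centres = sorted(kmeans(pts, 2), key=lambda c: c[1])
print(centres)
```

In the paper the "centres" are whole trajectories in a Sobolev space rather than points, but the alternation between association and estimation is the same.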
A Simple Approach to Maximum Intractable Likelihood Estimation
Approximate Bayesian Computation (ABC) can be viewed as an analytic
approximation of an intractable likelihood coupled with an elementary
simulation step. Such a view, combined with a suitable instrumental prior
distribution permits maximum-likelihood (or maximum-a-posteriori) inference to
be conducted, approximately, using essentially the same techniques. An
elementary approach to this problem is developed here: it simply obtains a
nonparametric approximation of the likelihood surface, which is then used as a
smooth proxy for the likelihood in a subsequent maximisation step, and the
convergence of this class of algorithms is characterised theoretically. The use
of non-sufficient summary statistics in this context is considered. Applying
the proposed method to four problems demonstrates good performance. The
proposed approach provides an alternative for approximating the maximum
likelihood estimator (MLE) in complex scenarios.
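The idea of maximizing a simulation-based likelihood approximation can be sketched crudely as follows. This is a rejection-style proxy (an indicator within a tolerance eps, not the smooth nonparametric surface developed in the paper), for an assumed toy model y_i ~ N(theta, 1) with the sample mean as summary statistic; all parameter values are illustrative.

```python
import random

random.seed(3)

def abc_likelihood(theta, obs_mean, n_obs, n_sims=400, eps=0.1):
    """Crude ABC likelihood proxy at theta: the proportion of simulated
    datasets whose summary statistic (the sample mean) lands within eps
    of the observed summary."""
    hits = 0
    for _ in range(n_sims):
        sim_mean = sum(random.gauss(theta, 1.0) for _ in range(n_obs)) / n_obs
        if abs(sim_mean - obs_mean) < eps:
            hits += 1
    return hits / n_sims

obs_mean, n_obs = 1.4, 30                    # summary of the "observed" data
grid = [i / 10.0 for i in range(-10, 41)]    # candidate theta values
surface = [(abc_likelihood(t, obs_mean, n_obs), t) for t in grid]
mle_approx = max(surface)[1]
print(mle_approx)
```

Replacing the indicator with a smoothing kernel, so that the approximate surface is differentiable and can be handed to a standard optimizer, is the step the paper analyses.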
The Cuntz splice does not preserve *-isomorphism of Leavitt path algebras over Z
We show that the Leavitt path algebra over Z of a graph and that of its Cuntz
splice are not isomorphic as *-algebras. There are two key ingredients in the
proof. One is a partial algebraic translation of Matsumoto and Matui's result
on diagonal preserving isomorphisms of Cuntz--Krieger algebras. The other is a
complete description of the projections in the Leavitt path algebra over Z of
a finite graph. This description is based on a generalization, due to Chris
Smith, of the description of the unitaries in such algebras given by Brownlowe
and the second named author. The techniques generalize to a slightly larger
class of rings than just Z.
Comment: 17 pages. Since version 2 we have extended the arguments from Z to
more general rings.
The iterated auxiliary particle filter
We present an offline, iterated particle filter to facilitate statistical
inference in general state space hidden Markov models. Given a model and a
sequence of observations, the associated marginal likelihood L is central to
likelihood-based inference for unknown statistical parameters. We define a
class of "twisted" models: each member is specified by a sequence of positive
functions psi and has an associated psi-auxiliary particle filter that provides
unbiased estimates of L. We identify a sequence psi* that is optimal in the
sense that the psi*-auxiliary particle filter's estimate of L has zero
variance. In practical applications, psi* is unknown so the psi*-auxiliary
particle filter cannot straightforwardly be implemented. We use an iterative
scheme to approximate psi*, and demonstrate empirically that the resulting
iterated auxiliary particle filter significantly outperforms the bootstrap
particle filter in challenging settings. Applications include parameter
estimation using a particle Markov chain Monte Carlo algorithm.
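The marginal likelihood L that the twisted psi-auxiliary filters estimate with reduced variance is exactly the quantity a plain bootstrap particle filter already estimates unbiasedly, as the running product of average unnormalized weights. A minimal sketch of that baseline estimator follows; the linear-Gaussian toy model and all parameter values are illustrative assumptions, and the iterated psi-approximation scheme of the paper is not reproduced here.

```python
import math
import random

random.seed(7)

def log_marginal_likelihood(ys, n_particles=1000, sigma_x=1.0, sigma_y=0.5):
    """Bootstrap particle filter estimate of log L for the toy model
       x_t = x_{t-1} + N(0, sigma_x^2),   y_t = x_t + N(0, sigma_y^2).
    The product over time of the average unnormalized weight is an
    unbiased estimate of the marginal likelihood L."""
    norm = 1.0 / (sigma_y * math.sqrt(2.0 * math.pi))
    xs = [random.gauss(0.0, sigma_x) for _ in range(n_particles)]
    log_L = 0.0
    for y in ys:
        xs = [x + random.gauss(0.0, sigma_x) for x in xs]
        ws = [norm * math.exp(-0.5 * ((y - x) / sigma_y) ** 2) for x in xs]
        log_L += math.log(sum(ws) / n_particles)
        # multinomial resampling before the next time step
        cum, s = [], 0.0
        for w in ws:
            s += w
            cum.append(s)
        new = []
        for _ in range(n_particles):
            u = random.random() * s
            i = 0
            while cum[i] < u:
                i += 1
            new.append(xs[i])
        xs = new
    return log_L

ll_hat = log_marginal_likelihood([0.2, 0.4, 0.1])
print(ll_hat)
```

It is the variance of this estimate, which can be very large in challenging settings, that the optimal twisting functions psi* drive to zero.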
Maximum likelihood parameter estimation for latent variable models using sequential Monte Carlo
We present a sequential Monte Carlo (SMC) method for maximum
likelihood (ML) parameter estimation in latent variable models. Standard
methods rely on gradient algorithms such as the Expectation-
Maximization (EM) algorithm and its Monte Carlo variants. Our
approach is different and motivated by considerations similar to those
underlying simulated annealing (SA); that is, we propose to sample from a
sequence of artificial distributions whose support concentrates on the set
of ML estimates. To achieve this we use SMC methods. We conclude
by presenting simulation results on a toy problem and a nonlinear,
non-Gaussian time series model.
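The annealing idea above, sampling from targets proportional to the likelihood raised to an increasing power gamma so that the population concentrates on the ML estimates, can be sketched as follows. The toy Gaussian model, the temperature ladder and the Metropolis move kernel are all illustrative assumptions, not the paper's choices.

```python
import math
import random

random.seed(11)

data = [1.1, 0.7, 1.5, 0.9, 1.3]   # toy observations, y_i ~ N(theta, 1)

def log_lik(theta):
    # Gaussian log-likelihood, up to an additive constant
    return sum(-0.5 * (y - theta) ** 2 for y in data)

def smc_ml(n_particles=300, gammas=(1.0, 2.0, 4.0, 8.0, 16.0, 32.0)):
    """SMC sampler over the tempered targets pi_gamma ∝ exp(gamma * log_lik):
    as gamma grows, the particle population concentrates on the ML estimate."""
    thetas = [random.uniform(-3.0, 5.0) for _ in range(n_particles)]
    prev = 0.0
    for g in gammas:
        # reweight by the incremental temperature (stabilized by the max)
        lls = [log_lik(t) for t in thetas]
        m = max(lls)
        ws = [math.exp((g - prev) * (ll - m)) for ll in lls]
        # multinomial resampling
        cum, s = [], 0.0
        for w in ws:
            s += w
            cum.append(s)
        resampled = []
        for _ in range(n_particles):
            u = random.random() * s
            i = 0
            while cum[i] < u:
                i += 1
            resampled.append(thetas[i])
        # a Metropolis move targeting pi_gamma restores particle diversity
        scale = 1.0 / math.sqrt(g * len(data))
        thetas = []
        for t in resampled:
            prop = t + random.gauss(0.0, scale)
            if math.log(random.random()) < g * (log_lik(prop) - log_lik(t)):
                t = prop
            thetas.append(t)
        prev = g
    return sum(thetas) / n_particles

theta_hat = smc_ml()
print(theta_hat)
```

For this model the MLE is the sample mean of the data (1.1), and the population average at the final temperature should sit close to it.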
On blocks, tempering and particle MCMC for systems identification
The widespread use of particle methods for addressing the filtering and smoothing problems in state-space models has, in recent years, been complemented by the development of particle Markov chain Monte Carlo (PMCMC) methods, which use particle filters within offline system-identification settings. We develop a modified particle filter, based around block sampling and tempering, intended to improve its exploration of the state space and the associated estimation of the marginal likelihood. The aim is to develop particle methods with improved robustness properties, particularly for parameter values that cannot explain the observed data well, for use within PMCMC algorithms. Unlike most techniques for improving particle-filter performance, the proposed strategies do not require a substantial analytic understanding of the model structure.
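The tempering ingredient can be conveyed with a small heuristic sketch: instead of applying a peaky likelihood in one shot (which would leave almost all particles with negligible weight), the update is split into stages that each apply the likelihood raised to 1/steps, resample, and jitter. This is an illustrative approximation only, not the authors' block-sampling algorithm, and the jitter step makes it inexact; the prior, observation and all parameters are assumptions.

```python
import math
import random

random.seed(5)

def tempered_update(xs, y, sigma_y=0.1, steps=4):
    """Heuristic tempered assimilation of one peaky observation: each stage
    reweights by the likelihood raised to the power 1/steps, resamples,
    and applies a small jitter, rather than using the full likelihood at once."""
    n = len(xs)
    for _ in range(steps):
        ws = [math.exp(-0.5 * ((y - x) / sigma_y) ** 2 / steps) for x in xs]
        cum, s = [], 0.0
        for w in ws:
            s += w
            cum.append(s)
        resampled = []
        for _ in range(n):
            u = random.random() * s
            i = 0
            while cum[i] < u:
                i += 1
            resampled.append(xs[i])
        # jitter restores the diversity lost in resampling (an approximation)
        xs = [x + random.gauss(0.0, 0.05) for x in resampled]
    return xs

prior = [random.gauss(0.0, 1.0) for _ in range(500)]
post = tempered_update(prior, y=2.0)
post_mean = sum(post) / len(post)
print(post_mean)
```

Because the observation noise (0.1) is much tighter than the prior spread, a single-shot update would rest on a handful of particles; the staged version moves the population toward the observation gradually, which is the robustness property sought for use inside PMCMC.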