Efficient delay-tolerant particle filtering
This paper proposes a novel framework for delay-tolerant particle filtering
that is computationally efficient and has limited memory requirements. Within
this framework, the informativeness of a delayed, out-of-sequence measurement
(OOSM) is estimated using a lightweight procedure, and uninformative
measurements are immediately discarded. The framework requires the
identification of a threshold separating informative from uninformative
measurements;
this threshold selection task is formulated as a constrained optimization
problem, where the goal is to minimize tracking error whilst controlling the
computational requirements. We develop an algorithm that provides an
approximate solution for the optimization problem. Simulation experiments
provide an example in which the proposed framework processes fewer than 40% of
all OOSMs with only a small reduction in tracking accuracy.
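As a concrete illustration of the discard rule, the sketch below scores a delayed measurement by its normalized innovation squared and processes it only when the score exceeds the threshold. The score, function names, and decision rule are illustrative assumptions, not the paper's actual lightweight procedure.

```python
import numpy as np

def informativeness(innovation, innovation_cov):
    """Hypothetical lightweight score: normalized innovation squared (NIS).
    A large NIS means the delayed measurement disagrees with the current
    estimate and is therefore likely to change it if processed."""
    return float(innovation @ np.linalg.solve(innovation_cov, innovation))

def should_process(oosm, predicted_meas, innovation_cov, threshold):
    """Discard the out-of-sequence measurement when the score falls below
    the chosen threshold; otherwise run the (expensive) OOSM update."""
    return informativeness(oosm - predicted_meas, innovation_cov) >= threshold
```

In this sketch the threshold plays exactly the role described above: raising it discards more OOSMs (saving computation) at the cost of tracking accuracy.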
Evaluating Data Assimilation Algorithms
Data assimilation leads naturally to a Bayesian formulation in which the
posterior probability distribution of the system state, given the observations,
plays a central conceptual role. The aim of this paper is to use this Bayesian
posterior probability distribution as a gold standard against which to evaluate
various commonly used data assimilation algorithms.
A key aspect of geophysical data assimilation is the high dimensionality and
low predictability of the computational model. With this in mind, yet with the
goal of allowing an explicit and accurate computation of the posterior
distribution, we study the 2D Navier-Stokes equations in a periodic geometry.
We compute the posterior probability distribution by state-of-the-art
statistical sampling techniques. The commonly used algorithms that we evaluate
against this accurate gold standard, as quantified by comparing the relative
error in reproducing its moments, are 4DVAR and a variety of sequential
filtering approximations based on 3DVAR and on extended and ensemble Kalman
filters.
The primary conclusions are that: (i) with appropriate parameter choices,
approximate filters can perform well in reproducing the mean of the desired
probability distribution; (ii) however, they typically perform poorly when
attempting to reproduce the covariance; (iii) this poor performance is
compounded by the need to modify the covariance, in order to induce stability.
Thus, whilst filters can be a useful tool in predicting mean behavior, they
should be viewed with caution as predictors of uncertainty. These conclusions
are intrinsic to the algorithms and will not change if the model complexity is
increased, for example by employing a smaller viscosity, or by using a detailed
NWP (numerical weather prediction) model.
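The covariance modification mentioned in conclusion (iii) is commonly realized as multiplicative inflation of the ensemble spread. The sketch below shows a generic stochastic ensemble Kalman analysis step with such an inflation factor; it is a textbook form given for illustration, under assumed names and parameters, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_analysis(ensemble, y, H, R, inflation=1.05):
    """One stochastic-EnKF analysis step on an (n_state, n_members) ensemble.
    `inflation` is the ad hoc covariance modification the abstract refers
    to: it rescales the spread about the mean to stabilize the filter,
    which further distorts the covariance estimate."""
    n, N = ensemble.shape
    mean = ensemble.mean(axis=1, keepdims=True)
    ensemble = mean + inflation * (ensemble - mean)   # multiplicative inflation
    X = ensemble - ensemble.mean(axis=1, keepdims=True)
    P = X @ X.T / (N - 1)                             # sample covariance
    S = H @ P @ H.T + R                               # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)                    # Kalman gain
    perturbed = y[:, None] + rng.multivariate_normal(
        np.zeros(len(y)), R, size=N).T                # perturbed observations
    return ensemble + K @ (perturbed - H @ ensemble)
```

With informative observations the analysis mean moves toward the data, matching conclusion (i), while the posterior spread depends on the inflation choice rather than on the Bayesian posterior alone, matching (ii) and (iii).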
Sequential Monte Carlo smoothing with application to parameter estimation in non-linear state space models
This paper concerns the use of sequential Monte Carlo methods (SMC) for
smoothing in general state space models. A well-known problem when applying the
standard SMC technique in the smoothing mode is that the resampling mechanism
introduces degeneracy of the approximation in the path space. However, when
performing maximum likelihood estimation via the EM algorithm, all functionals
involved are of additive form for a large subclass of models. To cope with the
problem in this case, a modification of the standard method (based on a
technique proposed by Kitagawa and Sato) is suggested. Our algorithm relies on
forgetting properties of the filtering dynamics, and the quality of the
estimates produced is investigated both theoretically and via simulations.
Comment: Published at http://dx.doi.org/10.3150/07-BEJ6150 in the Bernoulli
(http://isi.cbs.nl/bernoulli/) by the International Statistical
Institute/Bernoulli Society (http://isi.cbs.nl/BS/bshome.htm).
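A minimal sketch of the fixed-lag idea behind the Kitagawa-Sato-style modification: because the filter forgets its past, ancestries need only be traced back a fixed lag, which avoids the path-space degeneracy of the standard SMC smoother when accumulating an additive functional. The model (a linear-Gaussian AR(1)), the functional (the sum of states), and all constants are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from collections import deque

rng = np.random.default_rng(1)

def smooth_additive_sum(ys, N=500, lag=10, phi=0.9, sigma=1.0, tau=0.5):
    """Bootstrap particle filter for x_t = phi*x_{t-1} + sigma*eps_t,
    y_t = x_t + tau*eta_t, estimating the additive functional sum_t x_t.
    Only the last `lag` particle slices are kept ancestry-consistent;
    older time points are frozen, relying on the filter's forgetting."""
    x = rng.normal(0.0, sigma, N)
    buf = deque()   # last `lag` particle slices, reindexed at each resampling
    frozen = 0.0    # accumulated contribution of time points beyond the lag
    for y in ys:
        x = phi * x + sigma * rng.normal(size=N)      # bootstrap proposal
        logw = -0.5 * ((y - x) / tau) ** 2            # Gaussian likelihood
        w = np.exp(logw - logw.max())
        w /= w.sum()
        idx = rng.choice(N, size=N, p=w)              # multinomial resampling
        x = x[idx]
        buf = deque(s[idx] for s in buf)              # keep ancestry in window
        buf.append(x.copy())
        if len(buf) > lag:                            # left the lag window:
            frozen += buf.popleft().mean()            # freeze its estimate
    return frozen + sum(s.mean() for s in buf)
```

Choosing the lag trades bias (too short a lag ignores future observations) against the degeneracy that full-path smoothing would reintroduce, which is where the forgetting properties mentioned above enter the analysis.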