
    The iterated auxiliary particle filter

    We present an offline, iterated particle filter to facilitate statistical inference in general state space hidden Markov models. Given a model and a sequence of observations, the associated marginal likelihood L is central to likelihood-based inference for unknown statistical parameters. We define a class of "twisted" models: each member is specified by a sequence of positive functions psi and has an associated psi-auxiliary particle filter that provides unbiased estimates of L. We identify a sequence psi* that is optimal in the sense that the psi*-auxiliary particle filter's estimate of L has zero variance. In practical applications, psi* is unknown, so the psi*-auxiliary particle filter cannot straightforwardly be implemented. We use an iterative scheme to approximate psi*, and demonstrate empirically that the resulting iterated auxiliary particle filter significantly outperforms the bootstrap particle filter in challenging settings. Applications include parameter estimation using a particle Markov chain Monte Carlo algorithm.
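    The bootstrap particle filter that serves as the baseline above can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's iterated method: it assumes a hypothetical linear-Gaussian state space model (parameters `a`, `q`, `r` are made up for the example) and returns the filter's estimate of log L. The particle filter's estimate of L itself is unbiased; its logarithm, accumulated here for numerical stability, is not.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical linear-Gaussian model (illustrative values):
    #   x_t = a * x_{t-1} + N(0, q),   y_t = x_t + N(0, r)
    a, q, r = 0.9, 1.0, 1.0

    def simulate(T):
        """Draw a synthetic observation sequence from the model."""
        x = rng.normal(0.0, 1.0)
        ys = []
        for _ in range(T):
            x = a * x + rng.normal(0.0, np.sqrt(q))
            ys.append(x + rng.normal(0.0, np.sqrt(r)))
        return np.array(ys)

    def bootstrap_log_likelihood(ys, n_particles=500):
        """Bootstrap particle filter estimate of log L."""
        x = rng.normal(0.0, 1.0, size=n_particles)  # particles from the prior
        log_l = 0.0
        for y in ys:
            # Propagate particles through the state transition
            x = a * x + rng.normal(0.0, np.sqrt(q), n_particles)
            # Log observation weights under y_t | x_t ~ N(x_t, r)
            logw = -0.5 * np.log(2 * np.pi * r) - 0.5 * (y - x) ** 2 / r
            m = logw.max()
            w = np.exp(logw - m)
            log_l += m + np.log(w.mean())  # accumulate log-likelihood increment
            # Multinomial resampling proportional to the weights
            idx = rng.choice(n_particles, n_particles, p=w / w.sum())
            x = x[idx]
        return log_l

    ys = simulate(50)
    print(bootstrap_log_likelihood(ys))
    ```

    The twisted psi-auxiliary filters in the paper modify both the proposal and the weights using the functions psi; with the optimal psi*, every weight is constant and the estimate of L is exact.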

    Bootstrap prediction mean squared errors of unobserved states based on the Kalman filter with estimated parameters

    Prediction intervals in State Space models can be obtained by assuming Gaussian innovations and using the prediction equations of the Kalman filter, where the true parameters are substituted by consistent estimates. This approach has two limitations. First, it does not incorporate the uncertainty due to parameter estimation. Second, the Gaussianity assumption of future innovations may be inaccurate. To overcome these drawbacks, Wall and Stoffer (2002) propose to obtain prediction intervals by using a bootstrap procedure that requires the backward representation of the model. Obtaining this representation increases the complexity of the procedure and limits its implementation to models for which it exists. The bootstrap procedure proposed by Wall and Stoffer (2002) is further complicated by the fact that the intervals are obtained for the prediction errors instead of for the observations. In this paper, we propose a bootstrap procedure for constructing prediction intervals in State Space models that does not need the backward representation of the model and is based on obtaining the intervals directly for the observations. Therefore, its application is much simpler, without losing the good behavior of bootstrap prediction intervals. We study its finite sample properties and compare them with those of the standard and the Wall and Stoffer (2002) procedures for the Local Level Model. Finally, we illustrate the results by implementing the new procedure to obtain prediction intervals for future values of a real time series.
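    The core idea of bootstrapping intervals directly for the observations can be sketched for the Local Level Model. The snippet below is a simplified illustration, not the authors' exact procedure: it assumes the disturbance variances `q` and `h` are known (the paper's point is precisely that estimated parameters add uncertainty), and it resamples standardized one-step prediction errors in place of the Gaussian assumption.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Local level model with (for illustration only) known variances:
    #   mu_t = mu_{t-1} + eta_t, eta ~ N(0, q);   y_t = mu_t + eps_t, eps ~ N(0, h)
    q, h = 0.5, 1.0

    def kalman_filter(ys):
        """One-step-ahead Kalman recursions for the local level model.

        Returns the final prediction mean/variance plus the sequence of
        one-step prediction errors and their variances."""
        mu, p = 0.0, 1e4  # diffuse-ish prior on the level
        errs, fs = [], []
        for y in ys:
            f = p + h            # prediction-error variance
            v = y - mu           # one-step prediction error
            k = p / f            # Kalman gain
            mu = mu + k * v
            p = p * (1 - k) + q  # predict the next level's variance
            errs.append(v)
            fs.append(f)
        return mu, p, np.array(errs), np.array(fs)

    def bootstrap_interval(ys, horizon=1, B=1000, level=0.9):
        """Bootstrap a prediction interval directly for the future observation."""
        mu, p, errs, fs = kalman_filter(ys)
        std_errs = errs / np.sqrt(fs)  # standardized innovations to resample
        sims = []
        for _ in range(B):
            m, pp = mu, p
            for _ in range(horizon):
                f = pp + h
                # Resampled innovation replaces the Gaussianity assumption
                y_sim = m + np.sqrt(f) * rng.choice(std_errs)
                k = pp / f
                m = m + k * (y_sim - m)
                pp = pp * (1 - k) + q
            sims.append(y_sim)
        lo, hi = np.percentile(sims, [100 * (1 - level) / 2, 100 * (1 + level) / 2])
        return lo, hi

    ys = rng.normal(0, 1, 100).cumsum() + rng.normal(0, 1, 100)  # random walk plus noise
    print(bootstrap_interval(ys))
    ```

    Because each replicate simulates future observations forward through the same filter, no backward representation of the model is needed.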

    Adaptive Quantizers for Estimation

    In this paper, adaptive estimation based on noisy quantized observations is studied. A low complexity adaptive algorithm using a quantizer with adjustable input gain and offset is presented. Three possible scalar models for the parameter to be estimated are considered: constant, Wiener process, and Wiener process with deterministic drift. After showing that the algorithm is asymptotically unbiased for estimating a constant, it is shown, in the three cases, that the asymptotic mean squared error depends on the Fisher information for the quantized measurements. It is also shown that the loss of performance due to quantization depends approximately on the ratio of the Fisher information for quantized and continuous measurements. At the end of the paper the theoretical results are validated through simulation under two different classes of noise: generalized Gaussian noise and Student's t noise.
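    The flavor of such adaptive quantized estimation can be conveyed with a 1-bit special case. The sketch below is an assumption-laden toy, not the paper's algorithm: the quantizer offset is set to the current estimate, each observation is reduced to the single bit sign(y − offset), and a fixed step size `gamma` (a made-up value) nudges the estimate toward the parameter.

    ```python
    import numpy as np

    def adaptive_sign_estimator(true_param=3.0, noise_std=1.0, n=20000,
                                gamma=0.05, seed=2):
        """1-bit adaptive estimation of a constant from quantized noisy samples.

        The quantizer threshold (offset) tracks the running estimate, so only
        the sign of (y - estimate) is ever observed."""
        rng = np.random.default_rng(seed)
        est = 0.0
        for _ in range(n):
            y = true_param + rng.normal(0.0, noise_std)  # noisy analog sample
            bit = 1.0 if y > est else -1.0               # 1-bit quantized measurement
            est += gamma * bit                           # move threshold toward the parameter
        return est

    print(adaptive_sign_estimator())
    ```

    With a constant step size the estimate fluctuates around the noise median rather than converging; a decreasing step size (or the paper's adjustable input gain) trades tracking speed for asymptotic accuracy, which is where the Fisher information comparison between quantized and continuous measurements comes in.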