Approximate Bayesian Computation in State Space Models
A new approach to inference in state space models is proposed, based on
approximate Bayesian computation (ABC). ABC avoids evaluation of the likelihood
function by matching observed summary statistics with statistics computed from
data simulated from the true process; exact inference being feasible only if
the statistics are sufficient. With finite-sample sufficiency unattainable in
the state space setting, we seek asymptotic sufficiency via the maximum
likelihood estimator (MLE) of the parameters of an auxiliary model. We prove
that this auxiliary model-based approach achieves Bayesian consistency, and
that - in a precise limiting sense - the proximity to (asymptotic) sufficiency
yielded by the MLE is replicated by the score. In multiple-parameter settings,
a separate treatment of scalar parameters, based on integrated likelihood
techniques, is advocated as a way of avoiding the curse of dimensionality. Some
attention is given to a structure in which the state variable is driven by a
continuous-time process, with exact inference typically infeasible in this case
as a result of intractable transitions. The ABC method is demonstrated using
the unscented Kalman filter as a fast and simple way of producing an
approximation in this setting, with a stochastic volatility model for financial
returns used for illustration.
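The matching step that the abstract describes can be sketched with plain ABC rejection sampling. This is a minimal toy illustration, not the authors' auxiliary-model method: the model (a Gaussian location model), the flat prior, and the use of the sample mean as a stand-in summary statistic are all assumptions made here for brevity; in the paper the summary statistic would be the MLE (or score) of an auxiliary model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data-generating process: y_t = theta + Gaussian noise.
def simulate(theta, n=200):
    return theta + rng.standard_normal(n)

# Stand-in summary statistic; the paper advocates the auxiliary-model MLE/score.
def summary(y):
    return y.mean()

y_obs = simulate(1.5)        # pretend this is the observed series
s_obs = summary(y_obs)

# ABC rejection: draw theta from the prior, simulate data, and keep draws
# whose simulated summary lies within tolerance eps of the observed summary.
def abc_rejection(n_draws=20000, eps=0.05):
    accepted = []
    for _ in range(n_draws):
        theta = rng.uniform(-5.0, 5.0)   # flat prior, an assumption here
        if abs(summary(simulate(theta)) - s_obs) < eps:
            accepted.append(theta)
    return np.array(accepted)

post = abc_rejection()
print(post.mean(), len(post))
```

The accepted draws approximate the posterior of theta given the summary; the approximation is exact only when the statistic is sufficient, which motivates the paper's pursuit of asymptotic sufficiency.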
Bootstrap confidence sets under model misspecification
A multiplier bootstrap procedure for construction of likelihood-based
confidence sets is considered for finite samples and possible model
misspecification. Theoretical results justify the bootstrap validity for a
small or moderate sample size and allow one to control the impact of the
parameter dimension: the bootstrap approximation works provided the parameter
dimension is small relative to the sample size. The main
result about bootstrap validity continues to apply even if the underlying
parametric model is misspecified under the so-called small modelling bias
condition. When the true model deviates significantly from the considered
parametric family, the bootstrap procedure remains applicable but becomes
somewhat conservative: the size of the constructed confidence sets is
increased by the modelling bias. We illustrate the results with numerical
examples for misspecified linear and logistic regressions.
Comment: Published at http://dx.doi.org/10.1214/15-AOS1355 in the Annals of
Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical
Statistics (http://www.imstat.org).
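The multiplier-bootstrap idea above can be sketched in a toy Gaussian-mean problem. This is a simplified illustration under assumptions made here, not the paper's construction: the model (i.i.d. Gaussian, scalar mean), the Exp(1) multiplier weights, and the closed-form weighted MLE are all choices for brevity. The likelihood-ratio excess for the Gaussian mean reduces to n(theta_hat - theta)^2 / 2, and the bootstrap calibrates its quantile.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: i.i.d. Gaussian with unknown mean (true mean 2.0 here).
y = rng.normal(2.0, 1.0, size=100)
n = y.size
theta_hat = y.mean()

# Weighted MLE under i.i.d. multiplier weights: for the Gaussian mean this
# is simply the weighted average of the observations.
def weighted_mle(w):
    return np.sum(w * y) / np.sum(w)

# Multiplier bootstrap: reweight the log-likelihood terms with i.i.d.
# mean-one weights (Exp(1)), refit, and record the likelihood-ratio excess
# n * (theta* - theta_hat)^2 / 2 at the bootstrap estimate.
excess = np.array([
    n * (weighted_mle(rng.exponential(1.0, size=n)) - theta_hat) ** 2 / 2
    for _ in range(2000)
])

# Calibrate the 95% likelihood-based confidence set with the bootstrap quantile.
z = np.quantile(excess, 0.95)
half = np.sqrt(2.0 * z / n)       # invert n*(d)^2/2 <= z for the Gaussian mean
lo, hi = theta_hat - half, theta_hat + half
print(lo, hi)
```

Under misspecification the same recipe applies, but as the abstract notes, the calibrated sets widen with the modelling bias rather than breaking down.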
An investigation of data compression techniques for hyperspectral core imager data
We investigate algorithms for tractable analysis of real hyperspectral image data from core samples provided by AngloGold Ashanti. In particular, we investigate feature extraction, non-linear dimension reduction using diffusion maps, and wavelet approximation methods on our data.
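The diffusion-map step mentioned above can be sketched as follows. This is a generic illustration of the technique, not the authors' pipeline: the synthetic data (a noisy circle standing in for high-dimensional pixel spectra), the kernel bandwidth, and the number of retained coordinates are all assumptions made here.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-in data: 300 points on a noisy circle. For real
# hyperspectral data each row would be one pixel's spectrum.
t = rng.uniform(0.0, 2.0 * np.pi, 300)
X = np.column_stack([np.cos(t), np.sin(t)]) + 0.05 * rng.standard_normal((300, 2))

# Gaussian affinity kernel on pairwise squared distances.
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
K = np.exp(-d2 / 0.1)             # bandwidth 0.1, chosen ad hoc here

# Row-normalize to a Markov transition matrix and eigendecompose it.
P = K / K.sum(axis=1, keepdims=True)
eigvals, eigvecs = np.linalg.eig(P)
order = np.argsort(-eigvals.real)

# Diffusion coordinates: the leading nontrivial eigenvectors, scaled by
# their eigenvalues (the trivial top eigenvector is constant, eigenvalue 1).
psi = eigvecs.real[:, order[1:3]] * eigvals.real[order[1:3]]
print(psi.shape)
```

The embedded coordinates `psi` unfold the low-dimensional geometry of the data; on real core-imager spectra, the same construction would be applied to the pixel-by-band matrix after feature extraction.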