
    A statistical approach to the inverse problem in magnetoencephalography

    Magnetoencephalography (MEG) is an imaging technique used to measure the magnetic field outside the human head produced by the electrical activity inside the brain. The MEG inverse problem, identifying the location of the electrical sources from the magnetic signal measurements, is ill-posed: there are infinitely many mathematically correct solutions. Common source localization methods assume the source does not vary with time and do not provide estimates of the variability of the fitted model. Here, we reformulate the MEG inverse problem by considering time-varying locations for the sources and their electrical moments, and we model their time evolution using a state space model. Based on this predictive model, we investigate the inverse problem by finding the posterior source distribution given the multiple channels of observations at each time, rather than fitting fixed source parameters. The new model is more realistic than common models and allows us to estimate the variation of the source strength, orientation and position. We propose two new Monte Carlo methods based on sequential importance sampling. Unlike the usual MCMC sampling scheme, the new methods work in this setting without the very costly tuning of a high-dimensional transition kernel. The dimensionality of the unknown parameters is extremely large and the size of the data is even larger. We use the Parallel Virtual Machine (PVM) to speed up the computation. Published at http://dx.doi.org/10.1214/14-AOAS716 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org).
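
    The following is a minimal sketch of sequential importance sampling for a generic state space model, assuming a toy Gaussian random-walk state and a single observation channel rather than the paper's MEG source model; the model, function name and parameter values are illustrative only.

        import numpy as np

        def sis_filter(y, n_particles=1000, state_dim=2, q=0.1, r=0.5, seed=0):
            """Sequential importance sampling for a toy state space model.

            State:       x_t = x_{t-1} + N(0, q^2 I)  (random-walk stand-in for the
                                                        time-varying source parameters)
            Observation: y_t = x_t[0] + N(0, r^2)     (one channel; MEG has many)
            """
            rng = np.random.default_rng(seed)
            particles = rng.normal(0.0, 1.0, size=(n_particles, state_dim))
            log_w = np.zeros(n_particles)
            means = np.empty((len(y), state_dim))
            for t, obs in enumerate(y):
                # propagate particles through the state transition (the proposal)
                particles += rng.normal(0.0, q, size=particles.shape)
                # update importance weights with the observation likelihood
                log_w += -0.5 * ((obs - particles[:, 0]) / r) ** 2
                w = np.exp(log_w - log_w.max())
                w /= w.sum()
                # weighted posterior mean of the state at time t
                means[t] = w @ particles
            return means

        # usage: filter a short simulated series
        y = np.cumsum(np.random.normal(size=50) * 0.1) + np.random.normal(scale=0.5, size=50)
        print(sis_filter(y)[:3])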

    Embedding Population Dynamics Models in Inference

    Increasing pressures on the environment are generating an ever-increasing need to manage animal and plant populations sustainably, and to protect and rebuild endangered populations. Effective management requires reliable mathematical models, so that the effects of management actions can be predicted and the uncertainty in these predictions quantified. These models must be able to predict the response of populations to anthropogenic change while handling the major sources of uncertainty. We describe a simple "building block" approach to formulating discrete-time models. We show how to estimate the parameters of such models from time series of data, and how to quantify uncertainty in those estimates and in the numbers of individuals of different types in the population, using computer-intensive Bayesian methods. We also discuss advantages and pitfalls of the approach, and give an example using the British grey seal population. Published at http://dx.doi.org/10.1214/088342306000000673 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org).
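
    As a minimal sketch of the "building block" idea, the toy simulator below composes survival, maturation and birth steps into one year of a two-stage (juvenile/adult) discrete-time model; the stage structure, survival rates and fecundity are illustrative values, not the paper's grey seal model, and in practice such a simulator would be embedded in a Bayesian fitting scheme rather than run forward on its own.

        import numpy as np

        rng = np.random.default_rng(0)

        def survival_block(counts, rates):
            """Stage-specific annual survival as binomial thinning."""
            return rng.binomial(counts, rates)

        def birth_block(n_adults, fecundity):
            """New juveniles produced by the adults (Poisson births)."""
            return rng.poisson(fecundity * n_adults)

        def annual_step(juveniles, adults, survival=(0.6, 0.9), fecundity=0.4):
            """Compose the blocks (survive, mature, breed) for one year of dynamics."""
            juv_s, adult_s = survival_block(np.array([juveniles, adults]), survival)
            adults_next = adult_s + juv_s          # surviving juveniles mature
            juveniles_next = birth_block(adults_next, fecundity)
            return juveniles_next, adults_next

        # simulate 20 years from 100 juveniles and 400 adults
        state = (100, 400)
        for _ in range(20):
            state = annual_step(*state)
        print(state)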

    An Economist's guide to the Kalman filter

    Almost since its appearance, the Kalman Filter (KF) has been successfully used in control engineering. Unfortunately, most of its important results have been published in engineering journals, in the language, notation and style of engineers. In this paper, we present the KF in a way that is accessible to economists, using information theory and Bayesian inference.
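
    For readers who want the recursions as code, the sketch below implements the predict and update steps of the Kalman filter for a univariate local-level model; the model and parameter values are illustrative and are not taken from the paper.

        import numpy as np

        def kalman_filter(y, q=0.1, r=1.0, m0=0.0, p0=10.0):
            """Kalman filter for the local-level model
                 state:       x_t = x_{t-1} + N(0, q)
                 observation: y_t = x_t     + N(0, r)
            returning the filtered means and variances."""
            m, p = m0, p0
            means, variances = [], []
            for obs in y:
                # predict: propagate the state estimate one step ahead
                m_pred, p_pred = m, p + q
                # update: combine the prediction with the new observation
                k = p_pred / (p_pred + r)          # Kalman gain
                m = m_pred + k * (obs - m_pred)
                p = (1.0 - k) * p_pred
                means.append(m)
                variances.append(p)
            return np.array(means), np.array(variances)

        # usage: filter a noisy random walk
        rng = np.random.default_rng(42)
        x = np.cumsum(rng.normal(scale=np.sqrt(0.1), size=100))
        y = x + rng.normal(scale=1.0, size=100)
        m, p = kalman_filter(y)
        print(m[-1], p[-1])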

    Introducing shrinkage in heavy-tailed state space models to predict equity excess returns

    We forecast S&P 500 excess returns using a flexible Bayesian econometric state space model with non-Gaussian features at several levels. More precisely, we control for overparameterization via novel global-local shrinkage priors on the state innovation variances as well as on the time-invariant part of the state space model. The shrinkage priors are complemented by heavy-tailed state innovations that cater for potentially large breaks in the latent states. Moreover, we allow for leptokurtic stochastic volatility in the observation equation. The empirical findings indicate that several variants of the proposed approach outperform typical competitors frequently used in the literature, both in terms of point and density forecasts.
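
    To fix ideas, a generic model of this kind can be written in state space form with Student-t state innovations and a global-local prior on the innovation scales; the display below is a schematic sketch, not the paper's exact specification.

        \begin{align}
          y_t &= x_t'\beta_t + \varepsilon_t, & \varepsilon_t &\sim \mathcal{N}\!\left(0, e^{h_t}\right),\\
          \beta_{jt} &= \beta_{j,t-1} + \eta_{jt}, & \eta_{jt} &\sim t_{\nu}\!\left(0, \vartheta_j\right),\\
          \sqrt{\vartheta_j} &\sim \mathcal{N}^{+}\!\left(0, \lambda^2 \psi_j^2\right).
        \end{align}

    Here h_t follows a stochastic volatility process in the observation equation, the Student-t innovations accommodate occasional large breaks in the latent states beta_jt, and the global scale lambda together with the local scales psi_j shrinks the innovation scales of unimportant states toward zero.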

    Estimating Nonlinear Dynamic Equilibrium Economies: A Likelihood Approach

    This paper presents a framework to undertake likelihood-based inference in nonlinear dynamic equilibrium economies. We develop a Sequential Monte Carlo algorithm that delivers an estimate of the likelihood function of the model using simulation methods. This likelihood can be used for parameter estimation and for model comparison. The algorithm can deal both with nonlinearities of the economy and with the presence of non-normal shocks. We show consistency of the estimate and its good performance in finite simulations. The new algorithm is important because the existing empirical literature following a likelihood approach was limited to the estimation of linear models with Gaussian innovations. We apply our procedure to estimate the structural parameters of the neoclassical growth model. Keywords: likelihood-based inference, dynamic equilibrium economies, nonlinear filtering, Sequential Monte Carlo.
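
    A minimal sketch of how a Sequential Monte Carlo (bootstrap particle) filter produces a likelihood estimate is given below; the toy nonlinear model stands in for the equilibrium economy's transition and measurement equations, which are not reproduced here, and the function name and parameter values are illustrative.

        import numpy as np

        def particle_loglik(y, theta, n_particles=2000, seed=0):
            """Bootstrap particle filter estimate of log p(y | theta) for a toy
            nonlinear state space model:
                x_t = theta[0] * x_{t-1} + N(0, theta[1]^2)
                y_t = x_t**2 / 20 + N(0, theta[2]^2)
            """
            rho, sigma, r = theta
            rng = np.random.default_rng(seed)
            x = rng.normal(0.0, 1.0, n_particles)
            loglik = 0.0
            for obs in y:
                # propagate particles through the nonlinear transition density
                x = rho * x + rng.normal(0.0, sigma, n_particles)
                # weight particles by the measurement density
                logw = -0.5 * np.log(2 * np.pi * r**2) - 0.5 * ((obs - x**2 / 20) / r) ** 2
                m = logw.max()
                w = np.exp(logw - m)
                loglik += m + np.log(w.mean())     # accumulate the likelihood estimate
                # multinomial resampling to avoid weight degeneracy
                x = rng.choice(x, size=n_particles, p=w / w.sum())
            return loglik

        # usage: the estimate can be handed to an optimizer or an MCMC sampler over theta
        obs = np.random.default_rng(1).normal(size=100)
        print(particle_loglik(obs, theta=(0.9, 0.5, 1.0)))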

    Inference for Adaptive Time Series Models: Stochastic Volatility and Conditionally Gaussian State Space Form

    In this paper we replace the Gaussian errors in the standard Gaussian, linear state space model with stochastic volatility processes. This is called a GSSF-SV model. We show that conventional MCMC algorithms for this type of model are ineffective, but that this problem can be removed by reparameterising the model. We illustrate our results on an example from financial economics and one from the nonparametric regression literature. We also develop an effective particle filter for this model, which is useful to assess the fit of the model. Keywords: Markov chain Monte Carlo, particle filter, cubic spline, state space form, stochastic volatility.
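
    As a minimal sketch of what such a conditionally Gaussian state space form with SV errors looks like as a data-generating process, the code below simulates a local-level model in which both disturbance variances follow log-AR(1) stochastic volatility processes; the parameter values are illustrative, and the reparameterisation discussed in the paper is not shown.

        import numpy as np

        def simulate_gssf_sv(T=500, phi=0.97, tau=0.15, seed=0):
            """Simulate a local-level model whose observation and state disturbance
            variances each follow a log-AR(1) stochastic volatility process."""
            rng = np.random.default_rng(seed)
            h_obs, h_state = -1.0, -2.0          # log-variances of the two disturbances
            alpha = 0.0                           # latent level
            y = np.empty(T)
            for t in range(T):
                # SV processes driving the two log-variances
                h_obs = phi * h_obs + tau * rng.normal()
                h_state = phi * h_state + tau * rng.normal()
                # conditionally Gaussian, linear state space form given the volatilities
                alpha += np.exp(h_state / 2) * rng.normal()
                y[t] = alpha + np.exp(h_obs / 2) * rng.normal()
            return y

        print(simulate_gssf_sv()[:5])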

    Testing the hockey-stick hypothesis by statistical analyses of a large dataset of proxy records.

    This paper is a statistical time-series investigation aimed at testing the anthropogenic climate change hypothesis known as the “hockey stick”. The time-series components of a selected batch of 258 long-term yearly Climate Change Proxies (CCP), included in 19 paleoclimate datasets, all of which run back as far as the year 2192 B.C., are reconstructed by means of univariate Bayesian calibration. The instrumental temperature record utilized is the global Best Estimated Anomaly (BEA) of the HADCRUT4 time series, available yearly for the period 1850-2010. After appropriate data transformations, Ordinary Least Squares parameter estimates are obtained and subsequently simulated by means of multi-draw Gibbs sampling for each year of the pre-1850 period. The ensuing Time-Varying Parameter sequence is used to produce high-resolution calibrated estimates of the CCP series, which are merged with BEA to yield Millennial-scale Time Series (MTS). Finally, the MTS are individually tested for a single temperature break date and for multiple peak dates. The estimated temperature breaks and peaks suggest widespread rejection of the hockey-stick hypothesis, since they are mostly centered in the Medieval Warm Period.
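
    A minimal sketch of the univariate Bayesian calibration step is given below: Gibbs sampling for a normal linear regression of the instrumental temperature on a single proxy over the overlap period, whose posterior draws could then be used to calibrate pre-instrumental proxy values. The single-proxy setup, priors and simulated data are illustrative simplifications, not the paper's exact procedure.

        import numpy as np

        def gibbs_calibration(temp, proxy, n_draws=2000, seed=0):
            """Gibbs sampler for temp_t = a + b * proxy_t + N(0, s2), with a flat
            prior on (a, b) and a Jeffreys prior on s2. Returns posterior draws."""
            rng = np.random.default_rng(seed)
            X = np.column_stack([np.ones_like(proxy), proxy])
            XtX_inv = np.linalg.inv(X.T @ X)
            beta_ols = XtX_inv @ X.T @ temp
            n = len(temp)
            s2 = np.var(temp - X @ beta_ols)
            draws = np.empty((n_draws, 3))
            for i in range(n_draws):
                # draw regression coefficients given the error variance
                beta = rng.multivariate_normal(beta_ols, s2 * XtX_inv)
                # draw the error variance given the coefficients (inverse-gamma)
                resid = temp - X @ beta
                s2 = 1.0 / rng.gamma(0.5 * n, 2.0 / (resid @ resid))
                draws[i] = (*beta, s2)
            return draws

        # usage with simulated stand-in data for one proxy over 1850-2010
        rng = np.random.default_rng(1)
        proxy = rng.normal(size=161)
        temp = 0.5 * proxy + rng.normal(scale=0.3, size=161)
        print(gibbs_calibration(temp, proxy).mean(axis=0))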

    Estimating Macroeconomic Models: A Likelihood Approach

    This paper shows how particle filtering allows us to undertake likelihood-based inference in dynamic macroeconomic models. The models can be nonlinear and/or non-normal. We describe how to use the output from the particle filter to estimate the structural parameters of the model, those characterizing preferences and technology, and to compare different economies. Both tasks can be implemented from either a classical or a Bayesian perspective. We illustrate the technique by estimating a business cycle model with investment-specific technological change, preference shocks, and stochastic volatility.
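
    Building on the particle filter likelihood sketched under the Sequential Monte Carlo entry above, a Bayesian estimate of the structural parameters can embed that estimate in a random-walk Metropolis step; the sketch below reuses the illustrative particle_loglik function and a flat prior, and is not the paper's implementation.

        import numpy as np

        def rw_metropolis(y, theta0, n_iter=5000, step=0.02, seed=0):
            """Random-walk Metropolis over structural parameters theta, targeting the
            particle filter log-likelihood (particle_loglik, sketched earlier)."""
            rng = np.random.default_rng(seed)
            theta = np.asarray(theta0, dtype=float)
            ll = particle_loglik(y, theta)
            chain = np.empty((n_iter, len(theta)))
            for i in range(n_iter):
                proposal = theta + step * rng.normal(size=len(theta))
                if np.all(proposal[1:] > 0):       # keep scale parameters positive
                    ll_prop = particle_loglik(y, proposal, seed=int(rng.integers(1 << 31)))
                    # accept/reject with a flat prior, so the ratio is purely the likelihood
                    if np.log(rng.uniform()) < ll_prop - ll:
                        theta, ll = proposal, ll_prop
                chain[i] = theta
            return chain

        # usage: posterior draws of the structural parameters given observed data y
        # chain = rw_metropolis(y, theta0=(0.9, 0.5, 1.0))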