The Bayesian Analysis of Complex, High-Dimensional Models: Can It Be CODA?
We consider the Bayesian analysis of a few complex, high-dimensional models
and show that intuitive priors, which are not tailored to the fine details of
the model and the estimated parameters, produce estimators which perform poorly
in situations in which good, simple frequentist estimators exist. The models we
consider are: stratified sampling, the partial linear model, linear and
quadratic functionals of white noise and estimation with stopping times. We
present a strong version of Doob's consistency theorem which demonstrates that
the existence of a uniformly $\sqrt{n}$-consistent estimator ensures that the
Bayes posterior is $\sqrt{n}$-consistent for values of the parameter in subsets
of prior probability 1. We also demonstrate that it is, at least in principle,
possible to construct Bayes priors giving both global and local minimax rates,
using a suitable combination of loss functions. We argue that there is no
contradiction in these apparently conflicting findings.
Comment: Published at http://dx.doi.org/10.1214/14-STS483 in Statistical
Science (http://www.imstat.org/sts/) by the Institute of Mathematical
Statistics (http://www.imstat.org)
The Hodrick-Prescott (HP) Filter as a Bayesian Regression Model
The Hodrick-Prescott (HP) method is a popular smoothing method for economic time series, used to extract a smooth, long-term component from stationary series such as growth rates. We show that the HP smoother can be viewed as a Bayesian linear model with a strong prior using differencing matrices for the smoothness component. The HP smoothing approach requires a linear regression model with a Bayesian conjugate multi-normal-gamma distribution. The Bayesian approach also allows predictions of the HP smoother at both ends of the time series. Furthermore, we show how Bayes tests can determine the order of smoothness in the HP smoothing model. The extended HP smoothing approach is demonstrated on the non-stationary (textbook) airline passenger time series. Thus, the Bayesian extension of the HP model defines a new class of model-based smoothers for (non-stationary) time series and spatial models.
Keywords: Hodrick-Prescott (HP) smoothers, model selection by marginal likelihoods, multi-normal-gamma distribution, spatial sales growth data, Bayesian econometrics
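The equivalence described in this abstract can be sketched numerically: the HP smoother minimising ||y - tau||^2 + lam*||D2 tau||^2, with D2 the second-difference matrix, coincides with the posterior mean of a Bayesian linear model whose smoothness prior is built from D2. The following is my own minimal illustration of that identity, not code from the paper (the function name and the choice lam = 1600 are illustrative assumptions):

```python
import numpy as np

def hp_smooth(y, lam=1600.0):
    """HP smoother as a Bayesian posterior mean: tau = (I + lam*D2'D2)^{-1} y.

    D2 is the (n-2) x n second-difference matrix, so lam*D2'D2 acts as the
    prior precision of a smoothness prior on the trend component tau.
    """
    n = len(y)
    D2 = np.zeros((n - 2, n))
    for t in range(n - 2):
        D2[t, t], D2[t, t + 1], D2[t, t + 2] = 1.0, -2.0, 1.0
    return np.linalg.solve(np.eye(n) + lam * D2.T @ D2, y)

# Usage: a noisy linear trend; large lam pulls the smoother toward a line.
rng = np.random.default_rng(0)
y = np.linspace(0.0, 1.0, 100) + 0.1 * rng.standard_normal(100)
trend = hp_smooth(y, lam=1600.0)
```

The larger lam is, the stronger the smoothness prior, and in the limit the fit collapses to the least-squares line (the null space of D2).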
Bayesian evidence and predictivity of the inflationary paradigm
In this paper we consider the issue of paradigm evaluation by applying Bayes'
theorem along the following nested hierarchy of progressively more complex
structures: i) parameter estimation (within a model), ii) model selection and
comparison (within a paradigm), iii) paradigm evaluation. In such a hierarchy
the Bayesian evidence works both as the posterior's normalization at a given
level and as the likelihood function at the next level up. Whilst raising no
objections to the standard application of the procedure at the two lowest
levels, we argue that it should be modified considerably when evaluating
paradigms, where testability and fitting the data are equally important.
By considering toy models we illustrate how models and paradigms that are
difficult to falsify are always favoured by the Bayes factor. We argue that the
evidence for a paradigm should not only be high for a given dataset, but
exceptional with respect to what it would have been, had the data been
different. With this motivation we propose a measure which we term
predictivity, as well as a prior to be incorporated into the Bayesian
framework, penalising unpredictivity as much as not fitting data. We apply this
measure to inflation seen as a whole, and to a scenario where a specific
inflationary model is hypothetically deemed as the only one viable as a result
of information alien to cosmology (e.g. Solar System gravity experiments, or
particle physics input). We conclude that cosmic inflation is currently hard to
falsify, but that this could change were external/additional information to
cosmology to select one of its many models. We also compare this state of
affairs to bimetric varying-speed-of-light cosmology.
Comment: Final version with corrections added
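The nested hierarchy in this abstract — the evidence that normalises the posterior at one level serves as the likelihood one level up — can be made concrete with a toy Gaussian example of my own construction (the specific models and numbers are illustrative assumptions, not the paper's):

```python
import math

def evidence_fixed(x, theta=0.0):
    """Model M1: x ~ N(theta, 1) with theta fixed; the evidence is just the likelihood."""
    return math.exp(-0.5 * (x - theta) ** 2) / math.sqrt(2 * math.pi)

def evidence_gaussian_prior(x, s2=10.0):
    """Model M2: x ~ N(theta, 1), theta ~ N(0, s2); marginal is N(0, 1 + s2)."""
    v = 1.0 + s2
    return math.exp(-0.5 * x * x / v) / math.sqrt(2 * math.pi * v)

x = 0.3                                     # a single datum near zero
z1, z2 = evidence_fixed(x), evidence_gaussian_prior(x)
bayes_factor = z1 / z2                      # model comparison within the paradigm
paradigm_evidence = 0.5 * z1 + 0.5 * z2     # model evidences act as likelihoods
                                            # one level up, weighted by model priors
```

Here the model-level evidences z1 and z2 play exactly the role of likelihood terms in the paradigm-level evidence, which is the structural point the abstract makes before arguing that this machinery needs modification at the paradigm level.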
Adaptive Bernstein-von Mises theorems in Gaussian white noise
We investigate Bernstein-von Mises theorems for adaptive nonparametric
Bayesian procedures in the canonical Gaussian white noise model. We consider
both a Hilbert space and a multiscale setting, with applications in $L^2$ and
$L^\infty$ respectively. This provides a theoretical justification for plug-in
procedures, for example the use of certain credible sets for sufficiently
smooth linear functionals. We use this general approach to construct optimal
frequentist confidence sets based on the posterior distribution. We also
provide simulations to numerically illustrate our approach and obtain a visual
representation of the geometries involved.
Comment: 48 pages, 5 figures
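The Bernstein-von Mises phenomenon behind this abstract — that credible sets for sufficiently smooth functionals have the correct frequentist coverage — can be illustrated in a much simpler parametric toy model of my own construction (not the paper's multiscale setting; the prior variance and trial counts are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n, sigma, theta0 = 100, 1.0, 0.7   # sample size, noise level, true parameter
trials, covered = 2000, 0
for _ in range(trials):
    # Sufficient statistic of n observations: xbar ~ N(theta0, sigma^2/n).
    xbar = theta0 + sigma / np.sqrt(n) * rng.standard_normal()
    # Conjugate prior theta ~ N(0, 100) gives a normal posterior N(m, v).
    v = 1.0 / (n / sigma**2 + 1.0 / 100.0)
    m = v * (n / sigma**2) * xbar
    lo, hi = m - 1.96 * np.sqrt(v), m + 1.96 * np.sqrt(v)
    covered += (lo <= theta0 <= hi)
coverage = covered / trials   # should sit near the nominal 0.95
```

Because the posterior is asymptotically centred at an efficient estimator with the sampling variance, the 95% credible interval doubles as a valid frequentist confidence interval — the parametric template for the nonparametric plug-in results the paper establishes.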
The Extended Hodrick-Prescott (HP) Filter for Spatial Regression Smoothing
The Hodrick-Prescott (HP) method is a popular smoothing method for economic time series, used to extract a long-term component from stationary series such as growth rates. The new extended HP smoothing model is applied to data sets with an underlying metric and requires a Bayesian linear regression model with a strong prior based on differencing matrices for the smoothness parameter and a weak prior for the regression part. We define a Bayesian spatial smoothing model with neighbors for each observation and a smoothness prior similar to the HP filter in time series. This opens a new approach to model-based smoothers for time series and spatial models based on MCMC. We apply it to the NUTS-2 regions of the European Union for regional GDP and GDP per capita, where the fixed effects are removed by an extended HP smoothing model.
Keywords: Hodrick-Prescott (HP) smoothers, smoothed square loss function, spatial smoothing, smoothness prior, Bayesian econometrics
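A spatial analogue of the HP smoothness prior, as described in this abstract, penalises differences between each observation and its neighbors. A minimal sketch of my own (not the paper's MCMC implementation; the neighbor structure and lam value are illustrative assumptions) replaces the second-difference penalty with a graph Laplacian L built from the neighbor lists:

```python
import numpy as np

def spatial_smooth(y, neighbors, lam=10.0):
    """Posterior-mean smoother tau = (I + lam*L)^{-1} y with a neighbor-difference prior.

    neighbors: dict mapping each observation index to its neighboring indices;
    L is the corresponding graph Laplacian, so lam*L is the prior precision.
    """
    n = len(y)
    L = np.zeros((n, n))
    for i, nbrs in neighbors.items():
        L[i, i] = len(nbrs)
        for j in nbrs:
            L[i, j] = -1.0
    return np.linalg.solve(np.eye(n) + lam * L, y)

# Usage: four regions on a line; smoothing pulls each value toward its neighbors
# while preserving the overall mean (the Laplacian's rows sum to zero).
y = np.array([0.0, 5.0, 0.0, 5.0])
nbrs = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
tau = spatial_smooth(y, nbrs, lam=10.0)
```

As lam grows, tau shrinks toward the constant field (the null space of L), the spatial counterpart of the HP filter's limit behaviour in time.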