Rates of contraction of posterior distributions based on Gaussian process priors
We derive rates of contraction of posterior distributions on nonparametric or
semiparametric models based on Gaussian processes. The rate of contraction is
shown to depend on the position of the true parameter relative to the
reproducing kernel Hilbert space of the Gaussian process and the small ball
probabilities of the Gaussian process. We determine these quantities for a
range of examples of Gaussian priors and in several statistical settings. For
instance, we consider the rate of contraction of the posterior distribution
based on sampling from a smooth density model when the prior models the log
density as a (fractionally integrated) Brownian motion. We also consider
regression with Gaussian errors and smooth classification under a logistic or
probit link function combined with various priors.
Comment: Published at http://dx.doi.org/10.1214/009053607000000613 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).
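The regression-with-Gaussian-errors setting mentioned above admits a closed-form posterior, which makes the contraction phenomenon easy to see numerically. The following is a minimal sketch only; the squared-exponential kernel, the hyperparameters, and the synthetic data are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def gp_posterior(x_train, y_train, x_test, noise_var=0.1, length=0.2):
    """Gaussian-process regression posterior mean and variance.
    Squared-exponential kernel; all hyperparameters are illustrative."""
    def k(a, b):
        return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * length ** 2))
    K = k(x_train, x_train) + noise_var * np.eye(len(x_train))
    Ks = k(x_test, x_train)
    mean = Ks @ np.linalg.solve(K, y_train)
    var = 1.0 - np.einsum('ij,ji->i', Ks, np.linalg.solve(K, Ks.T))
    return mean, var

rng = np.random.default_rng(0)
f = lambda x: np.sin(2 * np.pi * x)   # a stand-in "true" regression function
x_test = np.linspace(0, 1, 50)
errs = []
for n in (20, 200):
    x = rng.uniform(0, 1, n)
    y = f(x) + np.sqrt(0.1) * rng.normal(size=n)
    mean, _ = gp_posterior(x, y, x_test)
    errs.append(np.mean((mean - f(x_test)) ** 2))
# posterior-mean error shrinks as n grows, mirroring posterior contraction
```

The paper's rates quantify how fast such errors can decay, depending on where the truth sits relative to the RKHS; the sketch only illustrates the qualitative shrinkage.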
Adaptive Bayesian estimation using a Gaussian random field with inverse Gamma bandwidth
We consider nonparametric Bayesian estimation using a rescaled
smooth Gaussian field as a prior for a multidimensional function. The rescaling
is achieved using a Gamma variable and the procedure can be viewed as choosing
an inverse Gamma bandwidth. The procedure is studied from a frequentist
perspective in three statistical settings involving replicated observations
(density estimation, regression and classification). We prove that the
resulting posterior distribution shrinks to the distribution that generates the
data at a speed which is minimax-optimal up to a logarithmic factor, whatever
the regularity level of the data-generating distribution. Thus the hierarchical
Bayesian procedure, with a fixed prior, is shown to be fully adaptive.
Comment: Published at http://dx.doi.org/10.1214/08-AOS678 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).
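The rescaling construction can be mimicked in a few lines: draw a Gamma variable A and evaluate a smooth Gaussian field on the stretched grid A·t, so that 1/A plays the role of a random bandwidth. The base kernel and all parameter values below are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_rescaled_gp(grid, a=1.0, b=1.0):
    """One draw from t -> W(A t), with A ~ Gamma(a, rate b) and W a smooth
    (squared-exponential) Gaussian field; 1/A acts as a random bandwidth.
    Parameter values are illustrative, not taken from the paper."""
    A = rng.gamma(a, 1.0 / b)
    s = A * grid
    K = np.exp(-0.5 * (s[:, None] - s[None, :]) ** 2)
    K += 1e-8 * np.eye(len(grid))  # jitter for numerical stability
    return A, rng.multivariate_normal(np.zeros(len(grid)), K)

grid = np.linspace(0.0, 1.0, 100)
A, path = sample_rescaled_gp(grid)  # larger A => wigglier sample path
```

Because A is random, the prior mixes over all bandwidths at once, which is what lets a single fixed prior adapt to unknown regularity.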
Rejoinder to discussions of "Frequentist coverage of adaptive nonparametric Bayesian credible sets"
Rejoinder of "Frequentist coverage of adaptive nonparametric Bayesian
credible sets" by Szab\'o, van der Vaart and van Zanten [arXiv:1310.4489v5].
Comment: Published at http://dx.doi.org/10.1214/15-AOS1270REJ in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).
Frequentist coverage of adaptive nonparametric Bayesian credible sets
We investigate the frequentist coverage of Bayesian credible sets in a
nonparametric setting. We consider a scale of priors of varying regularity and
choose the regularity by an empirical Bayes method. Next we consider a central
set of prescribed posterior probability in the posterior distribution of the
chosen regularity. We show that such an adaptive Bayes credible set gives
correct uncertainty quantification of "polished tail" parameters, in the sense
of high probability of coverage of such parameters. On the negative side, we
show by theory and example that adaptation of the prior necessarily leads to
gross and haphazard uncertainty quantification for some true parameters that
are still within the hyperrectangle regularity scale.
Comment: Published at http://dx.doi.org/10.1214/14-AOS1270 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).
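In the Gaussian sequence formulation often used for results of this kind, the empirical Bayes step chooses the prior regularity by maximizing the marginal likelihood, and the credible set is then built from the plug-in posterior. A schematic version follows; the model, the "true" coefficients, and the grid of regularities are illustrative assumptions, not the paper's construction:

```python
import numpy as np

rng = np.random.default_rng(4)

n, N = 500, 300                       # "sample size" and truncation level
i = np.arange(1, N + 1)
theta = i ** -2.0                     # an illustrative smooth truth
y = theta + rng.normal(size=N) / np.sqrt(n)

def log_marginal(alpha):
    """Marginal log-likelihood of y under the prior theta_i ~ N(0, i^(-1-2*alpha))."""
    v = i ** (-1.0 - 2 * alpha) + 1.0 / n   # marginal variance of y_i
    return -0.5 * np.sum(np.log(v) + y ** 2 / v)

alphas = np.linspace(0.1, 4.0, 40)
a_hat = alphas[np.argmax([log_marginal(a) for a in alphas])]

# plug-in posterior at the empirically chosen regularity
tau2 = i ** (-1.0 - 2 * a_hat)
post_mean = tau2 * y / (tau2 + 1.0 / n)
post_var = (tau2 / n) / (tau2 + 1.0 / n)
```

A credible set of prescribed posterior probability is then a ball around `post_mean`; the paper's positive and negative results concern when such a set actually covers the truth with high probability.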
Bayesian inverse problems with Gaussian priors
The posterior distribution in a nonparametric inverse problem is shown to
contract to the true parameter at a rate that depends on the smoothness of the
parameter, and the smoothness and scale of the prior. Correct combinations of
these characteristics lead to the minimax rate. The frequentist coverage of
credible sets is shown to depend on the combination of prior and true
parameter, with smoother priors leading to zero coverage and rougher priors to
conservative coverage. In the latter case credible sets are of the correct
order of magnitude. The results are numerically illustrated by the problem of
recovering a function from observation of a noisy version of its primitive.
Comment: Published at http://dx.doi.org/10.1214/11-AOS920 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).
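In a sequence-space caricature of such an inverse problem, one observes y_i = κ_i θ_i + noise with decaying singular values κ_i, and the Gaussian posterior is available coefficientwise. The decay rates, truth, and prior below are illustrative stand-ins, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(2)

n, N, alpha = 1000, 200, 1.0            # noise level 1/sqrt(n), truncation, prior smoothness
i = np.arange(1, N + 1)
kappa = 1.0 / (i * np.pi)               # rough stand-in for the primitive's singular values
theta = i ** -1.5 * np.sin(i)           # an illustrative "true" coefficient sequence
y = kappa * theta + rng.normal(size=N) / np.sqrt(n)

tau2 = i ** (-1.0 - 2 * alpha)          # Gaussian prior variances
post_mean = tau2 * kappa * y / (tau2 * kappa ** 2 + 1.0 / n)
post_var = (tau2 / n) / (tau2 * kappa ** 2 + 1.0 / n)
```

The interplay the paper analyzes is visible here: how fast `tau2` decays relative to the truth governs both the contraction rate and whether credible intervals built from `post_var` cover the true coefficients.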
Three-Dimensional Time-Resolved Trajectories from Laboratory Insect Swarms
Aggregations of animals display complex and dynamic behaviour, both at the individual level and on the level of the group as a whole. Often, this behaviour is collective, so that the group exhibits properties that are distinct from those of the individuals. In insect swarms, the motion of individuals is typically convoluted, and swarms display neither net polarization nor correlation. The swarms themselves, however, remain nearly stationary and maintain their cohesion even in noisy natural environments. This behaviour stands in contrast with other forms of collective animal behaviour, such as flocking, schooling, or herding, where the motion of individuals is more coordinated, and thus swarms provide a powerful way to study the underpinnings of collective behaviour as distinct from global order. Here, we provide a data set of three-dimensional, time-resolved trajectories, including positions, velocities, and accelerations, of individual insects in laboratory insect swarms. The data can be used to study the collective as a whole as well as the dynamics and behaviour of individuals within the swarm.
Sequential Data-Adaptive Bandwidth Selection by Cross-Validation for Nonparametric Prediction
We consider the problem of bandwidth selection by cross-validation from a
sequential point of view in a nonparametric regression model. Having in mind
that in applications one often aims at estimation, prediction and change
detection simultaneously, we investigate that approach for sequential kernel
smoothers in order to base these tasks on a single statistic. We provide
uniform weak laws of large numbers and weak consistency results for the
cross-validated bandwidth. Extensions to weakly dependent error terms are
discussed as well. The errors may be {\alpha}-mixing or L2-near epoch
dependent, which guarantees that the uniform convergence of the cross
validation sum and the consistency of the cross-validated bandwidth hold true
for a large class of time series. The method is illustrated by analyzing
photovoltaic data.
Comment: 26 pages.
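The single-statistic idea can be illustrated with a minimal leave-one-out cross-validation loop for a kernel smoother. Nadaraya-Watson with a Gaussian kernel is used here as a generic stand-in; the synthetic data and bandwidth grid are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

def nw_loo_cv(x, y, bandwidths):
    """Leave-one-out cross-validation scores for the Nadaraya-Watson
    estimator with a Gaussian kernel (illustrative sketch)."""
    scores = []
    for h in bandwidths:
        w = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * h ** 2))
        np.fill_diagonal(w, 0.0)          # leave each point out of its own fit
        pred = (w @ y) / w.sum(axis=1)
        scores.append(np.mean((y - pred) ** 2))
    return np.array(scores)

x = rng.uniform(0, 1, 200)
y = np.sin(2 * np.pi * x) + 0.3 * rng.normal(size=200)
bandwidths = np.linspace(0.01, 0.5, 25)
scores = nw_loo_cv(x, y, bandwidths)
h_cv = bandwidths[np.argmin(scores)]      # data-adaptive bandwidth
```

The paper's sequential viewpoint updates this criterion as observations arrive, so the same cross-validated statistic can serve estimation, prediction, and change detection at once.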