Bayes and empirical Bayes: do they merge?
Bayesian inference is attractive for its coherence and good frequentist
properties. However, it is a common experience that eliciting an honest prior
may be difficult and, in practice, people often take an empirical Bayes
approach, plugging empirical estimates of the prior hyperparameters into the
posterior distribution. Even if not rigorously justified, the underlying idea
is that, when the sample size is large, empirical Bayes leads to "similar"
inferential answers. Yet, precise mathematical results seem to be missing. In
this work, we give a more rigorous justification in terms of merging of Bayes
and empirical Bayes posterior distributions. We consider two notions of
merging: Bayesian weak merging and frequentist merging in total variation.
Since weak merging is related to consistency, we provide sufficient conditions
for consistency of empirical Bayes posteriors. Also, we show that, under
regularity conditions, the empirical Bayes procedure asymptotically selects the
value of the hyperparameter for which the prior mostly favors the "truth".
Examples include empirical Bayes density estimation with Dirichlet process
mixtures.
Comment: 27 pages
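The plug-in idea the abstract describes can be sketched in the simplest conjugate setting. This is a minimal illustration, not the paper's construction: a compound normal-means model in which the prior variance (the hyperparameter) is estimated from the marginal distribution of the data and then plugged into the posterior.

```python
import numpy as np

# Compound normal-means model: x_i ~ N(theta_i, 1) with prior
# theta_i ~ N(0, tau^2). The hyperparameter tau^2 is unknown, so we plug in
# an empirical estimate obtained from the marginal x_i ~ N(0, 1 + tau^2).
rng = np.random.default_rng(0)
n, tau2_true = 500, 4.0
theta = rng.normal(0.0, np.sqrt(tau2_true), size=n)
x = theta + rng.normal(size=n)

# Marginal method-of-moments estimate of tau^2 (truncated at zero).
tau2_hat = max(np.mean(x**2) - 1.0, 0.0)

# Empirical Bayes ("plug-in") posterior: theta_i | x_i ~ N(w * x_i, w),
# with shrinkage weight w = tau2_hat / (1 + tau2_hat).
w = tau2_hat / (1.0 + tau2_hat)
eb_posterior_mean = w * x
```

As the sample size grows, `tau2_hat` converges to the true hyperparameter and the plug-in posterior tracks the Bayes posterior that uses it, which is the "merging" phenomenon the paper makes precise.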
Asymptotically minimax empirical Bayes estimation of a sparse normal mean vector
For the important classical problem of inference on a sparse high-dimensional
normal mean vector, we propose a novel empirical Bayes model that admits a
posterior distribution with desirable properties under mild conditions. In
particular, our empirical Bayes posterior distribution concentrates on balls,
centered at the true mean vector, with squared radius proportional to the
minimax rate, and its posterior mean is an asymptotically minimax estimator. We
also show that, asymptotically, the support of our empirical Bayes posterior
has roughly the same effective dimension as the true sparse mean vector.
Simulation from our empirical Bayes posterior is straightforward, and our
numerical results demonstrate the quality of our method compared to others
having similar large-sample properties.
Comment: 18 pages, 3 figures, 3 tables
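A hedged sketch of the general idea (not the paper's exact prior): a two-groups empirical Bayes model for a sparse normal mean, where the sparsity level is estimated by marginal maximum likelihood over a grid and then plugged into the posterior. The slab variance `psi2` is fixed here only for simplicity.

```python
import numpy as np

# Two-groups model: x_i ~ N(theta_i, 1), theta_i = 0 with probability 1 - p
# and theta_i ~ N(0, psi2) with probability p. The sparsity level p is
# estimated by marginal maximum likelihood, then plugged into the posterior.
rng = np.random.default_rng(1)
n, k, psi2 = 200, 10, 25.0              # k nonzero means out of n
theta = np.zeros(n)
theta[:k] = rng.normal(0.0, np.sqrt(psi2), size=k)
x = theta + rng.normal(size=n)

def mixture_parts(p):
    """Null and slab marginal densities of each x_i, weighted by p."""
    null = np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)
    slab = np.exp(-0.5 * x**2 / (1 + psi2)) / np.sqrt(2 * np.pi * (1 + psi2))
    return (1 - p) * null, p * slab

def marginal_loglik(p):
    null_part, slab_part = mixture_parts(p)
    return np.sum(np.log(null_part + slab_part))

grid = np.linspace(0.01, 0.5, 50)
p_hat = grid[np.argmax([marginal_loglik(p) for p in grid])]

# Posterior inclusion probabilities and plug-in posterior mean: each
# coordinate is shrunk toward zero in proportion to its evidence of being null.
null_part, slab_part = mixture_parts(p_hat)
incl = slab_part / (null_part + slab_part)
eb_mean = incl * (psi2 / (1 + psi2)) * x
```

The resulting posterior mean leaves large observations nearly untouched while shrinking small ones aggressively toward zero, mimicking the selective-shrinkage behavior that minimax estimation over sparse classes requires.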
An assessment of empirical Bayes and composite estimators for small areas
We compare a set of empirical Bayes and composite estimators of the population means of the districts (small areas) of a country, and show that the natural modelling strategy of searching for a well-fitting empirical Bayes model and using it for estimation of the area-level means can be inefficient.
Keywords: composite estimator, empirical Bayes models, mean squared error, small-area estimation
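For readers unfamiliar with the terminology, a composite estimator can be sketched as follows. All quantities here are illustrative assumptions (simulated districts, known sampling variances), not the paper's data or model: each area estimate is a convex combination of the area's direct estimate and a synthetic overall-mean estimate, weighted by the relative sizes of the sampling and between-area variances.

```python
import numpy as np

# Illustrative composite estimation for small areas.
rng = np.random.default_rng(2)
d = 20                                   # number of small areas (districts)
n_d = rng.integers(5, 30, size=d)        # area sample sizes
area_means = rng.normal(50.0, 5.0, size=d)
direct = area_means + rng.normal(0.0, 10.0 / np.sqrt(n_d))  # direct estimates
synthetic = direct.mean()                # synthetic (overall-mean) estimate

var_e = (10.0**2) / n_d                  # sampling variances (assumed known)
var_a = max(direct.var(ddof=1) - var_e.mean(), 0.1)  # crude between-area var
gamma = var_a / (var_a + var_e)          # composite weights in (0, 1)
composite = gamma * direct + (1 - gamma) * synthetic
```

Areas with small samples (large sampling variance) get a small weight `gamma` and are pulled toward the synthetic estimate; well-sampled areas stay close to their direct estimates.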
Empirical Bayes conditional density estimation
The problem of nonparametric estimation of the conditional density of a
response, given a vector of explanatory variables, is classical and of
prominent importance in many prediction problems since the conditional density
provides a more comprehensive description of the association between the
response and the predictor than, for instance, does the regression function.
The problem has applications across different fields such as economics,
actuarial sciences, and medicine. We investigate empirical Bayes estimation of
conditional densities, establishing that an automatic, data-driven selection of
the prior hyperparameters in infinite mixtures of Gaussian kernels, with
predictor-dependent mixing weights, can lead to estimators whose performance is
on par with that of frequentist estimators: they are rate adaptive, attaining
the minimax-optimal rate (up to logarithmic factors) over classes of locally
Hölder smooth conditional densities, and they perform an adaptive dimension
reduction when the response is independent of (some of) the explanatory
variables, which, containing no information about the response, are irrelevant
to estimating its conditional density.
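The data-driven hyperparameter selection described above can be illustrated with a deliberately simpler stand-in. This is not the paper's Dirichlet-process mixture: it is a plain kernel conditional-density estimator in which the bandwidth plays the role of a prior hyperparameter and is chosen by leave-one-out likelihood cross-validation.

```python
import numpy as np

# Kernel estimate of p(y | x) with a common bandwidth h, selected by
# maximizing the leave-one-out predictive log-likelihood over a grid.
rng = np.random.default_rng(3)
n = 300
x = rng.uniform(-2.0, 2.0, size=n)
y = np.sin(x) + 0.3 * rng.normal(size=n)     # y | x ~ N(sin(x), 0.09)

def cond_density(y0, x0, h):
    """Nadaraya-Watson-style kernel estimate of p(y0 | x0)."""
    wx = np.exp(-0.5 * ((x - x0) / h) ** 2)
    ky = np.exp(-0.5 * ((y - y0) / h) ** 2) / (h * np.sqrt(2 * np.pi))
    return np.sum(wx * ky) / np.sum(wx)

def loo_loglik(h):
    """Leave-one-out predictive log-likelihood of the data under bandwidth h."""
    total = 0.0
    for i in range(n):
        wx = np.exp(-0.5 * ((x - x[i]) / h) ** 2)
        ky = np.exp(-0.5 * ((y - y[i]) / h) ** 2) / (h * np.sqrt(2 * np.pi))
        wx[i] = 0.0                           # leave observation i out
        total += np.log(np.sum(wx * ky) / np.sum(wx))
    return total

grid = np.array([0.1, 0.2, 0.3, 0.5, 0.8])
h_hat = grid[np.argmax([loo_loglik(h) for h in grid])]
```

The selected `h_hat` adapts to the smoothness of the data, in loose analogy with the automatic hyperparameter selection that gives the empirical Bayes posterior its adaptive rates.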
- …