Convergence analysis of block Gibbs samplers for Bayesian linear mixed models with
Exploration of the intractable posterior distributions associated with
Bayesian versions of the general linear mixed model is often performed using
Markov chain Monte Carlo. In particular, if a conditionally conjugate prior is
used, then there is a simple two-block Gibbs sampler available. Román and
Hobert [Linear Algebra Appl. 473 (2015) 54-77] showed that, when the priors are
proper and the design matrix has full column rank, the Markov chains underlying
these Gibbs samplers are nearly always geometrically ergodic. In this paper,
Román and Hobert's (2015) result is extended by allowing improper priors on
the variance components and, more importantly, by removing all assumptions on
the design matrix. So, not only is the design matrix allowed to be (column)
rank deficient, which provides additional flexibility in parameterizing the
fixed effects, it is also allowed to have more columns than rows, which is
necessary in the increasingly important situation where the number of
predictors exceeds the sample size. The full column rank assumption is at the
heart of Román and Hobert's (2015) proof. Consequently, the extension to an
unrestricted design matrix requires a substantially different analysis.
Comment: Published at http://dx.doi.org/10.3150/15-BEJ749 in the Bernoulli
(http://isi.cbs.nl/bernoulli/) by the International Statistical
Institute/Bernoulli Society (http://isi.cbs.nl/BS/bshome.htm)
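The two-block structure described in this abstract can be sketched as follows. This is a minimal illustration, not the paper's exact setup: the model y = X beta + Z u + e, the normal/inverse-gamma priors, and all hyperparameters (tau2, a, b) and dimensions are illustrative assumptions. One block draws (beta, u) jointly from its multivariate normal full conditional; the other draws the two variance components from their inverse-gamma full conditionals.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data for y = X beta + Z u + e (dimensions are illustrative).
n, p, q = 50, 3, 5
X = rng.standard_normal((n, p))
Z = rng.standard_normal((n, q))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.standard_normal(n)

# Assumed conditionally conjugate priors:
#   beta ~ N(0, tau2 * I),  u ~ N(0, sig_u2 * I),
#   sig_e2, sig_u2 ~ inverse-gamma(a, b).
tau2, a, b = 100.0, 2.0, 1.0
W = np.hstack([X, Z])

def gibbs(n_iter=500):
    sig_e2, sig_u2 = 1.0, 1.0
    draws = []
    for _ in range(n_iter):
        # Block 1: (beta, u) | variances is multivariate normal.
        prior_prec = np.diag(np.r_[np.full(p, 1/tau2), np.full(q, 1/sig_u2)])
        A = W.T @ W / sig_e2 + prior_prec            # posterior precision
        mean = np.linalg.solve(A, W.T @ y / sig_e2)
        L = np.linalg.cholesky(A)                    # A = L @ L.T
        theta = mean + np.linalg.solve(L.T, rng.standard_normal(p + q))
        # Block 2: variances | (beta, u) are inverse-gamma
        # (if G ~ Gamma(shape, scale=1/rate), then 1/G is inverse-gamma).
        resid = y - W @ theta
        sig_e2 = 1 / rng.gamma(a + n/2, 1/(b + resid @ resid / 2))
        sig_u2 = 1 / rng.gamma(a + q/2, 1/(b + theta[p:] @ theta[p:] / 2))
        draws.append(np.r_[theta, sig_e2, sig_u2])
    return np.array(draws)

out = gibbs()
```

Sampling theta as mean + solve(L.T, z) uses the standard Cholesky trick: the resulting covariance is (L L.T)^{-1} = A^{-1}, so no explicit matrix inverse is formed.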
Sufficient burn-in for Gibbs samplers for a hierarchical random effects model
We consider Gibbs and block Gibbs samplers for a Bayesian hierarchical
version of the one-way random effects model. Drift and minorization conditions
are established for the underlying Markov chains. The drift and minorization
are used in conjunction with results from J. S. Rosenthal [J. Amer. Statist.
Assoc. 90 (1995) 558-566] and G. O. Roberts and R. L. Tweedie [Stochastic
Process. Appl. 80 (1999) 211-229] to construct analytical upper bounds on the
distance to stationarity. These lead to upper bounds on the amount of burn-in
that is required to get the chain within a prespecified (total variation)
distance of the stationary distribution. The results are illustrated with a
numerical example.
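A Gibbs sampler for a Bayesian one-way random effects model of the kind analyzed above can be sketched as follows. The specification here is an illustrative assumption (flat prior on the overall mean, inverse-gamma priors on both variance components, balanced groups), not necessarily the hierarchy studied in the paper; the point is the cycle of full-conditional draws whose convergence the drift and minorization bounds control.

```python
import numpy as np

rng = np.random.default_rng(1)

# Balanced one-way data: y_ij ~ N(theta_i, sig_e2), theta_i ~ N(mu, sig_t2).
K, m = 8, 10                         # K groups, m observations each
true_theta = rng.normal(0.0, 2.0, K)
y = true_theta[:, None] + rng.standard_normal((K, m))

a, b = 2.0, 1.0                      # assumed inverse-gamma hyperparameters

def gibbs(n_iter=1000):
    mu, sig_t2, sig_e2 = 0.0, 1.0, 1.0
    chain = []
    for _ in range(n_iter):
        # theta_i | rest: precision-weighted blend of group mean and mu.
        prec = m / sig_e2 + 1.0 / sig_t2
        mean = (m * y.mean(axis=1) / sig_e2 + mu / sig_t2) / prec
        theta = mean + rng.standard_normal(K) / np.sqrt(prec)
        # mu | rest (flat prior on mu).
        mu = rng.normal(theta.mean(), np.sqrt(sig_t2 / K))
        # Variance components | rest are inverse-gamma.
        sig_t2 = 1 / rng.gamma(a + K/2, 1/(b + ((theta - mu)**2).sum()/2))
        sig_e2 = 1 / rng.gamma(a + K*m/2,
                               1/(b + ((y - theta[:, None])**2).sum()/2))
        chain.append((mu, sig_t2, sig_e2))
    return np.array(chain)

chain = gibbs()
```

Analytical burn-in bounds of the kind derived in the paper would tell us, before running this loop, how many iterations to discard; in practice one would plug the drift and minorization constants into Rosenthal's total variation bound.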
Estimating the spectral gap of a trace-class Markov operator
The utility of a Markov chain Monte Carlo algorithm is, in large part,
determined by the size of the spectral gap of the corresponding Markov
operator. However, calculating (and even approximating) the spectral gaps of
practical Monte Carlo Markov chains in statistics has proven to be an extremely
difficult and often insurmountable task, especially when these chains move on
continuous state spaces. In this paper, a method for accurate estimation of the
spectral gap is developed for general state space Markov chains whose operators
are non-negative and trace-class. The method is based on the fact that the
second largest eigenvalue (and hence the spectral gap) of such operators can be
bounded above and below by simple functions of the power sums of the
eigenvalues. These power sums often have nice integral representations. A
classical Monte Carlo method is proposed to estimate these integrals, and a
simple sufficient condition for finite variance is provided. This leads to
asymptotically valid confidence intervals for the second largest eigenvalue
(and the spectral gap) of the Markov operator. In contrast with previously
existing techniques, our method is not based on a near-stationary version of
the Markov chain, which, paradoxically, cannot be obtained in a principled
manner without bounds on the spectral gap. On the other hand, it can be quite
expensive from a computational standpoint. The efficiency of the method is
studied both theoretically and empirically.
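The power-sum bounds underlying the method can be illustrated on a small finite-state chain, where the traces are computable exactly (the paper's contribution is estimating these quantities by classical Monte Carlo on general state spaces, where they are intractable). The transition matrix below is an arbitrary illustrative choice, reversible with non-negative eigenvalues so that it mimics a non-negative trace-class operator. Writing s_k for the sum of k-th powers of the eigenvalues and lambda_1 for the second-largest eigenvalue, we have (s_{k+1} - 1)/(s_k - 1) <= lambda_1 <= (s_k - 1)^(1/k).

```python
import numpy as np

# Small symmetric (hence reversible) transition matrix with eigenvalues
# 1 > lambda_1 >= lambda_2 >= 0, standing in for a trace-class operator.
P = np.array([[0.7, 0.2, 0.1],
              [0.2, 0.6, 0.2],
              [0.1, 0.2, 0.7]])

def power_sum(P, k):
    # s_k = trace(P^k) = sum of the k-th powers of the eigenvalues.
    return np.trace(np.linalg.matrix_power(P, k))

k = 20
s_k  = power_sum(P, k)
s_k1 = power_sum(P, k + 1)

# Sandwich the second-largest eigenvalue between power-sum functions:
# every eigenvalue below lambda_1 shrinks faster under powering, so
# ratios and roots of (s_k - 1) pin lambda_1 down from both sides.
lower = (s_k1 - 1.0) / (s_k - 1.0)
upper = (s_k - 1.0) ** (1.0 / k)

lam = np.sort(np.linalg.eigvals(P).real)[-2]  # true lambda_1, for comparison
print(lower, lam, upper)
```

The spectral gap is then 1 - lambda_1, so the same interval bounds it from both sides; as k grows the two bounds tighten around the true value.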
When is Eaton's Markov chain irreducible?
Consider a parametric statistical model and an improper prior distribution
that together yield a proper formal posterior distribution. The prior is
called strongly admissible if the generalized Bayes estimator of every bounded
function of the parameter is admissible under squared error loss. Eaton [Ann.
Statist. 20 (1992) 1147-1179] has shown that a sufficient condition for strong
admissibility of the prior is the local recurrence of a Markov chain whose
transition function is built from the model and the posterior. Applications of
this result and its extensions are often greatly simplified when this Markov
chain is irreducible. However, establishing irreducibility can be difficult. In
this paper, we provide a characterization of irreducibility for general state
space Markov chains and use this characterization to develop an easily checked,
necessary and sufficient condition for irreducibility of Eaton's Markov chain.
All that is required to check this condition is a simple examination of the
model and the prior. Application of the main result is illustrated using two
examples.
Comment: Published at http://dx.doi.org/10.3150/07-BEJ6191 in the Bernoulli
(http://isi.cbs.nl/bernoulli/) by the International Statistical
Institute/Bernoulli Society (http://isi.cbs.nl/BS/bshome.htm)