Densities for random balanced sampling
A random balanced sample (RBS) is a multivariate distribution with n
components X_1,...,X_n, each uniformly distributed on [-1, 1], such that the
sum of these components is precisely 0. The corresponding vectors X lie in an
(n-1)-dimensional polytope M(n). We present new methods for the construction of
such RBS via densities over M(n); these methods apply for arbitrary n. While simple
densities had been known previously for small values of n (namely 2,3 and 4),
for larger n the known distributions with large support were fractal
distributions (with fractal dimension asymptotic to n as n approaches
infinity). Applications of RBS distributions include sampling with antithetic
coupling to reduce variance, and the isolation of nonlinearities. We also show
that the previously known densities (for n<5) are in fact the only solutions in
a natural and very large class of potential RBS densities. This finding
clarifies the need for new methods, such as those presented here.
Comment: 20 pages, 6 figures, to appear in Journal of Multivariate Analysis
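As a toy illustration of the abstract's n = 2 case and of the variance-reduction application it mentions, the pair (U, -U) with U uniform on [-1, 1] is itself a random balanced sample: each component is uniform on [-1, 1] and the sum is exactly 0. The sketch below (an assumed example, not the paper's construction; the odd integrand f is hypothetical) shows that antithetic coupling with this pair can eliminate Monte Carlo variance entirely for an odd integrand:

```python
import random

random.seed(0)

def rbs_pair():
    """Random balanced sample for n = 2: both components are
    Uniform[-1, 1] and they sum to exactly 0."""
    u = random.uniform(-1.0, 1.0)
    return (u, -u)

def f(x):
    # An odd test integrand (illustrative choice, not from the paper).
    return x**3 + x

def var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

n = 10_000
# Independent sampling: 2n draws of f(U).
indep = [f(random.uniform(-1, 1)) for _ in range(2 * n)]
# Antithetic coupling: average f over each balanced pair.
anti = []
for _ in range(n):
    x1, x2 = rbs_pair()
    anti.append(0.5 * (f(x1) + f(x2)))

# For odd f, f(u) + f(-u) = 0, so the coupled estimator has zero variance.
print(var(anti) < var(indep))  # True
```

For a general (non-odd) integrand the cancellation is partial rather than exact, but the coupled pairs still reduce variance whenever f is monotone.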
Bayesian methods to overcome the winner's curse in genetic studies
Parameter estimates for associated genetic variants, reported in the initial
discovery samples, are often grossly inflated compared to the values observed
in the follow-up replication samples. This type of bias is a consequence of the
sequential procedure in which the estimated effect of an associated genetic
marker must first pass a stringent significance threshold. We propose a
hierarchical Bayes method in which a spike-and-slab prior is used to account
for the possibility that the significant test result may be due to chance. We
examine the robustness of the method using different priors corresponding to
different degrees of confidence in the testing results and propose a Bayesian
model averaging procedure to combine estimates produced by different models.
The Bayesian estimators yield smaller variance compared to the conditional
likelihood estimator and outperform the latter in studies with low power. We
investigate the performance of the method with simulations and applications to
four real data examples.
Comment: Published at http://dx.doi.org/10.1214/10-AOAS373 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org)
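The selection bias the abstract describes is easy to reproduce by simulation. The sketch below (a hedged illustration of the winner's curse itself, not of the paper's hierarchical Bayes estimator; the effect size, standard error, and threshold are all assumed values) conditions on passing a stringent significance threshold and shows the resulting inflation:

```python
import random

random.seed(1)

true_beta = 0.10   # modest true effect (assumed for illustration)
se = 0.05          # standard error in the discovery sample (assumed)
z_crit = 3.29      # two-sided z threshold, roughly p = 1e-3 (assumed)

estimates, significant = [], []
for _ in range(100_000):
    beta_hat = random.gauss(true_beta, se)
    estimates.append(beta_hat)
    # Only estimates that pass the significance threshold get "reported".
    if abs(beta_hat / se) > z_crit:
        significant.append(beta_hat)

mean_all = sum(estimates) / len(estimates)
mean_sig = sum(significant) / len(significant)
print(round(mean_all, 3))  # close to the true effect of 0.10
print(round(mean_sig, 3))  # noticeably inflated, conditional on significance
```

The unconditional mean is unbiased, while the mean over the "discovered" subset is pulled well above the true effect, which is exactly the bias that a spike-and-slab prior on the effect (allowing the significant result to be chance) is designed to shrink away.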
Multiprocess parallel antithetic coupling for backward and forward Markov Chain Monte Carlo
Antithetic coupling is a general stratification strategy for reducing Monte
Carlo variance without increasing the simulation size. The use of the
antithetic principle in the Monte Carlo literature typically employs two strata
via antithetic quantile coupling. We demonstrate here that further
stratification, obtained by using k>2 (e.g., k=3-10) antithetically coupled
variates, can offer substantial additional gain in Monte Carlo efficiency, in
terms of both variance and bias. The reason for reduced bias is that
antithetically coupled chains can provide a more dispersed search of the state
space than multiple independent chains. The emerging area of perfect simulation
provides a perfect setting for implementing the k-process parallel antithetic
coupling for MCMC because, without antithetic coupling, this class of methods
delivers genuine independent draws. Furthermore, antithetic backward coupling
provides a very convenient theoretical tool for investigating antithetic
forward coupling. However, the generation of k>2 antithetic variates that are
negatively associated, that is, they preserve negative correlation under
monotone transformations, and extremely antithetic, that is, they are as
negatively correlated as possible, is more complicated compared to the case
with k=2. In this paper, we establish a theoretical framework for investigating
such issues. Among the generating methods that we compare, Latin hypercube
sampling and its iterative extension appear to be general-purpose choices,
making another direct link between Monte Carlo and quasi Monte Carlo.Comment: Published at http://dx.doi.org/10.1214/009053604000001075 in the
Annals of Statistics (http://www.imstat.org/aos/) by the Institute of
Mathematical Statistics (http://www.imstat.org
Interacting multiple-try algorithms with different proposal distributions
We propose a new class of interacting Markov chain Monte Carlo (MCMC) algorithms designed for increasing the efficiency of a modified multiple-try Metropolis (MTM) algorithm. The extension with respect to the existing MCMC literature is twofold. The sampler proposed extends the basic MTM algorithm by allowing different proposal distributions in the multiple-try generation step. We exploit the structure of the MTM algorithm with different proposal distributions to naturally introduce an interacting MTM mechanism (IMTM) that expands the class of population Monte Carlo methods and builds connections with the rapidly expanding world of adaptive MCMC. We show the validity of the algorithm and discuss the choice of the selection weights and of the different proposals. We provide numerical studies which show that the new algorithm can perform better than the basic MTM algorithm and that the interaction mechanism allows the IMTM to efficiently explore the state space.
Keywords: Interacting Monte Carlo, Markov chain Monte Carlo, Multiple-try Metropolis, Population Monte Carlo
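For reference, the basic MTM step that this abstract extends can be sketched as follows. This is a minimal illustration under assumed simplifications (a symmetric random-walk proposal, so the selection weights reduce to the target density; a standard-normal target; k and the proposal scale are arbitrary choices), not the paper's IMTM sampler with different proposals per try:

```python
import math
import random

random.seed(3)

def target(x):
    # Unnormalized standard normal density (illustrative target).
    return math.exp(-0.5 * x * x)

def mtm_step(x, k=5, scale=2.0):
    # 1. Draw k candidates from a symmetric proposal around x.
    ys = [x + random.gauss(0.0, scale) for _ in range(k)]
    # 2. With a symmetric proposal, the selection weight of each
    #    candidate is just the target density at that candidate.
    ws = [target(y) for y in ys]
    y = random.choices(ys, weights=ws)[0]
    # 3. Draw k-1 reference points around the selected y; the current
    #    state x completes the reference set.
    xs = [y + random.gauss(0.0, scale) for _ in range(k - 1)] + [x]
    # 4. Generalized Metropolis ratio of summed weights.
    accept = min(1.0, sum(ws) / sum(target(z) for z in xs))
    return y if random.random() < accept else x

x, chain = 0.0, []
for _ in range(20_000):
    x = mtm_step(x)
    chain.append(x)

mean = sum(chain) / len(chain)
second = sum(c * c for c in chain) / len(chain)
print(round(mean, 2), round(second, 2))  # near 0 and 1 for N(0, 1)
```

The IMTM extension described in the abstract would replace the single proposal in step 1 with a different proposal distribution for each of the k tries, with interaction across a population of such chains.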