The efficiency of Monte Carlo samplers is dictated not only by energetic
effects, such as large barriers, but also by entropic effects arising from the
sheer volume of configuration space that must be sampled. The latter effects
appear as an entropic mismatch, or divergence, between the direct and reverse
trial moves. We provide lower and upper bounds for the average acceptance
probability in terms of the Rényi divergence of order 1/2. We show that
asymptotic finiteness of the entropic divergence is a necessary and sufficient
condition for non-vanishing acceptance probabilities in the limit of large
dimension.
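For concreteness, the Rényi divergence of order 1/2 between the forward and
reverse trial distributions p and q is determined by their Bhattacharyya
overlap, and a standard pair of overlap inequalities (Le Cam's) yields
two-sided bounds of the advertised type whenever the average acceptance
probability reduces to the overlap integral; this is a sketch, and the exact
constants in the paper may differ:

    D_{1/2}(p \| q) = -2 \ln \int \sqrt{p(x)\, q(x)} \, dx,
    \qquad
    \tfrac{1}{2}\, e^{-D_{1/2}} \;\le\; \int \min\{p(x), q(x)\} \, dx \;\le\; e^{-D_{1/2}/2}.

In particular, the upper bound e^{-D_{1/2}/2} vanishes exactly when D_{1/2}
diverges, while the lower bound (1/2) e^{-D_{1/2}} stays away from zero
whenever D_{1/2} remains finite, which is the finiteness condition above.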
Furthermore, we demonstrate that the upper bound is reasonably tight by showing
that its exponent is asymptotically exact for systems composed of a large
number of independent and identically distributed subsystems. For the latter
statement, we provide an alternative proof based on reformulating the average
acceptance probability as a large-deviation problem. This reformulation also
yields a class of low-variance estimators for strongly asymmetric
distributions, one of which is sketched below.
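One hedged illustration of such variance reduction (sampling from the
equal-weight mixture m = (p+q)/2 is a generic device assumed here, not
necessarily the specific construction of the paper): both expressions below
are unbiased for the overlap integral \int \min(p, q) \, dx, but the
mixture-based integrand 2\min(p,q)/(p+q) is bounded by 1 and, in this toy
example of two barely overlapping Gaussians, has a smaller variance than the
naive estimator by a modest constant factor.

    import numpy as np

    rng = np.random.default_rng(0)
    mu_p, mu_q, n = 0.0, 4.0, 100_000          # strongly asymmetric: small overlap

    def pdf(x, mu):                             # unit-variance normal density at mean mu
        return np.exp(-0.5 * (x - mu) ** 2) / np.sqrt(2.0 * np.pi)

    # Naive estimator: E_p[min(1, q/p)] from samples of p alone.
    x = rng.normal(mu_p, 1.0, n)
    naive = np.minimum(1.0, pdf(x, mu_q) / pdf(x, mu_p))

    # Mixture estimator: E_m[min(p, q)/m] with m = (p + q)/2.
    which = rng.integers(0, 2, n)
    y = rng.normal(np.where(which == 0, mu_p, mu_q), 1.0)
    p_y, q_y = pdf(y, mu_p), pdf(y, mu_q)
    mixture = np.minimum(p_y, q_y) / (0.5 * (p_y + q_y))

    for name, est in (("naive", naive), ("mixture", mixture)):
        print(f"{name}: mean={est.mean():.5f}  stderr={est.std(ddof=1) / np.sqrt(n):.1e}")

Both means should agree with the exact overlap 2*Phi(-2) ≈ 0.0455 within the
reported standard errors.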
We show that the entropic divergence causes the average displacements to decay
with the number n of dimensions that are updated simultaneously. For systems
that have a well-defined thermodynamic limit, the decay is demonstrated to be
n^{-1/2} for random-walk Monte Carlo and n^{-1/6} for Smart Monte Carlo (SMC);
a sketch of the random-walk scaling follows.
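A minimal numerical sketch, assuming a product of n standard Gaussians as the
target (the classic setting of Roberts, Gelman, and Gilks, where the optimal
proposal width shrinks as 2.38/sqrt(n) and the acceptance rate tends to about
0.234): with sigma proportional to n^{-1/2}, the measured acceptance stays
roughly flat as n grows, whereas any fixed sigma drives it to zero.

    import numpy as np

    def rw_acceptance(n, sigma, steps=20_000, seed=1):
        """Average Metropolis acceptance for an n-dimensional standard Gaussian."""
        rng = np.random.default_rng(seed)
        x = rng.normal(size=n)
        accepted = 0
        for _ in range(steps):
            y = x + sigma * rng.normal(size=n)   # full-dimensional trial move
            log_ratio = 0.5 * (x @ x - y @ y)    # log pi(y) - log pi(x)
            if np.log(rng.random()) < log_ratio:
                x, accepted = y, accepted + 1
        return accepted / steps

    for n in (4, 16, 64, 256):                   # acceptance stays near 0.23-0.4
        print(n, f"{rw_acceptance(n, 2.4 / np.sqrt(n)):.3f}")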
Numerical simulations of the LJ_38 cluster show that SMC is virtually as
efficient as the Markov chain implementation of the Gibbs sampler that is
commonly employed for Lennard-Jones clusters. An application of the
entropic inequalities to the parallel tempering method demonstrates that the
number of replicas increases as the square root of the heat capacity of the
system.
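As a rough numerical sketch of that square-root law (the geometric ladder and
the spacing constant sqrt(2) below are common heuristics assumed for
illustration, not the precise prescription derived from the entropic
inequalities): if the dimensionless heat capacity C is roughly constant over
[T_min, T_max], a temperature ratio of about 1 + sqrt(2/C) between neighboring
replicas keeps the swap acceptance roughly uniform, so the replica count grows
like sqrt(C).

    import math

    def replica_count(C, T_min, T_max):
        """Replicas for a geometric ladder with heuristic ratio 1 + sqrt(2/C)."""
        r = 1.0 + math.sqrt(2.0 / C)
        return math.ceil(math.log(T_max / T_min) / math.log(r)) + 1

    for C in (10, 100, 1000):          # count grows roughly as sqrt(C)
        print(C, replica_count(C, 0.05, 0.30))

Increasing C a hundredfold multiplies the count by roughly ten, consistent
with the square-root scaling.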