Improved Adaptive Rejection Metropolis Sampling Algorithms
Markov Chain Monte Carlo (MCMC) methods, such as the Metropolis-Hastings (MH)
algorithm, are widely used for Bayesian inference. One of the most important
issues for any MCMC method is the convergence of the Markov chain, which
depends crucially on a suitable choice of the proposal density. Adaptive
Rejection Metropolis Sampling (ARMS) is a well-known MH scheme that generates
samples from one-dimensional target densities making use of adaptive piecewise
proposals constructed using support points taken from rejected samples. In this
work we pinpoint a crucial drawback in the adaptive procedure in ARMS: support
points might never be added inside regions where the proposal is below the
target. When this happens in many regions it leads to a poor performance of
ARMS, with the proposal never converging to the target. In order to overcome
this limitation we propose two improved adaptive schemes for constructing the
proposal. The first one is a direct modification of the ARMS procedure that
incorporates support points inside regions where the proposal is below the
target, while satisfying the diminishing adaptation property, one of the
required conditions to assure the convergence of the Markov chain. The second
one is an adaptive independent MH algorithm with the ability to learn from all
previous samples except for the current state of the chain, thus also
guaranteeing the convergence to the invariant density. These two new schemes
improve the adaptive strategy of ARMS and simplify the construction of the
proposals. Numerical results show that the new techniques provide better
performance than the standard ARMS.
Comment: Matlab code provided at http://a2rms.sourceforge.net
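Both abstracts above build on the Metropolis-Hastings accept/reject step. As a minimal illustration of that underlying mechanism (this is a plain random-walk MH sampler, not the ARMS procedure with its adaptive piecewise proposals; the function and parameter names are illustrative):

```python
import numpy as np

def metropolis_hastings(log_target, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis sampler for a one-dimensional target.

    `log_target` is the unnormalised log-density; the symmetric
    Gaussian proposal makes the Hastings correction cancel, leaving
    the acceptance probability min(1, pi(z)/pi(x)).
    """
    rng = np.random.default_rng(seed)
    x = x0
    samples = np.empty(n_samples)
    for i in range(n_samples):
        z = x + step * rng.standard_normal()  # symmetric proposal
        # accept with probability min(1, pi(z)/pi(x))
        if np.log(rng.uniform()) < log_target(z) - log_target(x):
            x = z
        samples[i] = x
    return samples

# Standard-normal target: draws should have mean ~0 and variance ~1.
draws = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
```

ARMS replaces the fixed Gaussian proposal here with an adaptive piecewise envelope refined from rejected samples; the schemes in the abstract change where that envelope gains support points.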
Fast MCMC sampling algorithms on polytopes
We propose and analyze two new MCMC sampling algorithms, the Vaidya walk and
the John walk, for generating samples from the uniform distribution over a
polytope. Both random walks are sampling algorithms derived from interior point
methods. The former is based on volumetric-logarithmic barrier introduced by
Vaidya whereas the latter uses John's ellipsoids. We show that the Vaidya walk
mixes in significantly fewer steps than the logarithmic-barrier based Dikin
walk studied in past work. For a polytope in R^d defined by n linear
constraints, we show that the mixing time from a warm start is bounded
as O(n^{1/2} d^{3/2}), compared to the O(n d) mixing time
bound for the Dikin walk. The cost of each step of the Vaidya walk is of the
same order as the Dikin walk, and at most twice as large in terms of constant
pre-factors. For the John walk, we prove an O(d^{2.5} log^4(n/d))
bound on its mixing time and conjecture
that an improved variant of it could achieve a mixing time of
O(d^2 polylog(n/d)). Additionally, we propose variants
of the Vaidya and John walks that mix in polynomial time from a deterministic
starting point. The speed-up of the Vaidya walk over the Dikin walk is
illustrated in numerical examples.
Comment: 86 pages, 9 figures. First two authors contributed equally
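The Vaidya and John walks refine the Dikin walk, which proposes Gaussian steps shaped by the Hessian of the logarithmic barrier and filters them with a Metropolis correction. A minimal sketch of that baseline Dikin walk over {x : Ax <= b} (not the paper's Vaidya or John walks; the step-size parameter r and all names are illustrative choices):

```python
import numpy as np

def dikin_walk(A, b, x0, n_steps, r=0.5, seed=0):
    """Minimal Dikin walk for uniform sampling over {x : A x <= b}.

    Each proposal is Gaussian with covariance (r^2/d) H(x)^{-1},
    where H(x) = A^T diag(s)^{-2} A is the log-barrier Hessian with
    slacks s = b - A x; a Metropolis filter keeps the uniform law
    invariant.
    """
    rng = np.random.default_rng(seed)
    d = len(x0)

    def hessian(x):
        s = b - A @ x
        return (A / s[:, None] ** 2).T @ A

    def log_q(H, diff):
        # log proposal density, up to a constant shared by both directions
        _, logdet = np.linalg.slogdet(H)
        return 0.5 * logdet - (d / (2 * r * r)) * diff @ H @ diff

    x = np.asarray(x0, float)
    chain = [x.copy()]
    for _ in range(n_steps):
        Hx = hessian(x)
        # sample z ~ N(x, (r^2/d) Hx^{-1})
        L = np.linalg.cholesky(np.linalg.inv(Hx))
        z = x + (r / np.sqrt(d)) * L @ rng.standard_normal(d)
        if np.all(A @ z < b):  # reject proposals outside the polytope
            diff = z - x
            if np.log(rng.uniform()) < log_q(hessian(z), diff) - log_q(Hx, diff):
                x = z
        chain.append(x.copy())
    return np.array(chain)

# Uniform sampling from the square [-1, 1]^2.
A = np.array([[1., 0.], [-1., 0.], [0., 1.], [0., -1.]])
b = np.ones(4)
chain = dikin_walk(A, b, x0=np.zeros(2), n_steps=20000)
```

The Vaidya walk replaces the log-barrier Hessian with the volumetric-logarithmic barrier's, which is what shrinks the n-dependence of the mixing time from linear to square-root.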
A statistical test for Nested Sampling algorithms
Nested sampling is an iterative integration procedure that shrinks the prior
volume towards higher likelihoods by removing one "live" point at a time. A
replacement point is drawn uniformly from the prior above an ever-increasing
likelihood threshold. Thus, the problem of drawing from a space above a certain
likelihood value arises naturally in nested sampling, making algorithms that
solve this problem a key ingredient to the nested sampling framework. If the
drawn points are distributed uniformly, the removal of a point shrinks the
volume in a well-understood way, and the integration of nested sampling is
unbiased. In this work, I develop a statistical test to check whether this is
the case. This "Shrinkage Test" is useful to verify nested sampling algorithms
in a controlled environment. I apply the shrinkage test to a test problem, and
show that some existing algorithms fail to pass it due to over-optimisation. I
then demonstrate that a simple algorithm can be constructed which is robust
against this type of problem. This RADFRIENDS algorithm is, however,
inefficient in comparison to MULTINEST.
Comment: 11 pages, 7 figures. Published in Statistics and Computing, Springer,
September 201
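The idea behind a shrinkage test can be sketched on a toy problem where prior volumes are known analytically. This is a simplified reconstruction, not the paper's exact procedure: with a uniform prior on [0, 1] and likelihood L(u) = u, the prior volume above threshold t is exactly 1 - t, so if the constrained sampler is correct, each per-iteration shrinkage factor is the maximum of n_live uniforms, and shrinkage**n_live should be Uniform(0, 1):

```python
import numpy as np
from scipy import stats

def shrinkage_exponents(sampler, n_live=400, n_iter=2000, seed=0):
    """Run nested sampling on the toy problem and return shrinkage**n_live.

    `sampler(rng, t)` must draw a replacement point from the prior
    restricted to likelihoods above t.  If it does so uniformly, the
    returned values are i.i.d. Uniform(0, 1); systematic deviation
    flags a biased constrained sampler.
    """
    rng = np.random.default_rng(seed)
    live = rng.uniform(size=n_live)
    out = np.empty(n_iter)
    for i in range(n_iter):
        worst = live.argmin()
        t = live[worst]                # current likelihood threshold
        live[worst] = sampler(rng, t)  # replacement draw above t
        # shrinkage = (volume above new minimum) / (volume above t)
        out[i] = (1.0 - live.min()) / (1.0 - t)
    return out ** n_live

# A correct constrained sampler versus one biased towards high likelihoods
# (loosely mimicking the "over-optimisation" failure mode).
u_good = shrinkage_exponents(lambda rng, t: rng.uniform(t, 1.0))
u_bad = shrinkage_exponents(lambda rng, t: rng.uniform(t, 1.0) ** 0.5)
p_good = stats.kstest(u_good, "uniform").pvalue
p_bad = stats.kstest(u_bad, "uniform").pvalue
```

A Kolmogorov-Smirnov test against Uniform(0, 1) then separates the two: the biased sampler shrinks the volume faster than the Beta(n_live, 1) law predicts, driving its p-value to essentially zero.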