
    Improved Adaptive Rejection Metropolis Sampling Algorithms

    Markov Chain Monte Carlo (MCMC) methods, such as the Metropolis-Hastings (MH) algorithm, are widely used for Bayesian inference. One of the most important issues for any MCMC method is the convergence of the Markov chain, which depends crucially on a suitable choice of the proposal density. Adaptive Rejection Metropolis Sampling (ARMS) is a well-known MH scheme that generates samples from one-dimensional target densities using adaptive piecewise proposals constructed from support points taken from rejected samples. In this work we pinpoint a crucial drawback of the adaptive procedure in ARMS: support points might never be added inside regions where the proposal is below the target. When this happens in many regions, ARMS performs poorly and the proposal never converges to the target. To overcome this limitation we propose two improved adaptive schemes for constructing the proposal. The first is a direct modification of the ARMS procedure that incorporates support points inside regions where the proposal is below the target, while satisfying the diminishing adaptation property, one of the conditions required to ensure convergence of the Markov chain. The second is an adaptive independent MH algorithm that learns from all previous samples except the current state of the chain, which also guarantees convergence to the invariant density. Both schemes improve the adaptive strategy of ARMS and simplify the construction of the proposals. Numerical results show that the new techniques outperform standard ARMS.
    Comment: Matlab code provided at http://a2rms.sourceforge.net
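    To make the second scheme concrete, here is a minimal Python sketch of an adaptive independent MH sampler that re-fits its proposal to all past samples excluding the current state. This is an illustrative simplification under my own assumptions (a single Gaussian proposal rather than the paper's piecewise construction; the function name and parameters are hypothetical), not the authors' algorithm:

```python
import numpy as np

def adaptive_independent_mh(log_target, n_iters, x0, seed=0):
    """Toy adaptive independent MH: the Gaussian proposal's mean and
    std are re-fit to all past samples *excluding* the current state,
    which is the key ingredient for convergence to the target."""
    rng = np.random.default_rng(seed)
    x, chain = x0, [x0]
    mu, sigma = x0, 1.0                       # initial proposal parameters
    for _ in range(1, n_iters):
        # Unnormalised log-density of the current proposal q
        log_q = lambda z: -0.5 * ((z - mu) / sigma) ** 2 - np.log(sigma)
        y = rng.normal(mu, sigma)             # independent proposal draw
        # Independent MH ratio: pi(y) q(x) / (pi(x) q(y))
        log_alpha = (log_target(y) + log_q(x)) - (log_target(x) + log_q(y))
        if np.log(rng.uniform()) < log_alpha:
            x = y
        chain.append(x)
        if len(chain) > 10:                   # adapt from history only,
            hist = np.array(chain[:-1])       # excluding the current state
            mu, sigma = hist.mean(), max(hist.std(), 1e-3)
    return np.array(chain)
```

    For example, `adaptive_independent_mh(lambda z: -0.5 * z**2, 5000, 3.0)` would target a standard normal from a poor starting point, with the proposal drifting toward the target as history accumulates.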

    Fast MCMC sampling algorithms on polytopes

    We propose and analyze two new MCMC sampling algorithms, the Vaidya walk and the John walk, for generating samples from the uniform distribution over a polytope. Both random walks are sampling algorithms derived from interior point methods. The former is based on the volumetric-logarithmic barrier introduced by Vaidya, whereas the latter uses John's ellipsoids. We show that the Vaidya walk mixes in significantly fewer steps than the logarithmic-barrier-based Dikin walk studied in past work. For a polytope in $\mathbb{R}^d$ defined by $n > d$ linear constraints, we show that the mixing time from a warm start is bounded as $\mathcal{O}(n^{0.5}d^{1.5})$, compared to the $\mathcal{O}(nd)$ mixing time bound for the Dikin walk. The cost of each step of the Vaidya walk is of the same order as that of the Dikin walk, and at most twice as large in terms of constant pre-factors. For the John walk, we prove an $\mathcal{O}(d^{2.5}\log^4(n/d))$ bound on its mixing time and conjecture that an improved variant of it could achieve a mixing time of $\mathcal{O}(d^2\,\mathrm{polylog}(n/d))$. Additionally, we propose variants of the Vaidya and John walks that mix in polynomial time from a deterministic starting point. The speed-up of the Vaidya walk over the Dikin walk is illustrated in numerical examples.
    Comment: 86 pages, 9 figures. First two authors contributed equally.
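    For readers unfamiliar with the Dikin walk baseline that the Vaidya walk improves on, here is a hedged Python sketch of one step of a Gaussian Dikin-style walk on the polytope {x : Ax <= b}. The function name, step size r, and Gaussian-proposal variant are my own illustrative choices, not the paper's exact algorithm; the Vaidya walk would replace the log-barrier Hessian below with a volumetric-barrier one:

```python
import numpy as np

def dikin_step(x, A, b, r=0.5, rng=None):
    """One Dikin-walk-style step: propose y ~ N(x, (r^2/d) H(x)^{-1})
    with log-barrier Hessian H(x) = A^T diag(1/s^2) A, s = b - Ax,
    and accept with the Metropolis ratio of the two proposal densities
    (the target is uniform, so only the proposals enter the ratio)."""
    rng = rng or np.random.default_rng()
    d = x.size
    c = r ** 2 / d                            # proposal covariance scale

    def hessian(z):
        s = b - A @ z                         # slacks, positive inside
        return A.T @ (A / s[:, None] ** 2)    # sum_i a_i a_i^T / s_i^2

    def log_q(H, diff):
        # log N(diff; 0, c H^{-1}) up to constants shared by both terms
        _, logdet = np.linalg.slogdet(H)
        return 0.5 * logdet - 0.5 * (diff @ H @ diff) / c

    Hx = hessian(x)
    L = np.linalg.cholesky(Hx)                # Hx = L L^T
    y = x + np.sqrt(c) * np.linalg.solve(L.T, rng.standard_normal(d))
    if np.any(A @ y >= b):                    # stepped outside: reject
        return x
    Hy = hessian(y)
    log_alpha = log_q(Hy, x - y) - log_q(Hx, y - x)
    return y if np.log(rng.uniform()) < log_alpha else x
```

    The local Hessian shrinks steps near the boundary (small slacks inflate H, so the proposal ellipsoid contracts), which is what keeps the walk inside the polytope without a fine discretization.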

    A statistical test for Nested Sampling algorithms

    Nested sampling is an iterative integration procedure that shrinks the prior volume towards higher likelihoods by removing one "live" point at a time. A replacement point is drawn uniformly from the prior above an ever-increasing likelihood threshold. Thus, the problem of drawing from the prior above a certain likelihood value arises naturally in nested sampling, making algorithms that solve this problem a key ingredient of the nested sampling framework. If the drawn points are distributed uniformly, the removal of a point shrinks the volume in a well-understood way and the integration of nested sampling is unbiased. In this work, I develop a statistical test to check whether this is the case. This "Shrinkage Test" is useful for verifying nested sampling algorithms in a controlled environment. I apply the shrinkage test to a test problem and show that some existing algorithms fail to pass it due to over-optimisation. I then demonstrate that a simple algorithm can be constructed which is robust against this type of problem. This RADFRIENDS algorithm is, however, inefficient in comparison to MULTINEST.
    Comment: 11 pages, 7 figures. Published in Statistics and Computing, Springer, September 201
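    The statistical fact behind such a test is standard: with n live points drawn uniformly, each shrinkage ratio t_i = V_i / V_{i-1} is the maximum of n uniforms, with density p(t) = n t^{n-1} on (0, 1), so t_i^n should be Uniform(0, 1). A minimal Python sketch under that assumption (the function name and the choice of a KS test are mine; the paper's exact statistic may differ):

```python
import numpy as np
from scipy.stats import kstest

def shrinkage_test(volumes, n_live):
    """Check whether measured shrinkage ratios match the uniform-draw
    prediction. `volumes` are the analytically known prior volumes
    V(L_i) above each successive likelihood threshold, so this only
    applies to controlled test problems where V(L) is computable."""
    v = np.asarray(volumes)
    t = v[1:] / v[:-1]            # per-iteration shrinkage ratios
    u = t ** n_live               # should be Uniform(0, 1) if unbiased
    return kstest(u, "uniform")   # small p-value: biased shrinkage
```

    An algorithm that over-optimises (draws replacement points from too small a region) produces systematically small t_i, which this test flags as a non-uniform u.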