
    Stability of Noisy Metropolis-Hastings

    Pseudo-marginal Markov chain Monte Carlo methods for sampling from intractable distributions have recently gained interest and have been theoretically studied in considerable depth. Their main appeal is that they are exact, in the sense that they target marginally the correct invariant distribution. However, the pseudo-marginal Markov chain can exhibit poor mixing and slow convergence towards its target. As an alternative, a subtly different Markov chain can be simulated, where better mixing is possible but the exactness property is sacrificed. This is the noisy algorithm, initially conceptualised as Monte Carlo within Metropolis (MCWM), which has also been studied, but to a lesser extent. The present article provides a further characterisation of the noisy algorithm, with a focus on fundamental stability properties like positive recurrence and geometric ergodicity. Sufficient conditions for inheriting geometric ergodicity from a standard Metropolis-Hastings chain are given, as well as convergence of the invariant distribution towards the true target distribution.
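
    As a minimal sketch of the distinction the abstract draws, the following contrasts the two chains on a toy one-dimensional problem. The noisy log-likelihood estimator, the Gaussian random-walk proposal and all constants are illustrative assumptions, not the paper's construction: the pseudo-marginal chain recycles the estimate at the current state, while the noisy (MCWM) chain refreshes both estimates at every iteration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def loglik_estimate(theta):
        """Hypothetical noisy log-likelihood estimate (a stand-in for a
        likelihood estimator such as a particle filter)."""
        return -0.5 * theta**2 + 0.1 * rng.standard_normal()

    def pseudo_marginal_mh(theta0, n_iters, step=1.0):
        """Pseudo-marginal chain: the estimate at the current state is
        recycled until a proposal is accepted, which preserves exactness."""
        theta, ll = theta0, loglik_estimate(theta0)
        chain = [theta]
        for _ in range(n_iters):
            prop = theta + step * rng.standard_normal()
            ll_prop = loglik_estimate(prop)
            if np.log(rng.uniform()) < ll_prop - ll:
                theta, ll = prop, ll_prop  # keep the accepted estimate
            chain.append(theta)
        return np.array(chain)

    def noisy_mh(theta0, n_iters, step=1.0):
        """Noisy chain (MCWM): both estimates are refreshed at every
        iteration, which can improve mixing but sacrifices exactness."""
        theta = theta0
        chain = [theta]
        for _ in range(n_iters):
            prop = theta + step * rng.standard_normal()
            if np.log(rng.uniform()) < loglik_estimate(prop) - loglik_estimate(theta):
                theta = prop
            chain.append(theta)
        return np.array(chain)
    ```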

    Stability and examples of some approximate MCMC algorithms.

    Approximate Monte Carlo algorithms are not uncommon these days; their appeal lies in the possibility of controlling the computational cost by introducing some noise or approximation into the method. We focus on the stability properties of one particular approximate MCMC algorithm, which we term noisy Metropolis-Hastings. Such properties have been studied before, in tandem with the pseudo-marginal algorithm, but under fairly strong assumptions. Here, we examine the noisy Metropolis-Hastings algorithm in more detail and explore possible corrective actions for reducing the introduced bias. In this respect, a novel approximate method is presented, motivated by the class of exact algorithms with randomised acceptance. We also discuss some applications and theoretical guarantees of this new approach.
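
    The paper's novel method is not spelled out in the abstract; for a flavour of what a randomised-acceptance correction looks like, here is a minimal sketch of the classical penalty-method rule (Ceperley & Dewing), under the assumption that the noise in the estimated log-acceptance ratio is Gaussian with known variance sigma2. Everything here is illustrative, not the paper's construction.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def penalised_accept(log_ratio_estimate, sigma2):
        """Penalty-method acceptance: if the estimated log-acceptance ratio
        is distributed as N(true ratio, sigma2), subtracting sigma2 / 2
        before the accept/reject step removes, on average, the bias the
        noise introduces."""
        return np.log(rng.uniform()) < log_ratio_estimate - 0.5 * sigma2
    ```

    Replacing the plain comparison against the noisy log-ratio by this penalised version is one concrete corrective action of the kind the abstract alludes to.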

    Noisy Monte Carlo: Convergence of Markov chains with approximate transition kernels

    Monte Carlo algorithms often aim to draw from a distribution $\pi$ by simulating a Markov chain with transition kernel $P$ such that $\pi$ is invariant under $P$. However, there are many situations for which it is impractical or impossible to draw from the transition kernel $P$. For instance, this is the case with massive datasets, where it is prohibitively expensive to calculate the likelihood, and also for intractable likelihood models arising from, for example, Gibbs random fields, such as those found in spatial statistics and network analysis. A natural approach in these cases is to replace $P$ by an approximation $\hat{P}$. Using theory from the stability of Markov chains, we explore a variety of situations where it is possible to quantify how 'close' the chain given by the transition kernel $\hat{P}$ is to the chain given by $P$. We apply these results to several examples from spatial statistics and network analysis.
    Comment: this version extends the results to non-uniformly ergodic Markov chains.
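
    To make the setting concrete, the sketch below builds an approximate kernel $\hat{P}$ by replacing a full-data log-likelihood with a rescaled subsample estimate, one of the massive-dataset situations the abstract mentions. The Gaussian mean model, flat prior, subsample size and step size are all illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    data = rng.standard_normal(100_000) + 1.0  # hypothetical massive dataset

    def full_loglik(mu):
        """Exact kernel P uses the full-data log-likelihood."""
        return -0.5 * np.sum((data - mu) ** 2)

    def subsampled_loglik(mu, m=1_000):
        """Approximate kernel P-hat: rescale a random subsample's log-likelihood."""
        sub = rng.choice(data, size=m, replace=False)
        return -0.5 * (len(data) / m) * np.sum((sub - mu) ** 2)

    def mh(loglik, mu0, n_iters, step=0.01):
        """Random-walk Metropolis driven by whichever log-likelihood is supplied."""
        mu = mu0
        chain = [mu]
        for _ in range(n_iters):
            prop = mu + step * rng.standard_normal()
            if np.log(rng.uniform()) < loglik(prop) - loglik(mu):
                mu = prop
            chain.append(mu)
        return np.array(chain)

    # chains driven by P (full data) and P-hat (subsample) can then be compared
    exact_chain = mh(full_loglik, 0.0, 2_000)
    approx_chain = mh(subsampled_loglik, 0.0, 2_000)
    ```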

    Perturbation theory for Markov chains via Wasserstein distance

    Perturbation theory for Markov chains addresses the question of how small differences in the transitions of Markov chains are reflected in differences between their distributions. We prove powerful and flexible bounds on the distance of the $n$th step distributions of two Markov chains when one of them satisfies a Wasserstein ergodicity condition. Our work is motivated by the recent interest in approximate Markov chain Monte Carlo (MCMC) methods in the analysis of big data sets. By using an approach based on Lyapunov functions, we provide estimates for geometrically ergodic Markov chains under weak assumptions. In an autoregressive model, our bounds cannot be improved in general. We illustrate our theory by showing quantitative estimates for approximate versions of two prominent MCMC algorithms, the Metropolis-Hastings and stochastic Langevin algorithms.
    Comment: 31 pages, accepted at Bernoulli Journal.
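
    For the autoregressive example the abstract mentions, the $n$-step distributions are available in closed form, so the distance between an AR(1) chain and a perturbed copy can be computed exactly. The coefficients below are illustrative, and the exact formula for the 2-Wasserstein distance between univariate Gaussians is used in place of the paper's bounds.

    ```python
    import numpy as np

    def ar1_nstep(a, x0, n, noise_var=1.0):
        """Mean and std of the n-step law of X_{k+1} = a X_k + eps,
        eps ~ N(0, noise_var), started at x0 (closed form, |a| < 1)."""
        mean = a**n * x0
        var = noise_var * (1 - a**(2 * n)) / (1 - a**2)
        return mean, np.sqrt(var)

    def w2_gaussians(m1, s1, m2, s2):
        """Exact 2-Wasserstein distance between two univariate Gaussians."""
        return np.sqrt((m1 - m2) ** 2 + (s1 - s2) ** 2)

    # distance between the n-step laws of an AR(1) chain and a perturbed copy
    for n in (1, 5, 25, 125):
        m1, s1 = ar1_nstep(0.9, x0=3.0, n=n)
        m2, s2 = ar1_nstep(0.905, x0=3.0, n=n)  # perturbed transition
        print(n, w2_gaussians(m1, s1, m2, s2))
    ```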

    Quasi-Newton particle Metropolis-Hastings

    Particle Metropolis-Hastings enables Bayesian parameter inference in general nonlinear state space models (SSMs). However, in many implementations a random walk proposal is used, and this can result in poor mixing if not tuned correctly using tedious pilot runs. Therefore, we consider a new proposal inspired by quasi-Newton algorithms that may achieve similar (or better) mixing with less tuning. An advantage compared with other Hessian-based proposals is that it only requires estimates of the gradient of the log-posterior. A possible application is parameter inference in the challenging class of SSMs with intractable likelihoods. We exemplify this application and the benefits of the new proposal by modelling log-returns of futures contracts on coffee by a stochastic volatility model with $\alpha$-stable observations.
    Comment: 23 pages, 5 figures. Accepted for the 17th IFAC Symposium on System Identification (SYSID), Beijing, China, October 2015.
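
    A schematic sketch of the idea, using only gradient information: recent (step, gradient-difference) pairs feed a BFGS-style inverse-Hessian estimate, which then preconditions a Langevin-like proposal. The helper names and the exact proposal form are assumptions for illustration, not the authors' precise construction.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def bfgs_inverse_hessian(s_list, y_list, d):
        """Inverse-Hessian estimate of the negative log-posterior built from
        recent (step s, gradient-difference y) pairs via BFGS updates; only
        gradient estimates are required, as in quasi-Newton optimisation."""
        H = np.eye(d)
        for s, y in zip(s_list, y_list):
            sy = s @ y
            if sy <= 1e-10:  # skip pairs violating the curvature condition
                continue
            rho = 1.0 / sy
            V = np.eye(d) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)  # stays positive definite
        return H

    def quasi_newton_proposal(x, grad_logpost, H, step=0.5):
        """Hypothetical Langevin-like proposal: H preconditions both the
        drift along the log-posterior gradient and the proposal covariance."""
        mean = x + 0.5 * step * H @ grad_logpost
        L = np.linalg.cholesky(step * H)
        return mean + L @ rng.standard_normal(len(x))
    ```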

    Geometric ergodicity of the Random Walk Metropolis with position-dependent proposal covariance

    We consider a Metropolis-Hastings method with proposal kernel $\mathcal{N}(x, hG^{-1}(x))$, where $x$ is the current state. After discussing specific cases from the literature, we analyse the ergodicity properties of the resulting Markov chains. In one dimension we find that a suitable choice of $G^{-1}(x)$ can change the ergodicity properties compared to the Random Walk Metropolis case $\mathcal{N}(x, h\Sigma)$, either for the better or for the worse. In higher dimensions we use a specific example to show that a judicious choice of $G^{-1}(x)$ can produce a chain which converges at a geometric rate to its limiting distribution when probability concentrates on an ever-narrower ridge as $|x|$ grows, something which is not true for the Random Walk Metropolis.
    Comment: 15 pages + appendices, 4 figures.
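
    A minimal sketch of the scheme described above: because the proposal covariance depends on the current state, the kernel is asymmetric and the Hastings correction $q(x \mid y)/q(y \mid x)$ must enter the acceptance ratio. The target and the choice of $G^{-1}(x)$ below are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.stats import multivariate_normal

    rng = np.random.default_rng(4)

    def mh_position_dependent(log_target, G_inv, x0, n_iters, h=1.0):
        """Metropolis-Hastings with proposal N(x, h * G_inv(x)); the
        state-dependent covariance makes the proposal asymmetric, so the
        forward and reverse proposal densities appear in the log-ratio."""
        x = np.atleast_1d(np.asarray(x0, dtype=float))
        chain = [x.copy()]
        for _ in range(n_iters):
            cov_x = h * G_inv(x)
            y = rng.multivariate_normal(x, cov_x)
            cov_y = h * G_inv(y)
            log_ratio = (log_target(y) - log_target(x)
                         + multivariate_normal.logpdf(x, mean=y, cov=cov_y)
                         - multivariate_normal.logpdf(y, mean=x, cov=cov_x))
            if np.log(rng.uniform()) < log_ratio:
                x = y
            chain.append(x.copy())
        return np.array(chain)

    # illustrative 1-D run: larger proposal steps in the tails of N(0, 1)
    chain = mh_position_dependent(
        lambda x: -0.5 * float(x @ x),                    # log-density of N(0, I)
        lambda x: np.eye(len(x)) * (1.0 + float(x @ x)),  # hypothetical G^{-1}(x)
        x0=[0.0], n_iters=5_000,
    )
    ```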