
    Perturbation theory for Markov chains via Wasserstein distance

    Perturbation theory for Markov chains addresses the question of how small differences in the transitions of Markov chains are reflected in differences between their distributions. We prove powerful and flexible bounds on the distance of the n-th step distributions of two Markov chains when one of them satisfies a Wasserstein ergodicity condition. Our work is motivated by the recent interest in approximate Markov chain Monte Carlo (MCMC) methods in the analysis of big data sets. By using an approach based on Lyapunov functions, we provide estimates for geometrically ergodic Markov chains under weak assumptions. In an autoregressive model, our bounds cannot be improved in general. We illustrate our theory by showing quantitative estimates for approximate versions of two prominent MCMC algorithms, the Metropolis-Hastings and stochastic Langevin algorithms.

    Comment: 31 pages, accepted at Bernoulli Journal
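The autoregressive setting mentioned in the abstract can be illustrated numerically. The sketch below (an assumption for illustration, not code from the paper) simulates two AR(1) chains whose drifts differ by a small perturbation, estimates the 1-Wasserstein distance between their n-step marginals via the quantile coupling, and compares it with the closed-form distance for this shift perturbation, |b1 - b2| * (1 - a^n) / (1 - a), which stays bounded in n because the chain is contractive (|a| < 1).

```python
import numpy as np

rng = np.random.default_rng(0)

a = 0.5             # contraction factor of the AR(1) chain
b1, b2 = 0.0, 0.1   # drift of the exact chain vs. its perturbation
n_steps = 20
n_samples = 100_000

def simulate(drift, rng):
    """Run n_samples independent copies of X_{k+1} = a X_k + drift + noise."""
    x = np.zeros(n_samples)
    for _ in range(n_steps):
        x = a * x + drift + rng.standard_normal(n_samples)
    return x

x = simulate(b1, rng)
y = simulate(b2, rng)

# In one dimension, the 1-Wasserstein distance between empirical measures
# is the mean absolute difference of the sorted samples (quantile coupling).
w1_empirical = np.mean(np.abs(np.sort(x) - np.sort(y)))

# For a pure drift perturbation the n-step distance is exactly
# |b1 - b2| * (1 - a**n) / (1 - a), uniformly bounded in n.
w1_exact = abs(b1 - b2) * (1 - a**n_steps) / (1 - a)

print(f"empirical W1: {w1_empirical:.4f}, exact: {w1_exact:.4f}")
```

The uniform-in-n bound is the point of the theory: the perturbation error does not accumulate over steps, because the Wasserstein contraction of the unperturbed chain damps past discrepancies geometrically.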