
    Subgeometric ergodicity of Markov chains

    The goal of this paper is to give a short and self-contained proof of general bounds for subgeometric rates of convergence under practical conditions. The main result, whose proof is based on coupling, provides an intuitive understanding of the results of Nummelin and Tuominen (1983) and Tuominen and Tweedie (1994). To obtain practical rates, a very general drift condition, recently introduced in Douc et al. (2004), is used.
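
    For orientation, the drift condition of Douc et al. (2004) referred to above is typically of the following form (a sketch of the standard statement; the symbols V, \phi, b and C are generic and not defined in the abstract itself):

        \[
          PV(x) \;\le\; V(x) - \phi\bigl(V(x)\bigr) + b\,\mathbf{1}_C(x), \qquad x \in \mathsf{X},
        \]

    where V \ge 1 is a drift function, \phi is a concave nondecreasing rate function, b < \infty and C is a small set; subgeometric rates of convergence are then expressed in terms of \phi and its inverse.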

    Exponential convergence rate of ruin probabilities for level-dependent Lévy-driven risk processes

    We explicitly find the rate of exponential long-term convergence for the ruin probability in a level-dependent Lévy-driven risk model, as time goes to infinity. Siegmund duality allows us to reduce the problem to the long-term convergence of a reflected jump-diffusion to its stationary distribution, which is handled via Lyapunov functions.
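
    As background, Siegmund duality (not spelled out in the abstract; the notation here is generic) links a process X and its dual Y via

        \[
          \mathbb{P}_x\bigl(X_t \ge y\bigr) \;=\; \mathbb{P}_y\bigl(Y_t \le x\bigr) \qquad \text{for all } x, y \ge 0,\ t \ge 0,
        \]

    so that the ruin probability of the risk process can be read off from the distribution of the dual reflected process; the convergence of that reflected process to stationarity is what the Lyapunov-function argument controls.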

    Renewal theory and computable convergence rates for geometrically ergodic Markov chains

    We give computable bounds on the rate of convergence of the transition probabilities to the stationary distribution for a certain class of geometrically ergodic Markov chains. Our results are different from earlier estimates of Meyn and Tweedie, and from estimates using coupling, although we start from essentially the same assumptions of a drift condition toward a "small set". The estimates show a noticeable improvement on existing results if the Markov chain is reversible with respect to its stationary distribution, and especially so if the chain is also positive. The method of proof uses the first-entrance-last-exit decomposition, together with new quantitative versions of a result of Kendall from discrete renewal theory.
    Published at http://dx.doi.org/10.1214/105051604000000710 in the Annals of Applied Probability (http://www.imstat.org/aap/) by the Institute of Mathematical Statistics (http://www.imstat.org).
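
    For context, a drift condition toward a small set C of the kind assumed above is usually of the form (a standard sketch; V, \lambda, b, \varepsilon and \nu are generic symbols, not taken from the paper):

        \[
          PV(x) \;\le\; \lambda V(x) + b\,\mathbf{1}_C(x), \qquad 0 < \lambda < 1,\ b < \infty,
        \]

    together with a minorization condition \( P(x, \cdot) \ge \varepsilon\,\nu(\cdot) \) for all \( x \in C \); computable bounds on the geometric rate of convergence are then expressed in terms of \lambda, b and \varepsilon.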

    Strong Stationary Duality for Möbius Monotone Markov Chains: Unreliable Networks

    For Markov chains with a partially ordered finite state space we show strong stationary duality under the condition of Möbius monotonicity of the chain. We relate Möbius monotonicity to other definitions of monotone chains. We give examples of dual chains in this context which have transitions only upwards. We illustrate the general theory by an analysis of nonsymmetric random walks on the cube, with an application to networks of queues.
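
    As a reminder of the objects behind the monotonicity condition (standard poset notation, not taken from the abstract): on a finite partially ordered state space the zeta function and its inverse in the incidence algebra, the Möbius function, satisfy

        \[
          \zeta(x, y) = \mathbf{1}\{x \le y\}, \qquad \sum_{z \,:\, x \le z \le y} \mu(x, z) = \mathbf{1}\{x = y\},
        \]

    and, roughly speaking, Möbius monotonicity of a transition matrix P asks that P, transformed by the zeta and Möbius matrices of the order, has nonnegative entries.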

    Speeding up the FMMR perfect sampling algorithm: A case study revisited

    In a previous paper by the second author, two Markov chain Monte Carlo perfect sampling algorithms -- one called coupling from the past (CFTP) and the other (FMMR) based on rejection sampling -- are compared using as a case study the move-to-front (MTF) self-organizing list chain. Here we revisit that case study and, in particular, exploit the dependence of FMMR on the user-chosen initial state. We give a stochastic monotonicity result for the running time of FMMR applied to MTF and thus identify the initial state that gives the stochastically smallest running time; by contrast, the initial state used in the previous study gives the stochastically largest running time. By changing from the worst to the best choice of initial state we achieve a remarkable speedup of FMMR for MTF; for example, we reduce the running time (as measured in Markov chain steps) from exponential in the length n of the list nearly down to n when the items in the list are requested according to a geometric distribution. For this same example, the running time for CFTP grows exponentially in n.
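
    To make the case study concrete, here is a minimal forward simulation of the move-to-front chain under a geometric request distribution. It is only an illustrative sketch (the function and variable names are hypothetical), not the FMMR or CFTP perfect-sampling algorithms compared in the paper.

        import random

        def mtf_step(state, weights):
            """One move-to-front update: request an item and move it to the front of the list."""
            item = random.choices(state, weights=[weights[x] for x in state])[0]
            return [item] + [x for x in state if x != item]

        # Hypothetical setup: n items, item i requested with (unnormalized) geometric weight 0.5**i.
        n = 6
        items = list(range(n))
        weights = {i: 0.5 ** i for i in items}
        state = list(reversed(items))       # deliberately unfavourable initial order
        for _ in range(10_000):             # plain forward simulation, not perfect sampling
            state = mtf_step(state, weights)
        print(state)                        # frequently requested items tend to sit near the front

    The simulation only shows the chain's forward dynamics; the point of the abstract is the choice of initial state fed to the perfect-sampling algorithm, which the sketch does not implement.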