
Random-step Markov processes

Abstract

We explore two notions of stationary processes. The first is called a random-step Markov process, in which the stationary process of states, $(X_i)_{i \in \mathbb{Z}}$, has a stationary coupling with an independent process on the positive integers, $(L_i)_{i \in \mathbb{Z}}$, of `random look-back distances'. That is, $L_0$ is independent of the `past states', $(X_i, L_i)_{i<0}$, and for every positive integer $n$, the probability distribution on the `present', $X_0$, conditioned on the event $\{L_0 = n\}$ and on the past is the same as the probability distribution on $X_0$ conditioned on the `$n$-past', $(X_i)_{-n \leq i < 0}$, and $\{L_0 = n\}$. A random Markov process is a generalization of a Markov chain of order $n$ and has the property that the distribution on the present given the past can be uniformly approximated given the $n$-past, for $n$ sufficiently large. Processes with the latter property are called uniform martingales, a notion closely related to that of a `continuous $g$-function'. We show that every stationary process on a countable alphabet that is a uniform martingale and is dominated by a finite measure is also a random Markov process, and that the random variables $(L_i)_{i \in \mathbb{Z}}$ and the associated coupling can be chosen so that the distribution on the present given the $n$-past and the event $\{L_0 = n\}$ is `deterministic': all probabilities are in $\{0,1\}$. In the case of finite alphabets, those random-step Markov processes for which $L_0$ can be chosen with finite expected value are characterized. For stationary processes on an uncountable alphabet, a stronger condition is also considered which is sufficient to imply that a process is a random Markov process. In addition, a number of examples are given throughout to show the sharpness of the results.
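As a reading aid, the look-back property described above can be restated as a display equation; this is only a sketch in the abstract's notation, and the uniform martingale condition is formalized here using total variation distance, which is an assumption on our part rather than a statement from the paper:
$$
\mathcal{L}\big(X_0 \mid \{L_0 = n\},\ (X_i, L_i)_{i<0}\big)
\;=\;
\mathcal{L}\big(X_0 \mid \{L_0 = n\},\ (X_i)_{-n \le i < 0}\big)
\qquad \text{for every } n \ge 1,
$$
with $L_0$ independent of $(X_i, L_i)_{i<0}$. In the same spirit, the uniform martingale property would read
$$
\sup_{\text{pasts}} \Big\| \mathbb{P}\big(X_0 \in \cdot \mid (X_i)_{i<0}\big) - \mathbb{P}\big(X_0 \in \cdot \mid (X_i)_{-n \le i < 0}\big) \Big\| \;\longrightarrow\; 0 \quad \text{as } n \to \infty,
$$
where the supremum is taken over all pasts consistent with the process.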
