We explore two notions of stationary processes. The first is called a
random-step Markov process, in which the stationary process of states,
$(X_i)_{i\in\mathbb{Z}}$, has a stationary coupling with an independent process on the
positive integers, $(L_i)_{i\in\mathbb{Z}}$, of `random look-back distances'.
That is,
$L_0$ is independent of the `past states' $(X_i,L_i)_{i<0}$, and for every
positive integer $n$, the probability distribution on the `present', $X_0$,
conditioned on the event $\{L_0=n\}$ and on the past is the same as the
probability distribution on $X_0$ conditioned on the `$n$-past' $(X_i)_{-n\le i<0}$ and $\{L_0=n\}$.
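Sketched in symbols (with $\Pr$ denoting the law of the coupled process and conditioning understood in the almost-sure sense), this defining condition reads
\[
\Pr\bigl(X_0 \in \cdot \mid \{L_0 = n\},\, (X_i,L_i)_{i<0}\bigr)
\;=\;
\Pr\bigl(X_0 \in \cdot \mid \{L_0 = n\},\, (X_i)_{-n\le i<0}\bigr),
\qquad n \ge 1.
\]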
A random Markov process is a generalization of a
Markov chain of order $n$ and has the property that the distribution on the
present given the past can be uniformly approximated given the $n$-past, for
$n$ sufficiently large. Processes with the latter property are called uniform
martingales, a notion closely related to that of a `continuous $g$-function'.
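For a countable alphabet, one standard way to sketch this uniform approximation property (the paper's precise formulation, in particular for uncountable alphabets, may handle null sets and the supremum differently) is
\[
\sup_{x}\,\bigl|\Pr\bigl(X_0 = x \mid (X_i)_{i<0}\bigr) - \Pr\bigl(X_0 = x \mid (X_i)_{-n\le i<0}\bigr)\bigr| \;\longrightarrow\; 0
\quad \text{as } n\to\infty,
\]
uniformly over pasts.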
We show that every stationary process on a countable alphabet that is a
uniform martingale and is dominated by a finite measure is also a random Markov
process, and that the random variables $(L_i)_{i\in\mathbb{Z}}$ and the associated
coupling can be chosen so that the distribution on the present given the
$n$-past and the event $\{L_0=n\}$ is `deterministic': all probabilities are
in $\{0,1\}$. In the case of finite alphabets, those random-step Markov
processes for which $L_0$ can be chosen with finite expected value are
characterized.
For stationary processes on an uncountable alphabet, a stronger
condition is also considered which is sufficient to imply that a process is a
random Markov process. In addition, a number of examples are given throughout
to show the sharpness of the results.