14 research outputs found

    Random-step Markov processes

    We explore two notions of stationary processes. The first is called a random-step Markov process, in which the stationary process of states $(X_i)_{i \in \mathbb{Z}}$ has a stationary coupling with an independent process on the positive integers, $(L_i)_{i \in \mathbb{Z}}$, of 'random look-back distances'. That is, $L_0$ is independent of the 'past states' $(X_i, L_i)_{i<0}$, and for every positive integer $n$, the probability distribution on the 'present' $X_0$, conditioned on the event $\{L_0 = n\}$ and on the past, is the same as the probability distribution on $X_0$ conditioned on the '$n$-past' $(X_i)_{-n \le i < 0}$ and $\{L_0 = n\}$. A random Markov process is a generalization of a Markov chain of order $n$ and has the property that the distribution on the present given the past can be uniformly approximated given the $n$-past, for $n$ sufficiently large. Processes with the latter property are called uniform martingales, closely related to the notion of a 'continuous $g$-function'. We show that every stationary process on a countable alphabet that is a uniform martingale and is dominated by a finite measure is also a random Markov process, and that the random variables $(L_i)_{i \in \mathbb{Z}}$ and the associated coupling can be chosen so that the distribution on the present given the $n$-past and the event $\{L_0 = n\}$ is 'deterministic': all probabilities are in $\{0,1\}$. In the case of finite alphabets, those random-step Markov processes for which $L_0$ can be chosen with finite expected value are characterized. For stationary processes on an uncountable alphabet, a stronger condition is also considered which is sufficient to imply that a process is a random Markov process. In addition, a number of examples are given throughout to show the sharpness of the results. Comment: 31 pages.
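    For concreteness, the defining property of a random-step Markov process described above can be written as a display. The symbol $A$ below stands for an arbitrary measurable set of states; it is not part of the abstract's notation and is introduced here only to state the condition:

    \[
    L_0 \ \text{is independent of}\ (X_i, L_i)_{i<0}, \qquad
    \mathbb{P}\bigl(X_0 \in A \mid L_0 = n,\ (X_i, L_i)_{i<0}\bigr)
    = \mathbb{P}\bigl(X_0 \in A \mid L_0 = n,\ (X_i)_{-n \le i < 0}\bigr)
    \quad \text{for every } n \ge 1.
    \]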

    The Linus Sequence

    Define the Linus sequence $L_n$ for $n \ge 1$ as a 0-1 sequence with $L_1 = 0$, and $L_n$ chosen so as to minimize the length $r$ of the longest immediately repeated block $L_{n-2r+1} \cdots L_{n-r} = L_{n-r+1} \cdots L_n$. Define the Sally sequence $S_n$ as the length $r$ of the longest repeated block that was avoided by the choice of $L_n$. We prove several results about these sequences, such as exponential decay of the frequency of highly periodic subwords of the Linus sequence, zero entropy of any stationary process obtained as a limit of word frequencies in the Linus sequence, and infinite average value of the Sally sequence. In addition we make a number of conjectures about both sequences.
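    The minimization rule above is directly computable. The following Python sketch (not the authors' code; the function name, list representation, and tie-handling branch are choices made here for illustration) generates initial segments of the Linus and Sally sequences straight from the definition:

    def linus_and_sally(n_terms):
        """Generate the first n_terms of the Linus sequence L and the Sally sequence S.

        Illustrative sketch only.  L[k] holds L_{k+1}, i.e. indices are shifted by one.
        """
        L = [0]          # L_1 = 0 by definition
        S = [0]          # S_1 is not defined by the abstract; 0 is a placeholder
        for n in range(2, n_terms + 1):
            def longest_repeat(bit):
                # Largest r >= 1 such that appending `bit` makes the word end in an
                # immediately repeated block: L_{n-2r+1}..L_{n-r} = L_{n-r+1}..L_n.
                w = L + [bit]
                best = 0
                for r in range(1, n // 2 + 1):
                    if w[-2 * r:-r] == w[-r:]:
                        best = r
                return best
            r0, r1 = longest_repeat(0), longest_repeat(1)
            if r0 < r1:
                L.append(0)
                S.append(r1)   # length of the repetition avoided by choosing 0
            else:
                L.append(1)
                S.append(r0)   # length of the repetition avoided by choosing 1
        return L, S

    # First terms: L = 0, 1, 0, 0, 1, ...; S records the avoided repetition lengths.
    print(linus_and_sally(20))

    One small observation, assumed here rather than quoted from the paper: for $n \ge 2$ the two candidate repetition lengths cannot be equal, since a length-$r$ repeat ending in bit $b$ forces $b = L_{n-r}$; so the minimizer is unique and the tie-handling in the else-branch is never actually exercised.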

    Non-intersecting splitting σ

    No full text

    An Outline of Ergodic Theory

    No full text
    An engaging introduction to ergodic theory for graduate students, and a useful reference for the professional mathematician.

    Twofold mixing implies threefold mixing for rank one transformations

    No full text

    Nonuniqueness in g-functions

    No full text