
    The ergodic decomposition of asymptotically mean stationary random sources

    It is demonstrated how to represent asymptotically mean stationary (AMS) random sources with values in standard spaces as mixtures of ergodic AMS sources. This is an extension of the well-known decomposition of stationary sources, which has facilitated the generalization of prominent source coding theorems to arbitrary, not necessarily ergodic, stationary sources. Asymptotic mean stationarity generalizes the definition of stationarity and covers a much larger variety of real-world examples of random sources of practical interest. It is sketched how to obtain source coding and related theorems for arbitrary, not necessarily ergodic, AMS sources, based on the presented ergodic decomposition.
    Comment: Submitted to IEEE Transactions on Information Theory, Apr. 200
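    For orientation, here is a schematic of the two standard notions the abstract combines; the formulas follow the usual definitions rather than the paper's exact notation.

```latex
% A source with distribution P and shift T is asymptotically mean
% stationary (AMS) if the Cesaro limit below exists for every event A;
% the limiting measure \bar{P} is called the stationary mean.
\bar{P}(A) = \lim_{n\to\infty} \frac{1}{n} \sum_{k=0}^{n-1} P(T^{-k}A)

% The ergodic decomposition represents an AMS source as a mixture of
% ergodic AMS components P_\omega over a mixing measure \pi:
P = \int P_\omega \, d\pi(\omega)
```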

    Estimation of the Rate-Distortion Function

    Motivated by questions in lossy data compression and by theoretical considerations, we examine the problem of estimating the rate-distortion function of an unknown (not necessarily discrete-valued) source from empirical data. Our focus is the behavior of the so-called "plug-in" estimator, which is simply the rate-distortion function of the empirical distribution of the observed data. Sufficient conditions are given for its consistency, and examples are provided to demonstrate that in certain cases it fails to converge to the true rate-distortion function. The analysis of its performance is complicated by the fact that the rate-distortion function is not continuous in the source distribution; the underlying mathematical problem is closely related to the classical problem of establishing the consistency of maximum likelihood estimators. General consistency results are given for the plug-in estimator applied to a broad class of sources, including all stationary and ergodic ones. A more general class of estimation problems is also considered, arising in the context of lossy data compression when the allowed class of coding distributions is restricted; analogous results are developed for the plug-in estimator in that case. Finally, consistency theorems are formulated for modified (e.g., penalized) versions of the plug-in estimator, and for estimating the optimal reproduction distribution.
    Comment: 18 pages, no figures [v2: removed an example with an error; corrected typos; a shortened version will appear in IEEE Trans. Inform. Theory]
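    The plug-in estimator is concrete enough to sketch for a finite alphabet: form the empirical distribution of the sample and evaluate its rate-distortion function, here via the standard Blahut-Arimoto iteration. This is a minimal sketch under those assumptions; the function names and the choice of Blahut-Arimoto are illustrative, not taken from the paper.

```python
import numpy as np

def rd_blahut_arimoto(p, d, s, n_iter=500):
    """One point (D, R) on the rate-distortion curve of a discrete
    source via Blahut-Arimoto; s < 0 is the slope parameter."""
    nxhat = d.shape[1]
    q = np.full(nxhat, 1.0 / nxhat)           # reproduction marginal
    for _ in range(n_iter):
        A = q * np.exp(s * d)                 # unnormalized test channel
        Q = A / A.sum(axis=1, keepdims=True)  # Q[x, xhat] = Q(xhat | x)
        q = p @ Q                             # re-optimized marginal
    D = np.sum(p[:, None] * Q * d)            # expected distortion
    ratio = np.where(Q > 0, Q / q, 1.0)       # avoid log(0) terms
    R = np.sum(p[:, None] * Q * np.log2(ratio))  # mutual information (bits)
    return D, R

def plug_in_rd(samples, alphabet, d, s):
    """Plug-in estimate: the rate-distortion function of the empirical
    distribution of the observed data."""
    counts = np.array([np.sum(samples == a) for a in alphabet], float)
    return rd_blahut_arimoto(counts / counts.sum(), d, s)

# Hypothetical usage: Bernoulli(0.3) data under Hamming distortion.
rng = np.random.default_rng(0)
samples = rng.binomial(1, 0.3, size=1000)
D, R = plug_in_rd(samples, [0, 1], np.array([[0., 1.], [1., 0.]]), s=-3.0)
```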

    Variable-Length Coding of Two-Sided Asymptotically Mean Stationary Measures

    We collect several observations that concern variable-length coding of two-sided infinite sequences in a probabilistic setting. Attention is paid to images and preimages of asymptotically mean stationary measures defined on subsets of these sequences. We point out sufficient conditions under which the variable-length coding and its inverse preserve asymptotic mean stationarity. Moreover, conditions for preservation of shift-invariant $\sigma$-fields and the finite-energy property are discussed, and the block entropies for stationary means of coded processes are related in some cases. Subsequently, we apply certain of these results to construct a stationary nonergodic process with a desired linguistic interpretation.
    Comment: 20 pages. A few typos corrected after the journal publication
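    As a minimal, finite illustration of symbol-wise variable-length coding and its inverse (the paper works with two-sided infinite sequences, which this one-sided sketch only hints at; the code table below is a made-up example):

```python
def encode(code, symbols):
    """Extend a symbol-wise variable-length code to sequences by
    concatenating codewords."""
    return [b for s in symbols for b in code[s]]

def decode(code, bits):
    """Invert the coding, assuming the code is prefix-free: scan the
    bit stream and emit a symbol whenever a codeword completes."""
    inv = {tuple(w): s for s, w in code.items()}
    out, buf = [], []
    for b in bits:
        buf.append(b)
        if tuple(buf) in inv:
            out.append(inv[tuple(buf)])
            buf = []
    return out

# Example prefix-free code over a three-letter alphabet.
code = {"a": [0], "b": [1, 0], "c": [1, 1]}
assert decode(code, encode(code, list("abcab"))) == list("abcab")
```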

    Characterization of Information Channels for Asymptotic Mean Stationarity and Stochastic Stability of Non-stationary/Unstable Linear Systems

    Stabilization of non-stationary linear systems over noisy communication channels is considered. Stochastically stable sources, and unstable but noise-free or bounded-noise systems, have been extensively studied in the information theory and control theory literature since the 1970s, with a renewed interest in the past decade. There have also been studies on non-causal and causal coding of unstable/non-stationary linear Gaussian sources. In this paper, tight necessary and sufficient conditions for stochastic stabilizability of unstable (non-stationary), possibly multi-dimensional, linear systems driven by Gaussian noise over discrete channels (possibly with memory and feedback) are presented. Stochastic stability notions include recurrence, asymptotic mean stationarity and sample path ergodicity, and the existence of finite second moments. Our constructive proof uses random-time state-dependent stochastic drift criteria for stabilization of Markov chains. For asymptotic mean stationarity (and thus sample path ergodicity), it is sufficient that the capacity of a channel is (strictly) greater than the sum of the logarithms of the unstable pole magnitudes, for memoryless channels and a class of channels with memory. This condition is also necessary under a mild technical condition. Sufficient conditions for the existence of finite average second moments for such systems driven by unbounded noise are provided.
    Comment: To appear in IEEE Transactions on Information Theory
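    The headline condition admits a compact schematic statement (paraphrasing the abstract; the notation is ours):

```latex
% For a linear system with open-loop eigenvalues \lambda_i, asymptotic
% mean stationarity (and thus sample path ergodicity) is achievable
% over a channel of capacity C, for memoryless channels and a class of
% channels with memory, when
C > \sum_{i\,:\,|\lambda_i| > 1} \log_2 |\lambda_i|.

% Scalar illustration: stabilizing x_{t+1} = a x_t + u_t + w_t with
% |a| > 1 requires capacity exceeding \log_2 |a| bits per channel use.
```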

    On the Vocabulary of Grammar-Based Codes and the Logical Consistency of Texts

    The article presents a new interpretation of Zipf-Mandelbrot's law in natural language which rests on two areas of information theory. Firstly, we construct a new class of grammar-based codes and, secondly, we investigate properties of strongly nonergodic stationary processes. The motivation for the joint discussion is to prove a proposition with a simple informal statement: if a text of length $n$ describes $n^\beta$ independent facts in a repetitive way, then the text contains at least $n^\beta/\log n$ different words, under suitable conditions on $n$. In the formal statement, two modeling postulates are adopted. Firstly, the words are understood as nonterminal symbols of the shortest grammar-based encoding of the text. Secondly, the text is assumed to be emitted by a finite-energy strongly nonergodic source, whereas the facts are binary IID variables predictable in a shift-invariant way.
    Comment: 24 pages, no figures
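    For reference, the law being reinterpreted and the shape of the proposition (schematic restatements, with constants and side conditions suppressed):

```latex
% Zipf--Mandelbrot law: the frequency f(r) of the r-th most frequent
% word in a text decays as a shifted power law,
f(r) \propto (r + b)^{-a}.

% Proposition (informal shape): if a text of length n repetitively
% describes n^\beta independent facts, its vocabulary V(n), read off as
% the nonterminals of the shortest grammar-based encoding, satisfies
V(n) \gtrsim \frac{n^\beta}{\log n}.
```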

    On analytic properties of entropy rate

    The entropy rate of a discrete random source is a real-valued functional on the space of probability measures associated with such sources. If one equips this space with a topology, one can ask about the analytic properties of the entropy rate. A natural choice is the topology induced by the norm of total variation. A central result is that the entropy rate is Lipschitz continuous relative to this topology. The consequences are manifold. First, corollaries are obtained that refer to prevalent objects of probability theory. Second, the result is extended to the entropy rate of dynamical systems. Third, it is shown how to exploit the proof schemes to give a direct and elementary proof for the existence of the entropy rate of asymptotically mean stationary random sources.
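    In symbols, the objects involved and the central result read roughly as follows (conventions for the total variation norm differ by a factor of two across the literature, and the constant is schematic):

```latex
% Entropy rate of a source (X_n) with distribution \mu:
h(\mu) = \lim_{n\to\infty} \frac{1}{n} H(X_1, \dots, X_n)

% Total variation distance between source distributions:
\|\mu - \nu\|_{TV} = \sup_{A} |\mu(A) - \nu(A)|

% Central result, schematically: for some constant L,
|h(\mu) - h(\nu)| \le L \, \|\mu - \nu\|_{TV}
```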

    Mixing, Ergodic, and Nonergodic Processes with Rapidly Growing Information between Blocks

    We construct mixing processes over an infinite alphabet and ergodic processes over a finite alphabet for which the Shannon mutual information between adjacent blocks of length $n$ grows as $n^\beta$, where $\beta\in(0,1)$. The processes are a modification of the nonergodic Santa Fe processes, which were introduced in the context of natural language modeling. The rates of mutual information for the latter processes are similar and are also established in this paper. As an auxiliary result, it is shown that infinite direct products of mixing processes are also mixing.
    Comment: 21 pages
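    A minimal sketch of the (nonergodic) Santa Fe construction that the abstract modifies: a fixed pool of random binary "facts" is drawn once, and each step reports a power-law-distributed fact index together with that fact's value. The truncation of the fact pool and all parameter names are our assumptions.

```python
import numpy as np

def santa_fe(n, beta, n_facts=10**5, seed=0):
    """Sample X_t = (K_t, Z_{K_t}), t = 1..n: the K_t are IID with
    P(K = k) proportional to k**(-1/beta), and the bits Z_k are fair
    coin flips drawn once and reused forever (hence nonergodicity)."""
    rng = np.random.default_rng(seed)
    ks = np.arange(1, n_facts + 1)
    w = ks ** (-1.0 / beta)                 # summable since 1/beta > 1
    z = rng.integers(0, 2, size=n_facts)    # the immutable "facts"
    k = rng.choice(ks, size=n, p=w / w.sum())
    return list(zip(k.tolist(), z[k - 1].tolist()))

# Blocks of such a process share information about the fixed bits z,
# which is what drives the n**beta growth of block mutual information.
sample = santa_fe(n=20, beta=0.5)
```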