
    Prediction, Retrodiction, and The Amount of Information Stored in the Present

    We introduce an ambidextrous view of stochastic dynamical systems, comparing their forward-time and reverse-time representations and then integrating them into a single time-symmetric representation. The perspective is useful theoretically, computationally, and conceptually. Mathematically, we prove that the excess entropy--a familiar measure of organization in complex systems--is the mutual information not only between the past and future, but also between the predictive and retrodictive causal states. Practically, we exploit the connection between prediction and retrodiction to calculate the excess entropy directly. Conceptually, these results lead one to discover new system invariants for stochastic dynamical systems: crypticity (information accessibility) and causal irreversibility. Ultimately, we introduce a time-symmetric representation that unifies all these quantities, compressing the two directional representations into one. The resulting compression offers a new conception of the amount of information stored in the present.
    Comment: 17 pages, 7 figures, 1 table; http://users.cse.ucdavis.edu/~cmg/compmech/pubs/pratisp.ht
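    The abstract's central identity can be stated compactly. In the notation of the computational mechanics literature (assumed here), S^+ and S^- denote the predictive and retrodictive causal states:
```latex
% The two equivalent forms of the excess entropy E claimed above:
% past-future mutual information, and mutual information between the
% predictive (S^+) and retrodictive (S^-) causal states.
E = I[X_{-\infty:0};\, X_{0:\infty}] = I[\mathcal{S}^{+};\, \mathcal{S}^{-}]
```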

    Information Accessibility and Cryptic Processes: Linear Combinations of Causal States

    We show in detail how to determine the time-reversed representation of a stationary hidden stochastic process from linear combinations of its forward-time ε-machine causal states. This also gives a check for the k-cryptic expansion recently introduced to explore the temporal range over which internal state information is spread.
    Comment: 6 pages, 9 figures, 2 tables; http://users.cse.ucdavis.edu/~cmg/compmech/pubs/iacplcocs.ht
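    The paper's construction builds the reverse-time representation from linear combinations of forward causal states. As a minimal sketch of the standard ingredient underlying it, the snippet below time-reverses a stationary symbol-labeled Markov chain via Bayes' rule; the function names and the Golden Mean example are illustrative assumptions, not the paper's code.
```python
# Sketch (assumed names, illustrative example): time-reversing a stationary
# hidden process given its symbol-labeled transition matrices Ts[x], where
# Ts[x][i, j] = Pr(emit x and move i -> j).
import numpy as np

def stationary_distribution(T):
    """Left eigenvector of the row-stochastic matrix T for eigenvalue 1."""
    vals, vecs = np.linalg.eig(T.T)
    pi = np.real(vecs[:, np.argmax(np.real(vals))])
    return pi / pi.sum()

def reverse_machine(Ts):
    """Bayes' rule reversal: T_rev[x][i, j] = pi[j] * Ts[x][j, i] / pi[i]."""
    pi = stationary_distribution(sum(Ts.values()))
    return {x: (pi[None, :] * Tx.T) / pi[:, None] for x, Tx in Ts.items()}

# Golden Mean process (no two consecutive 1s) as a worked example.
Ts = {0: np.array([[0.5, 0.0], [1.0, 0.0]]),   # emit 0: A->A w.p. 1/2, B->A w.p. 1
      1: np.array([[0.0, 0.5], [0.0, 0.0]])}   # emit 1: A->B w.p. 1/2
for x, Tx in reverse_machine(Ts).items():
    print(x, Tx)
```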

    Understanding interdependency through complex information sharing

    The interactions between three or more random variables are often nontrivial, poorly understood, and yet paramount for future advances in fields such as network information theory, neuroscience, genetics, and many others. In this work, we propose to analyze these interactions as different modes of information sharing. Towards this end, we introduce a novel axiomatic framework for decomposing the joint entropy, which characterizes the various ways in which random variables can share information. The key contribution of our framework is to distinguish between interdependencies where the information is shared redundantly and synergistic interdependencies where the sharing structure exists in the whole but not between the parts. We show that our axioms determine unique formulas for all the terms of the proposed decomposition for a number of cases of interest. Moreover, we show how these results can be applied to several network information theory problems, providing a more intuitive understanding of their fundamental limits.
    Comment: 39 pages, 4 figures
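    A concrete instance of the synergy the abstract refers to, independent of the proposed decomposition: for Y = X1 XOR X2 with uniform inputs, neither input alone tells us anything about Y, yet together they determine it completely. A self-contained check (helper names are ours):
```python
# Sketch (helper names are ours): synergy in Y = X1 XOR X2 with uniform
# inputs; each lone input has zero mutual information with Y, while the
# pair carries one full bit.
import itertools, math

def entropy(p):
    """Shannon entropy (bits) of a dict of probabilities."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def marginal(p, idx):
    """Marginalize the joint dict p onto the coordinates in idx."""
    out = {}
    for k, v in p.items():
        key = tuple(k[i] for i in idx)
        out[key] = out.get(key, 0.0) + v
    return out

# Joint distribution p(x1, x2, y) for the XOR gate with uniform inputs.
p = {(x1, x2, x1 ^ x2): 0.25 for x1, x2 in itertools.product([0, 1], repeat=2)}

# I(X1;Y) = H(X1) + H(Y) - H(X1,Y); I(X1,X2;Y) = H(X1,X2) + H(Y) - H(X1,X2,Y).
print(entropy(marginal(p, [0])) + entropy(marginal(p, [2])) - entropy(marginal(p, [0, 2])))  # 0.0
print(entropy(marginal(p, [0, 1])) + entropy(marginal(p, [2])) - entropy(p))                 # 1.0
```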

    The Past and the Future in the Present

    We show how the shared information between the past and future--the excess entropy--derives from the components of directional information stored in the present--the predictive and retrodictive causal states. A detailed proof allows us to highlight a number of the subtle problems in estimation and analysis that impede accurate calculation of the excess entropy.
    Comment: 7 pages, 1 figure; http://cse.ucdavis.edu/~cmg/compmech/pubs/pafip.ht
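    One source of the estimation problems mentioned above is the naive block-entropy route, E = lim_{L->inf} [H(L) - h_mu * L], whose finite-data bias grows quickly with block length. A hedged sketch of that estimator (illustrative names; Golden Mean sample data):
```python
# Sketch (assumed names): estimate the excess entropy from empirical block
# entropies, E(L) = H(L) - L * h_mu(L), with h_mu(L) = H(L) - H(L-1).
# Finite-sample bias in H(L) is exactly the kind of pitfall the paper discusses.
import math, random
from collections import Counter

def block_entropy(seq, L):
    counts = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    n = sum(counts.values())
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def excess_entropy_estimate(seq, L):
    H_L, H_prev = block_entropy(seq, L), block_entropy(seq, L - 1)
    h_mu = H_L - H_prev              # entropy-rate estimate at length L
    return H_L - L * h_mu            # E(L) = H(L) - L * h_mu

# Sample the Golden Mean process (no two consecutive 1s); its true excess
# entropy is log2(3) - 4/3, roughly 0.2516 bits.
random.seed(0)
seq, last = [], 0
for _ in range(100_000):
    x = 1 if (last == 0 and random.random() < 0.5) else 0
    seq.append(x); last = x
print(excess_entropy_estimate(seq, 8))  # close to 0.25, up to finite-sample bias
```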

    Intersection Information based on Common Randomness

    The introduction of the partial information decomposition generated a flurry of proposals for defining an intersection information that quantifies how much of "the same information" two or more random variables specify about a target random variable. As of yet, none is wholly satisfactory. A palatable measure of intersection information would provide a principled way to quantify slippery concepts, such as synergy. Here, we introduce an intersection information measure based on the Gács-Körner common random variable that is the first to satisfy the coveted target monotonicity property. Our measure is imperfect, too, and we suggest directions for improvement.
    Comment: 19 pages, 5 figures
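    For a finite joint distribution, the Gács-Körner common random variable has a concrete form: it labels the connected components of the bipartite graph linking x to y whenever p(x, y) > 0, and the common information is that label's entropy. A small sketch under that standard characterization (function names are ours):
```python
# Sketch (function names are ours): the Gacs-Korner common random variable
# of a finite joint distribution p(x, y) labels the connected components of
# the bipartite support graph; the GK common information is its entropy.
import math
from collections import defaultdict

def gacs_korner(p):
    adj = defaultdict(set)                    # bipartite adjacency on supports
    for (x, y), q in p.items():
        if q > 0:
            adj[('x', x)].add(('y', y))
            adj[('y', y)].add(('x', x))
    comp, label = {}, 0                       # flood-fill component labels
    for node in adj:
        if node not in comp:
            stack = [node]
            while stack:
                u = stack.pop()
                if u not in comp:
                    comp[u] = label
                    stack.extend(adj[u])
            label += 1
    w = defaultdict(float)                    # probability mass per component
    for (x, y), q in p.items():
        if q > 0:
            w[comp[('x', x)]] += q
    return -sum(q * math.log2(q) for q in w.values() if q > 0)

# Two perfectly correlated bits share one full bit of common randomness...
print(gacs_korner({(0, 0): 0.5, (1, 1): 0.5}))                 # 1.0
# ...while any crosstalk merges the components and the common part vanishes.
print(gacs_korner({(0, 0): 0.45, (1, 1): 0.45, (0, 1): 0.1}))  # 0.0
```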

    Time's Barbed Arrow: Irreversibility, Crypticity, and Stored Information

    We show why the amount of information communicated between the past and future--the excess entropy--is not in general the amount of information stored in the present--the statistical complexity. This is a long-standing puzzle, since the latter is what is required for optimal prediction, while the former describes observed behavior. We lay out a classification scheme for dynamical systems and stochastic processes that determines when these two quantities are the same or different. We do this by developing closed-form expressions for the excess entropy in terms of optimal causal predictors and retrodictors--the ε-machines of computational mechanics. A process's causal irreversibility and crypticity are key determining properties.
    Comment: 4 pages, 2 figures
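    The bookkeeping behind the abstract, in the notation the computational mechanics literature uses (assumed here, not quoted from the paper): the forward statistical complexity exceeds the excess entropy by the crypticity, and causal irreversibility measures the forward/reverse asymmetry.
```latex
% Assumed standard definitions: S^+ / S^- are the predictive / retrodictive
% causal states, C_mu^+ / C_mu^- the directional statistical complexities.
E           = I[\mathcal{S}^{+};\, \mathcal{S}^{-}]    % excess entropy
\chi^{+}    = H[\mathcal{S}^{+} \mid \mathcal{S}^{-}]  % crypticity: stored but inaccessible
C_{\mu}^{+} = E + \chi^{+}                             % storage = transmitted + cryptic
\Xi         = C_{\mu}^{+} - C_{\mu}^{-}                % causal irreversibility
```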

    Synchronization and Control in Intrinsic and Designed Computation: An Information-Theoretic Analysis of Competing Models of Stochastic Computation

    We adapt tools from information theory to analyze how an observer comes to synchronize with the hidden states of a finitary, stationary stochastic process. We show that synchronization is determined both by the process's internal organization and by an observer's model of it. We analyze these components using the convergence of state-block and block-state entropies, comparing them to the previously known convergence properties of the Shannon block entropy. Along the way, we introduce a hierarchy of information quantifiers as derivatives and integrals of these entropies, which parallels a similar hierarchy introduced for the block entropy. We also draw out the duality between synchronization properties and a process's controllability. These tools lead to a new classification of a process's alternative representations in terms of minimality, synchronizability, and unifilarity.
    Comment: 25 pages, 13 figures, 1 table
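    A minimal numerical illustration of the synchronization question, assuming the Even process (blocks of 1s have even length) as a stand-in and our own function names: the observer's average uncertainty about the hidden state after seeing an L-word, H[S_L | X_{0:L}], computed exactly by enumerating words.
```python
# Sketch (assumed example process and names): exact synchronization curve
# H[S_L | X_{0:L}] for the Even process. T[x][i, j] = Pr(emit x, move i -> j);
# hidden states are A=0, B=1.
import itertools, math
import numpy as np

T = {0: np.array([[0.5, 0.0], [0.0, 0.0]]),
     1: np.array([[0.0, 0.5], [1.0, 0.0]])}
pi = np.array([2 / 3, 1 / 3])  # stationary state distribution

def entropy(p):
    return -sum(q * math.log2(q) for q in p if q > 0)

def state_uncertainty(L):
    """Average observer uncertainty about the hidden state after L symbols."""
    total = 0.0
    for word in itertools.product([0, 1], repeat=L):
        mu = pi.copy()
        for x in word:        # forward filter: mu <- mu @ T[x]
            mu = mu @ T[x]
        pw = mu.sum()         # probability of this word
        if pw > 0:
            total += pw * entropy(mu / pw)
    return total              # = H[S_L | X_{0:L}]

for L in range(6):
    print(L, round(state_uncertainty(L), 4))  # decays as the observer synchronizes
```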