
    Prediction, Retrodiction, and The Amount of Information Stored in the Present

    Get PDF
    We introduce an ambidextrous view of stochastic dynamical systems, comparing their forward-time and reverse-time representations and then integrating them into a single time-symmetric representation. The perspective is useful theoretically, computationally, and conceptually. Mathematically, we prove that the excess entropy--a familiar measure of organization in complex systems--is the mutual information not only between the past and future, but also between the predictive and retrodictive causal states. Practically, we exploit the connection between prediction and retrodiction to directly calculate the excess entropy. Conceptually, these lead one to discover new system invariants for stochastic dynamical systems: crypticity (information accessibility) and causal irreversibility. Ultimately, we introduce a time-symmetric representation that unifies all these quantities, compressing the two directional representations into one. The resulting compression offers a new conception of the amount of information stored in the present. Comment: 17 pages, 7 figures, 1 table; http://users.cse.ucdavis.edu/~cmg/compmech/pubs/pratisp.ht
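    In symbols (notation assumed here rather than taken from the text: \overleftarrow{X} for the past, \overrightarrow{X} for the future, and \mathcal{S}^{+}, \mathcal{S}^{-} for the predictive and retrodictive causal states), the identity the abstract describes is

        \mathbf{E} = I[\overleftarrow{X};\,\overrightarrow{X}] = I[\mathcal{S}^{+};\,\mathcal{S}^{-}].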

    Information Accessibility and Cryptic Processes: Linear Combinations of Causal States

    Get PDF
    We show in detail how to determine the time-reversed representation of a stationary hidden stochastic process from linear combinations of its forward-time ε-machine causal states. This also gives a check for the k-cryptic expansion recently introduced to explore the temporal range over which internal state information is spread. Comment: 6 pages, 9 figures, 2 tables; http://users.cse.ucdavis.edu/~cmg/compmech/pubs/iacplcocs.ht
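    As a rough illustration of the reversal step only, the sketch below time-reverses a stationary labeled Markov chain by Bayes' rule over its stationary state distribution. The function names and state/symbol conventions are assumptions of this sketch; the paper's actual construction of the reverse ε-machine from linear combinations of forward causal states goes beyond it.

    import numpy as np

    def stationary_distribution(T_total):
        """Left eigenvector (eigenvalue 1) of the state-to-state transition matrix."""
        vals, vecs = np.linalg.eig(T_total.T)
        pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
        return pi / pi.sum()

    def reverse_labeled_chain(T):
        """T maps symbol x -> matrix with entry [i, j] = Pr(go to j, emit x | state i).

        Returns the labeled matrices of the time-reversed chain: reading the
        sequence backwards from state i, the previous state was j and the
        intervening symbol was x with probability pi_j * T[x][j, i] / pi_i.
        """
        T_total = sum(T.values())
        pi = stationary_distribution(T_total)
        return {x: (Tx.T * pi[None, :]) / pi[:, None] for x, Tx in T.items()}

    # Example: the Golden Mean Process (no consecutive 0s), states A, B in that order.
    T = {'0': np.array([[0.0, 0.5], [0.0, 0.0]]),
         '1': np.array([[0.5, 0.0], [1.0, 0.0]])}
    R = reverse_labeled_chain(T)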

    Specific Performance of Land Contracts

    Get PDF

    Following Trust Funds

    Get PDF

    Synchronization and Control in Intrinsic and Designed Computation: An Information-Theoretic Analysis of Competing Models of Stochastic Computation

    Full text link
    We adapt tools from information theory to analyze how an observer comes to synchronize with the hidden states of a finitary, stationary stochastic process. We show that synchronization is determined both by the process's internal organization and by an observer's model of it. We analyze these components using the convergence of state-block and block-state entropies, comparing them to the previously known convergence properties of the Shannon block entropy. Along the way, we introduce a hierarchy of information quantifiers as derivatives and integrals of these entropies, which parallels a similar hierarchy introduced for block entropy. We also draw out the duality between synchronization properties and a process's controllability. The tools lead to a new classification of a process's alternative representations in terms of minimality, synchronizability, and unifilarity. Comment: 25 pages, 13 figures, 1 tabl
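    To make the derivative/integral hierarchy concrete on the observable side alone, here is a minimal block-entropy sketch; the state-block and block-state entropies the paper analyzes require access to the hidden states and are not reproduced here, and the entropy-rate estimate below is deliberately crude.

    from collections import Counter
    import math

    def block_entropy(seq, L):
        """Shannon entropy (bits) of length-L blocks, estimated from one long sequence."""
        counts = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
        total = sum(counts.values())
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    def entropy_hierarchy(seq, L_max):
        H = [0.0] + [block_entropy(seq, L) for L in range(1, L_max + 1)]
        h = [H[L] - H[L - 1] for L in range(1, L_max + 1)]  # discrete derivative h_mu(L)
        h_rate = h[-1]                                       # crude entropy-rate estimate
        E = sum(h_L - h_rate for h_L in h)                   # partial-sum estimate of the excess entropy
        return H, h, E

    As a sanity check, for the phase-random period-2 process 0101... the derivative h_mu(L) drops to zero at L = 2 and the excess-entropy estimate settles at 1 bit.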

    Time's Barbed Arrow: Irreversibility, Crypticity, and Stored Information

    Full text link
    We show why the amount of information communicated between the past and future--the excess entropy--is not in general the amount of information stored in the present--the statistical complexity. This is a puzzle, and a long-standing one, since the latter is what is required for optimal prediction, but the former describes observed behavior. We lay out a classification scheme for dynamical systems and stochastic processes that determines when these two quantities are the same or different. We do this by developing closed-form expressions for the excess entropy in terms of optimal causal predictors and retrodictors--the epsilon-machines of computational mechanics. A process's causal irreversibility and crypticity are key determining properties. Comment: 4 pages, 2 figure
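    In the notation commonly used in this literature (C_\mu^{+} and C_\mu^{-} for the forward and reverse statistical complexities, \mathbf{E} for the excess entropy), the quantities named above are usually related as follows; the exact closed forms are in the paper:

        \chi^{+} = H[\mathcal{S}^{+} \mid \overrightarrow{X}] = C_{\mu}^{+} - \mathbf{E} \quad\text{(crypticity)},
        \qquad
        \Xi = C_{\mu}^{+} - C_{\mu}^{-} \quad\text{(causal irreversibility)}.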

    Prediction and Generation of Binary Markov Processes: Can a Finite-State Fox Catch a Markov Mouse?

    Get PDF
    Understanding the generative mechanism of a natural system is a vital component of the scientific method. Here, we investigate one of the fundamental steps toward this goal by presenting the minimal generator of an arbitrary binary Markov process. This is a class of processes whose predictive model is well known. Surprisingly, the generative model requires three distinct topologies for different regions of parameter space. We show that a previously proposed generator for a particular set of binary Markov processes is, in fact, not minimal. Our results shed the first quantitative light on the relative (minimal) costs of prediction and generation. We find, for instance, that the difference between prediction and generation is maximized when the process is approximately independent and identically distributed. Comment: 12 pages, 12 figures; http://csc.ucdavis.edu/~cmg/compmech/pubs/gmc.ht
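    For orientation, a sketch of the predictive-side quantities for a binary order-1 Markov process with a = Pr(1 | previous 0) and b = Pr(0 | previous 1); the helper names are hypothetical, and the paper's minimal generative models (and their three topologies) are not reproduced here.

    import math

    def H2(p):
        """Binary entropy in bits."""
        return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    def predictive_costs(a, b):
        """Assumes a + b > 0 so the chain has a unique stationary distribution."""
        pi1 = a / (a + b)                    # stationary Pr(X = 1)
        pi0 = 1.0 - pi1
        h_mu = pi0 * H2(a) + pi1 * H2(b)     # entropy rate
        iid = math.isclose(a, 1.0 - b)       # both histories predict the same next symbol
        C_mu = 0.0 if iid else H2(pi1)       # predictive (causal-state) memory
        E = H2(pi1) - h_mu                   # excess entropy of an order-1 Markov chain
        return C_mu, h_mu, E

    A quick check: when a = 1 - b the process is IID, and both C_mu and E vanish.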

    Many Roads to Synchrony: Natural Time Scales and Their Algorithms

    Full text link
    We consider two important time scales---the Markov and cryptic orders---that monitor how an observer synchronizes to a finitary stochastic process. We show how to compute these orders exactly and that they are most efficiently calculated from the epsilon-machine, a process's minimal unifilar model. Surprisingly, though the Markov order is a basic concept from stochastic process theory, it is not a probabilistic property of a process. Rather, it is a topological property and, moreover, it is not computable from any finite-state model other than the epsilon-machine. Via an exhaustive survey, we close by demonstrating that infinite Markov and infinite cryptic orders are a dominant feature in the space of finite-memory processes. We draw out the roles played in statistical mechanical spin systems by these two complementary length scales. Comment: 17 pages, 16 figures; http://cse.ucdavis.edu/~cmg/compmech/pubs/kro.htm. Santa Fe Institute Working Paper 10-11-02
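    For reference, the Markov order in the standard stochastic-process sense is the shortest history length beyond which additional history no longer improves prediction; in entropy terms (notation assumed here),

        R = \min\{\ell \ge 0 : H[X_0 \mid X_{-\ell:0}] = h_\mu\},

    while the cryptic order is the analogous synchronization length scale defined through the epsilon-machine's causal states, as made precise in the paper.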