
    Prediction, Retrodiction, and The Amount of Information Stored in the Present

    We introduce an ambidextrous view of stochastic dynamical systems, comparing their forward-time and reverse-time representations and then integrating them into a single time-symmetric representation. The perspective is useful theoretically, computationally, and conceptually. Mathematically, we prove that the excess entropy--a familiar measure of organization in complex systems--is the mutual information not only between the past and future, but also between the predictive and retrodictive causal states. Practically, we exploit the connection between prediction and retrodiction to directly calculate the excess entropy. Conceptually, these lead one to discover new system invariants for stochastic dynamical systems: crypticity (information accessibility) and causal irreversibility. Ultimately, we introduce a time-symmetric representation that unifies all these quantities, compressing the two directional representations into one. The resulting compression offers a new conception of the amount of information stored in the present.
    Comment: 17 pages, 7 figures, 1 table; http://users.cse.ucdavis.edu/~cmg/compmech/pubs/pratisp.ht
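    For a first-order Markov chain the present symbol screens off the past, so the past/future mutual information the abstract identifies as the excess entropy collapses to a one-step mutual information. A minimal sketch of that special case (the transition probabilities are illustrative, not from the paper):

```python
import numpy as np

# Two-state Markov chain; transition matrix is illustrative.
T = np.array([[0.9, 0.1],
              [0.4, 0.6]])  # row-stochastic: T[i, j] = P(j | i)

# Stationary distribution: left eigenvector of T for eigenvalue 1.
evals, evecs = np.linalg.eig(T.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi /= pi.sum()

def entropy_bits(p):
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Excess entropy of a first-order Markov chain: I[X_t ; X_{t+1}].
joint = pi[:, None] * T  # p(x_t = i, x_{t+1} = j)
excess_entropy = (entropy_bits(joint.sum(axis=1))
                  + entropy_bits(joint.sum(axis=0))
                  - entropy_bits(joint.ravel()))
print(f"excess entropy ~= {excess_entropy:.4f} bits")
```

    For processes with longer memory the mutual information must be taken over growing past and future blocks; this closed-form shortcut is special to the Markov case.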

    A note on retrodiction and machine evolution

    Biomolecular communication demands that interactions between parts of a molecular system act as scaffolds for message transmission. It also requires an evolving and organized system of signs - a communicative agency - for creating and transmitting meaning. Here I explore the need to dissect biomolecular communication with retrodiction approaches that make claims about the past given information that is available in the present. While the passage of time restricts the explanatory power of retrodiction, the use of molecular structure in biology offsets information erosion. This allows description of the gradual evolutionary rise of structural and functional innovations in RNA and proteins. The resulting chronologies can also describe the gradual rise of molecular machines of increasing complexity and computation capabilities. For example, the accretion of rRNA substructures and ribosomal proteins can be traced in time and placed within a geological timescale. Phylogenetic, algorithmic and theory-inspired accretion models can be reconciled into a congruent evolutionary model. Remarkably, the times of origin of enzymes, functional RNA, non-ribosomal peptide synthetase (NRPS) complexes, and ribosomes suggest they gradually climbed Chomsky's hierarchy of formal grammars, supporting the gradual complexification of machines and communication in molecular biology. Future retrodiction approaches and in-depth exploration of theoretical models of computation will need to confirm such evolutionary progression.
    Comment: 7 pages, 1 figure

    Information Anatomy of Stochastic Equilibria

    A stochastic nonlinear dynamical system generates information, as measured by its entropy rate. Some---the ephemeral information---is dissipated and some---the bound information---is actively stored and so affects future behavior. We derive analytic expressions for the ephemeral and bound informations in the limit of small-time discretization for two classical systems that exhibit dynamical equilibria: first-order Langevin equations (i) where the drift is the gradient of a potential function and the diffusion matrix is invertible and (ii) with a linear drift term (Ornstein-Uhlenbeck) but a noninvertible diffusion matrix. In both cases, the bound information is sensitive only to the drift, while the ephemeral information is sensitive only to the diffusion matrix and not to the drift. Notably, this information anatomy changes discontinuously as any of the diffusion coefficients vanishes, indicating that it is very sensitive to the noise structure. We then calculate the information anatomy of the stochastic cusp catastrophe and of particles diffusing in a heat bath in the overdamped limit, both examples of stochastic gradient descent on a potential landscape. Finally, we use our methods to calculate and compare approximations for the so-called time-local predictive information for adaptive agents.
    Comment: 35 pages, 3 figures, 1 table; http://csc.ucdavis.edu/~cmg/compmech/pubs/iase.ht
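    The second system class above, the Ornstein-Uhlenbeck process, is easy to discretize. A hedged sketch of the small-time (Euler-Maruyama) discretization the abstract works in; parameters are illustrative, and only the stationary variance is checked, not the information anatomy itself:

```python
import numpy as np

# dx = -theta * x dt + sigma dW; stationary variance is sigma^2 / (2 * theta).
theta, sigma, dt, n_steps = 1.0, 0.5, 1e-3, 1_000_000
rng = np.random.default_rng(0)
noise = rng.standard_normal(n_steps)

x = 0.0
samples = np.empty(n_steps)
sqrt_dt = np.sqrt(dt)
for t in range(n_steps):
    x += -theta * x * dt + sigma * sqrt_dt * noise[t]  # Euler-Maruyama step
    samples[t] = x

var_emp = samples[n_steps // 10:].var()  # discard a burn-in transient
var_theory = sigma**2 / (2 * theta)
print(f"stationary variance: empirical {var_emp:.4f}, theory {var_theory:.4f}")
```

    Note that the linear drift enters only through the relaxation rate theta while the noise enters through sigma, mirroring the abstract's split between drift-sensitive and diffusion-sensitive quantities.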

    Quantifying Self-Organization with Optimal Wavelets

    The optimal wavelet basis is used to develop quantitative, experimentally applicable criteria for self-organization. The choice of the optimal wavelet is based on a model of self-organization in the wavelet tree. The framework is founded on the wavelet-domain hidden Markov model and on an optimal-wavelet-basis criterion for self-organization, which assumes an inherent increase in statistical complexity: the information content necessary for maximally accurate prediction of the system's dynamics. At the same time, the method, presented here for one-dimensional data of any type, performs superior denoising and may be easily generalized to higher dimensions.
    Comment: 12 pages, 3 figures
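    The paper's optimal-wavelet and hidden-Markov-tree machinery is involved; as a stand-in, the sketch below shows only the generic wavelet-denoising pipeline it builds on (transform, threshold, invert), using a one-level Haar transform with an illustrative soft threshold -- not the paper's method:

```python
import numpy as np

def haar_forward(x):
    avg = (x[0::2] + x[1::2]) / np.sqrt(2)  # approximation coefficients
    dif = (x[0::2] - x[1::2]) / np.sqrt(2)  # detail coefficients
    return avg, dif

def haar_inverse(avg, dif):
    x = np.empty(2 * avg.size)
    x[0::2] = (avg + dif) / np.sqrt(2)
    x[1::2] = (avg - dif) / np.sqrt(2)
    return x

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 1024)
clean = np.sin(2 * np.pi * 5 * t)
noisy = clean + 0.3 * rng.standard_normal(t.size)

avg, dif = haar_forward(noisy)
thr = 0.3 * np.sqrt(2 * np.log(dif.size))  # universal threshold, noise std known
dif = np.sign(dif) * np.maximum(np.abs(dif) - thr, 0.0)  # soft threshold
denoised = haar_inverse(avg, dif)

mse_before = np.mean((noisy - clean) ** 2)
mse_after = np.mean((denoised - clean) ** 2)
print(f"MSE: noisy {mse_before:.4f} -> denoised {mse_after:.4f}")
```

    The paper's contribution is precisely in replacing the fixed basis and naive threshold here with an optimal basis and a hidden-Markov-tree model of the coefficients.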

    Adaptive Optical Phase Estimation Using Time-Symmetric Quantum Smoothing

    Quantum parameter estimation has many applications, from gravitational wave detection to quantum key distribution. We present the first experimental demonstration of the time-symmetric technique of quantum smoothing. We consider both adaptive and non-adaptive quantum smoothing, and show that both are better than their well-known time-asymmetric counterparts (quantum filtering). For the problem of estimating a stochastically varying phase shift on a coherent beam, our theory predicts that adaptive quantum smoothing (the best scheme) gives an estimate with a mean-square error up to $2\sqrt{2}$ times smaller than that from non-adaptive quantum filtering (the standard quantum limit). The experimentally measured improvement is $2.24 \pm 0.14$.
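    A classical analogue conveys the filtering-versus-smoothing gap: a Rauch-Tung-Striebel smoother, which uses data from both sides of each instant, beats a causal Kalman filter at tracking a randomly walking phase. This is only an analogy to the quantum experiment; the scalar model and its parameters are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
n, Q, R = 2000, 0.01, 0.5  # steps, process noise, measurement noise

x = np.cumsum(np.sqrt(Q) * rng.standard_normal(n))  # random-walk "phase"
y = x + np.sqrt(R) * rng.standard_normal(n)         # noisy measurements

# Causal pass: scalar Kalman filter with identity dynamics.
xf = np.empty(n)
Pf = np.empty(n)
m, P = 0.0, 1.0  # broad prior
for k in range(n):
    K = P / (P + R)          # Kalman gain
    m = m + K * (y[k] - m)   # measurement update
    P = (1.0 - K) * P
    xf[k], Pf[k] = m, P
    P = P + Q                # time update for the next step

# Time-symmetric pass: Rauch-Tung-Striebel backward smoothing.
xs = xf.copy()
for k in range(n - 2, -1, -1):
    G = Pf[k] / (Pf[k] + Q)  # smoother gain (identity transition)
    xs[k] = xf[k] + G * (xs[k + 1] - xf[k])

mse_filter = np.mean((xf - x) ** 2)
mse_smooth = np.mean((xs - x) ** 2)
print(f"MSE: filter {mse_filter:.4f}, smoother {mse_smooth:.4f}")
```

    The smoother's advantage comes entirely from using future data at each instant, the same asymmetry the quantum experiment exploits.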

    The Weak Reality that Makes Quantum Phenomena more Natural: Novel Insights and Experiments

    While quantum reality can be probed through measurements, the Two-State-Vector formalism (TSVF) reveals a subtler reality prevailing between measurements. Under special pre- and post-selections, odd physical values emerge. This unusual picture calls for a deeper study. Instead of the common, wave-based picture of quantum mechanics, we suggest a new, particle-based perspective: Each particle possesses a definite location throughout its evolution, while some of its physical variables (characterized by deterministic operators, some of which obey nonlocal equations of motion) are carried by "mirage particles" accounting for its unique behavior. Within the time-interval between pre- and post-selection, the particle gives rise to a horde of such mirage particles, of which some can be negative. What appears to be "no-particle," known to give rise to Interaction-Free Measurement, is in fact a self-canceling pair of positive and negative mirage particles, which can be momentarily split and cancel out again. Feasible experiments can give empirical evidence for these fleeting phenomena. In this respect, the Heisenberg ontology is shown to be conceptually advantageous compared to the Schrödinger picture. We review several recent advances, discuss their foundational significance and point out possible directions for future research.
    Comment: An updated version was accepted to Entropy

    How Hidden are Hidden Processes? A Primer on Crypticity and Entropy Convergence

    We investigate a stationary process's crypticity---a measure of the difference between its hidden state information and its observed information---using the causal states of computational mechanics. Here, we motivate crypticity and cryptic order as physically meaningful quantities that monitor how hidden a hidden process is. This is done by recasting previous results on the convergence of block entropy and block-state entropy in a geometric setting, one that is more intuitive and that leads to a number of new results. For example, we connect crypticity to how an observer synchronizes to a process. We show that the block-causal-state entropy is a convex function of block length. We give a complete analysis of spin chains. We present a classification scheme that surveys stationary processes in terms of their possible cryptic and Markov orders. We illustrate related entropy convergence behaviors using a new form of foliated information diagram. Finally, along the way, we provide a variety of interpretations of crypticity and cryptic order to establish their naturalness and pervasiveness. Hopefully, these will inspire new applications in spatially extended and network dynamical systems.
    Comment: 18 pages, 18 figures; http://csc.ucdavis.edu/~cmg/compmech/pubs/iacp2.ht
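    Block-entropy convergence, whose geometry the abstract recasts, is easy to see empirically. A sketch for the Golden Mean process, a standard example in this literature (simulation details here are illustrative): the entropy gain h(L) = H(L) - H(L-1) falls from H(1) to the entropy rate, 2/3 bits per symbol.

```python
import numpy as np
from collections import Counter

# Golden Mean process: no two consecutive 1s; after a 0, flip a fair coin.
rng = np.random.default_rng(3)
n = 100_000
coin = rng.integers(0, 2, size=n)
seq = np.empty(n, dtype=np.uint8)
prev = 0
for i in range(n):
    seq[i] = 0 if prev == 1 else coin[i]  # a 1 is always followed by a 0
    prev = seq[i]

def block_entropy(s, L):
    counts = Counter(tuple(s[i:i + L]) for i in range(len(s) - L + 1))
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    return float(-(p * np.log2(p)).sum())

H = [0.0] + [block_entropy(seq, L) for L in range(1, 7)]
gains = [H[L] - H[L - 1] for L in range(1, 7)]  # entropy gain h(L)
print("entropy gains h(L):", np.round(gains, 4))
# For the Golden Mean process, h(L) reaches the entropy rate 2/3 at L = 2.
```

    How far H(L) sits above the asymptote L * h + E at small L, and how quickly the gap closes, is exactly the convergence behavior that crypticity and cryptic order characterize.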

    Informational and Causal Architecture of Discrete-Time Renewal Processes

    Renewal processes are broadly used to model stochastic behavior consisting of isolated events separated by periods of quiescence, whose durations are specified by a given probability law. Here, we identify the minimal sufficient statistic for their prediction (the set of causal states), calculate the historical memory capacity required to store those states (statistical complexity), delineate what information is predictable (excess entropy), and decompose the entropy of a single measurement into that shared with the past, future, or both. The causal state equivalence relation defines a new subclass of renewal processes with a finite number of causal states despite having an unbounded interevent count distribution. We use these formulae to analyze the output of the parametrized Simple Nonunifilar Source, generated by a simple two-state hidden Markov model, but with an infinite-state epsilon-machine presentation. All in all, the results lay the groundwork for analyzing processes with infinite statistical complexity and infinite excess entropy.
    Comment: 18 pages, 9 figures, 1 table; http://csc.ucdavis.edu/~cmg/compmech/pubs/dtrp.ht
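    A discrete-time renewal process of the kind described above is straightforward to generate: events (1s) are separated by runs of 0s whose lengths are drawn i.i.d. from the interevent probability law. A sketch using an illustrative geometric law, not one from the paper:

```python
import numpy as np

rng = np.random.default_rng(4)
p, n_events = 0.25, 50_000  # geometric parameter and number of events

# Interevent counts: zeros between successive 1s, supported on {0, 1, 2, ...}.
counts = rng.geometric(p, size=n_events) - 1

# Lay the sequence down: each event (a 1) is preceded by its run of 0s.
seq = np.zeros(counts.sum() + n_events, dtype=np.uint8)
seq[np.cumsum(counts + 1) - 1] = 1

# Recover the interevent statistics from the sequence itself.
ones = np.flatnonzero(seq)
gaps = np.diff(ones) - 1
print(f"mean interevent count: {gaps.mean():.3f} (theory {(1 - p) / p:.3f})")
```

    The geometric law makes the process Markov; heavier-tailed interevent distributions are what drive the statistical complexity and excess entropy toward the infinite values the abstract targets.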