
    Occam's Quantum Strop: Synchronizing and Compressing Classical Cryptic Processes via a Quantum Channel

    A stochastic process's statistical complexity stands out as a fundamental property: the minimum information required to synchronize one process generator to another. How much information is required, though, when synchronizing over a quantum channel? Recent work demonstrated that representing causal similarity as quantum state-indistinguishability provides a quantum advantage. We generalize this to synchronization and offer a sequence of constructions that exploit extended causal structures, finding a substantial increase of the quantum advantage. We demonstrate that maximum compression is determined by the process's cryptic order---a classical, topological property closely allied to Markov order, itself a measure of historical dependence. We introduce an efficient algorithm that computes the quantum advantage and close by noting that the advantage comes at a cost---one trades off prediction for generation complexity.
    Comment: 10 pages, 6 figures; http://csc.ucdavis.edu/~cmg/compmech/pubs/oqs.ht
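
    For concreteness, here is a hedged sketch of the baseline single-symbol quantum encoding of an ε-machine's causal states as it appears in the computational-mechanics literature; it is our reconstruction of the starting point the abstract refers to, not the paper's extended constructions. Each causal state σ_i, with symbol-labeled transition probabilities T^(x)_{ij} = Pr(x, σ_j | σ_i), is mapped to a pure signal state, and the quantum memory is the von Neumann entropy of the stationary ensemble of those states:

        |\sigma_i\rangle \;=\; \sum_{x,\,j} \sqrt{T^{(x)}_{ij}}\; |x\rangle\,|\sigma_j\rangle ,
        \qquad
        \rho \;=\; \sum_i \pi_i\, |\sigma_i\rangle\langle\sigma_i| ,
        \qquad
        C_q \;=\; S(\rho) \;\le\; C_\mu \;=\; H[\pi] .

    Here π is the stationary distribution over causal states; because the signal states are generally non-orthogonal, S(ρ) falls below H[π], which is the single-symbol quantum advantage that the paper's extended causal structures then enlarge.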

    Optimizing Quantum Models of Classical Channels: The reverse Holevo problem

    Given a classical channel---a stochastic map from inputs to outputs---the input can often be transformed to an intermediate variable that is informationally smaller than the input. The new channel accurately simulates the original but at a smaller transmission rate. Here, we examine this procedure when the intermediate variable is a quantum state. We determine when and how well quantum simulations of classical channels may improve upon the minimal rates of classical simulation. This inverts Holevo's original question of quantifying the capacity of quantum channels with classical resources. We also show that this problem is equivalent to another, involving the local generation of a distribution from common entanglement.
    Comment: 13 pages, 6 figures; http://csc.ucdavis.edu/~cmg/compmech/pubs/qfact.htm; substantially updated from v
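
    As a hedged formalization of the classical side of the problem (the variable name M and the choice of objective are our assumptions, not necessarily the paper's): given a channel Pr(y|x) and an input distribution Pr(x), one seeks an intermediate variable M forming a Markov chain X → M → Y that reproduces the channel while carrying as little information as possible,

        \min_{\Pr(m|x),\,\Pr(y|m)} \; I[X;M]
        \quad \text{subject to} \quad
        \sum_{m} \Pr(m|x)\,\Pr(y|m) \;=\; \Pr(y|x) \ \ \text{for all } x, y .

    The quantum version replaces M by a quantum state and the classical rate by an appropriate quantum cost, which is where the reverse of Holevo's question arises.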

    Reductions of Hidden Information Sources

    In all but special circumstances, measurements of time-dependent processes reflect internal structures and correlations only indirectly. Building predictive models of such hidden information sources requires discovering, in some way, the internal states and mechanisms. Unfortunately, there are often many possible models that are observationally equivalent. Here we show that the situation is not as arbitrary as one would think. We show that generators of hidden stochastic processes can be reduced to a minimal form and compare this reduced representation to that provided by computational mechanics---the epsilon-machine. On the way to developing deeper, measure-theoretic foundations for the latter, we introduce a new two-step reduction process. The first step (internal-event reduction) produces the smallest observationally equivalent sigma-algebra and the second (internal-state reduction) removes sigma-algebra components that are redundant for optimal prediction. For several classes of stochastic dynamical systems these reductions produce representations that are equivalent to epsilon-machines.
    Comment: 12 pages, 4 figures; 30 citations; Updates at http://www.santafe.edu/~cm
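
    The paper's internal-state reduction is measure-theoretic, but its finite-state flavor can be illustrated with a small sketch of our own (not the paper's algorithm): merge generator states of a labeled Markov chain that are probabilistically bisimilar, i.e., that assign the same probability to each (symbol, successor-block) pair, by iterated partition refinement.

        from collections import defaultdict

        def reduce_generator(states, trans):
            """Merge observationally equivalent states of a hidden Markov generator.

            trans[s] is a list of (symbol, next_state, probability) triples.
            Two states end up in the same block when, for every symbol and every
            current block, they assign the same total probability of emitting
            that symbol while moving into that block.
            """
            partition = {s: 0 for s in states}   # start with one block holding every state
            while True:
                signatures = {}
                for s in states:
                    sig = defaultdict(float)
                    for symbol, nxt, prob in trans[s]:
                        sig[(symbol, partition[nxt])] += prob
                    # Freeze the signature so it can serve as a dictionary key.
                    signatures[s] = tuple(sorted((k, round(v, 12)) for k, v in sig.items()))
                block_of_sig = {}
                new_partition = {}
                for s in states:
                    if signatures[s] not in block_of_sig:
                        block_of_sig[signatures[s]] = len(block_of_sig)
                    new_partition[s] = block_of_sig[signatures[s]]
                if new_partition == partition:
                    return partition             # fixed point: the coarsest stable partition
                partition = new_partition

        # Example: states B and C behave identically and are merged into one block.
        trans = {
            "A": [("0", "B", 0.5), ("1", "C", 0.5)],
            "B": [("0", "A", 1.0)],
            "C": [("0", "A", 1.0)],
        }
        print(reduce_generator(["A", "B", "C"], trans))   # {'A': 0, 'B': 1, 'C': 1}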

    Information Accessibility and Cryptic Processes: Linear Combinations of Causal States

    We show in detail how to determine the time-reversed representation of a stationary hidden stochastic process from linear combinations of its forward-time ε-machine causal states. This also gives a check for the k-cryptic expansion recently introduced to explore the temporal range over which internal state information is spread.
    Comment: 6 pages, 9 figures, 2 tables; http://users.cse.ucdavis.edu/~cmg/compmech/pubs/iacplcocs.ht

    Prediction, Retrodiction, and The Amount of Information Stored in the Present

    We introduce an ambidextrous view of stochastic dynamical systems, comparing their forward-time and reverse-time representations and then integrating them into a single time-symmetric representation. The perspective is useful theoretically, computationally, and conceptually. Mathematically, we prove that the excess entropy---a familiar measure of organization in complex systems---is the mutual information not only between the past and future, but also between the predictive and retrodictive causal states. Practically, we exploit the connection between prediction and retrodiction to directly calculate the excess entropy. Conceptually, these lead one to discover new system invariants for stochastic dynamical systems: crypticity (information accessibility) and causal irreversibility. Ultimately, we introduce a time-symmetric representation that unifies all these quantities, compressing the two directional representations into one. The resulting compression offers a new conception of the amount of information stored in the present.
    Comment: 17 pages, 7 figures, 1 table; http://users.cse.ucdavis.edu/~cmg/compmech/pubs/pratisp.ht
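
    In standard computational-mechanics notation (our reconstruction from the abstract, so treat the exact symbols as an assumption), the central identity and the two invariants mentioned read: the excess entropy E is the mutual information between past and future and equally between the forward and reverse causal states, crypticity is the part of the forward statistical complexity not shared with the reverse states, and causal irreversibility is the asymmetry between the two directions,

        E \;=\; I[X_{:0}\,;\,X_{0:}] \;=\; I[\mathcal{S}^+ \,;\, \mathcal{S}^-],
        \qquad
        \chi^+ \;=\; H[\mathcal{S}^+ \,|\, \mathcal{S}^-] \;=\; C_\mu^+ - E,
        \qquad
        \Xi \;=\; C_\mu^+ - C_\mu^- .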

    Anatomy of a Spin: The Information-Theoretic Structure of Classical Spin Systems

    Collective organization in matter plays a significant role in its expressed physical properties. Typically, it is detected via an order parameter, appropriately defined for each given system's observed emergent patterns. Recent developments in information theory, however, suggest quantifying collective organization in a system- and phenomenon-agnostic way: decompose the system's thermodynamic entropy density into a localized entropy, the part contained solely in the dynamics at a single location, and a bound entropy, the part stored in space as domains, clusters, excitations, or other emergent structures. We compute this decomposition and related quantities explicitly for the nearest-neighbor Ising model on the 1D chain, the Bethe lattice with coordination number k=3, and the 2D square lattice, illustrating its generality and the functional insights it gives near and away from phase transitions. In particular, we consider the roles that different spin motifs play (in cluster bulk, cluster edges, and the like) and how these affect the dependencies between spins.
    Comment: 12 pages, 8 figures; http://csc.ucdavis.edu/~cmg/compmech/pubs/ising_bmu.ht
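
    For a one-dimensional configuration read as a sequence X_{:0} X_0 X_{1:}, the decomposition the abstract refers to can be written in the information-anatomy notation common to this research program (the symbols below are our reconstruction, not quoted from the paper): the entropy density h_μ splits into a localized (residual) piece, the uncertainty in a single site given everything else, and a bound piece, the information a site shares with the rest of the configuration beyond what its past already supplies,

        h_\mu \;=\; r_\mu + b_\mu,
        \qquad
        r_\mu \;=\; H[X_0 \,|\, X_{:0},\, X_{1:}],
        \qquad
        b_\mu \;=\; I[X_0 \,;\, X_{1:} \,|\, X_{:0}] .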

    Extreme Quantum Advantage for Rare-Event Sampling

    We introduce a quantum algorithm for efficient biased sampling of the rare events generated by classical memoryful stochastic processes. We show that this quantum algorithm gives an extreme advantage over known classical biased sampling algorithms in terms of the memory resources required. The quantum memory advantage ranges from polynomial to exponential, and when sampling the rare equilibrium configurations of spin systems the quantum advantage diverges.
    Comment: 11 pages, 9 figures; http://csc.ucdavis.edu/~cmg/compmech/pubs/eqafbs.ht

    The Computational Complexity of Symbolic Dynamics at the Onset of Chaos

    In a variety of studies of dynamical systems, the edge of order and chaos has been singled out as a region of complexity. It was suggested by Wolfram, on the basis of qualitative behaviour of cellular automata, that the computational basis for modelling this region is the Universal Turing Machine. In this paper, following a suggestion of Crutchfield, we try to show that the Turing machine model may often be too powerful as a computational model to describe the boundary of order and chaos. In particular we study the region of the first accumulation of period doubling in unimodal and bimodal maps of the interval, from the point of view of language theory. We show that in relation to the "extended" Chomsky hierarchy, the relevant computational model in the unimodal case is the nested stack automaton or the related indexed languages, while the bimodal case is modelled by the linear bounded automaton or the related context-sensitive languages.
    Comment: 1 reference corrected, 1 reference added, minor changes in body of manuscrip
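
    For readers unfamiliar with the setup, here is a small illustration of our own (not taken from the paper) of how a symbolic sequence arises from a unimodal map: iterate the logistic map near the period-doubling accumulation point and record a 0 or 1 according to which side of the critical point each iterate falls on. The language-theoretic questions in the paper concern the set of finite words that such itineraries can contain.

        def logistic_itinerary(r, x0=0.4, n_transient=1000, n_symbols=64):
            """Symbolic itinerary of the logistic map x -> r*x*(1-x).

            Emits '0' when the iterate lies left of the critical point x = 0.5
            and '1' when it lies right of it (the standard binary partition
            for unimodal maps).
            """
            x = x0
            for _ in range(n_transient):          # discard transient behaviour
                x = r * x * (1.0 - x)
            symbols = []
            for _ in range(n_symbols):
                x = r * x * (1.0 - x)
                symbols.append("0" if x < 0.5 else "1")
            return "".join(symbols)

        # r close to the period-doubling accumulation point (approximately 3.5699456).
        print(logistic_itinerary(3.5699456))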

    Prediction and Generation of Binary Markov Processes: Can a Finite-State Fox Catch a Markov Mouse?

    Understanding the generative mechanism of a natural system is a vital component of the scientific method. Here, we investigate one of the fundamental steps toward this goal by presenting the minimal generator of an arbitrary binary Markov process. This is a class of processes whose predictive model is well known. Surprisingly, the generative model requires three distinct topologies for different regions of parameter space. We show that a previously proposed generator for a particular set of binary Markov processes is, in fact, not minimal. Our results shed the first quantitative light on the relative (minimal) costs of prediction and generation. We find, for instance, that the difference between prediction and generation is maximized when the process is approximately independent and identically distributed.
    Comment: 12 pages, 12 figures; http://csc.ucdavis.edu/~cmg/compmech/pubs/gmc.ht
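
    As a hedged companion sketch (our own, covering only the predictive side; the paper's three-topology generative construction is not reproduced here): for a binary Markov process with Pr(1|0) = p and Pr(0|1) = q, the predictive ε-machine generically has one causal state per symbol, so the predictive memory is the Shannon entropy of the stationary distribution. When p + q = 1 the process is i.i.d., the two states collapse into one, and the predictive memory drops to zero.

        from math import log2

        def binary_markov_predictive_memory(p, q):
            """Statistical complexity C_mu of a binary Markov process.

            p = Pr(X_t = 1 | X_{t-1} = 0), q = Pr(X_t = 0 | X_{t-1} = 1).
            For generic parameters the epsilon-machine has one causal state per
            symbol and C_mu is the entropy of the stationary distribution; when
            p + q == 1 the process is i.i.d., the states merge, and C_mu = 0.
            """
            if abs(p + q - 1.0) < 1e-12:           # i.i.d. process: a single causal state
                return 0.0
            pi0 = q / (p + q)                      # stationary probability of symbol 0
            pi1 = p / (p + q)
            return -sum(x * log2(x) for x in (pi0, pi1) if x > 0)

        print(binary_markov_predictive_memory(0.3, 0.4))   # non-i.i.d.: positive memory
        print(binary_markov_predictive_memory(0.3, 0.7))   # i.i.d.: zero memory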