
    Information Bottlenecks, Causal States, and Statistical Relevance Bases: How to Represent Relevant Information in Memoryless Transduction

    Discovering relevant, but possibly hidden, variables is a key step in constructing useful and predictive theories about the natural world. This brief note explains the connections between three approaches to this problem: the recently introduced information-bottleneck method, the computational mechanics approach to inferring optimal models, and Salmon's statistical relevance basis.
    Comment: 3 pages, no figures, submitted to PRE as a "brief report". Revision: added an acknowledgements section originally omitted by a LaTeX bu
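    The quantity all three approaches aim to preserve is the mutual information between a compressed variable and the relevance variable. As a minimal illustrative sketch of that quantity (not the bottleneck algorithm itself, and using invented toy distributions), mutual information can be computed directly from a joint distribution:

```python
import math

def mutual_information(joint):
    """I(X;Y) in bits from a joint distribution given as a dict {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(
        p * math.log2(p / (px[x] * py[y]))
        for (x, y), p in joint.items() if p > 0.0
    )

# A perfectly correlated binary pair carries 1 bit of relevant information;
# an independent pair carries none.
correlated = {(0, 0): 0.5, (1, 1): 0.5}
independent = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
```

    A hidden variable is "relevant" in this sense exactly when keeping it preserves such mutual information with the quantity one wants to predict.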

    Structure and Randomness of Continuous-Time Discrete-Event Processes

    Loosely speaking, the Shannon entropy rate is used to gauge a stochastic process' intrinsic randomness; the statistical complexity gives the cost of predicting the process. We calculate, for the first time, the entropy rate and statistical complexity of stochastic processes generated by finite unifilar hidden semi-Markov models---memoryful, state-dependent versions of renewal processes. Calculating these quantities requires introducing novel mathematical objects (ε-machines of hidden semi-Markov processes) and applying new information-theoretic methods to stochastic processes.
    Comment: 10 pages, 2 figures; http://csc.ucdavis.edu/~cmg/compmech/pubs/ctdep.ht
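    For the much simpler discrete-time, fully observed case, the entropy rate has a closed form: h = -Σᵢ πᵢ Σⱼ Pᵢⱼ log₂ Pᵢⱼ, where π is the stationary distribution of the transition matrix P. A minimal sketch (a toy finite Markov chain, not the hidden semi-Markov calculation of the paper):

```python
import math

def stationary(P, iters=1000):
    """Stationary distribution of a row-stochastic matrix P by power iteration."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def entropy_rate(P):
    """Shannon entropy rate h = -sum_i pi_i sum_j P_ij log2 P_ij, in bits/symbol."""
    pi = stationary(P)
    return -sum(
        pi[i] * P[i][j] * math.log2(P[i][j])
        for i in range(len(P)) for j in range(len(P)) if P[i][j] > 0.0
    )

# Two-state chain: state 0 is "sticky", state 1 is a fair coin.
P = [[0.9, 0.1],
     [0.5, 0.5]]
```

    A deterministic chain has h = 0; a fair coin has h = 1 bit/symbol; the example above lies in between.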

    Statistical Complexity of Simple 1D Spin Systems

    We present exact results for two complementary measures of spatial structure generated by 1D spin systems with finite-range interactions. The first, excess entropy, measures the apparent spatial memory stored in configurations. The second, statistical complexity, measures the amount of memory needed to optimally predict the chain of spin values. These statistics capture distinct properties and are different from existing thermodynamic quantities.
    Comment: 4 pages with 2 eps Figures. Uses RevTeX macros. Also available at http://www.santafe.edu/projects/CompMech/papers/CompMechCommun.htm
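    For a first-order Markov chain the block entropy grows linearly, H(L) = H[X₀] + (L-1)h, so the excess entropy collapses to E = H[X₀] - h. A toy computation along these lines (a symmetric binary spin chain that flips with probability p; this is an illustrative special case, not the finite-range transfer-matrix calculation of the paper):

```python
import math

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def markov_excess_entropy(p_flip):
    """Excess entropy E = H[X_0] - h for a symmetric binary Markov spin chain.

    The chain flips its spin with probability p_flip, so the stationary
    distribution is uniform (H[X_0] = 1 bit) and the entropy rate is h2(p_flip).
    """
    return 1.0 - h2(p_flip)
```

    At p_flip = 0.5 the configurations are i.i.d. and E = 0 (no spatial memory); as p_flip moves toward 0 or 1 the chain becomes ordered and E grows toward 1 bit.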

    Occam's Quantum Strop: Synchronizing and Compressing Classical Cryptic Processes via a Quantum Channel

    A stochastic process's statistical complexity stands out as a fundamental property: the minimum information required to synchronize one process generator to another. How much information is required, though, when synchronizing over a quantum channel? Recent work demonstrated that representing causal similarity as quantum state-indistinguishability provides a quantum advantage. We generalize this to synchronization and offer a sequence of constructions that exploit extended causal structures, finding a substantial increase in the quantum advantage. We demonstrate that maximum compression is determined by the process's cryptic order---a classical, topological property closely allied to Markov order, itself a measure of historical dependence. We introduce an efficient algorithm that computes the quantum advantage and close by noting that the advantage comes at a cost---one trades off prediction for generation complexity.
    Comment: 10 pages, 6 figures; http://csc.ucdavis.edu/~cmg/compmech/pubs/oqs.ht

    Transient radiation effects on thermocouples

    Transient radiation effects on reactor thermocouple

    Reductions of Hidden Information Sources

    In all but special circumstances, measurements of time-dependent processes reflect internal structures and correlations only indirectly. Building predictive models of such hidden information sources requires discovering, in some way, the internal states and mechanisms. Unfortunately, there are often many possible models that are observationally equivalent. Here we show that the situation is not as arbitrary as one would think. We show that generators of hidden stochastic processes can be reduced to a minimal form and compare this reduced representation to that provided by computational mechanics---the epsilon-machine. On the way to developing deeper, measure-theoretic foundations for the latter, we introduce a new two-step reduction process. The first step (internal-event reduction) produces the smallest observationally equivalent sigma-algebra and the second (internal-state reduction) removes sigma-algebra components that are redundant for optimal prediction. For several classes of stochastic dynamical systems these reductions produce representations that are equivalent to epsilon-machines.
    Comment: 12 pages, 4 figures; 30 citations; Updates at http://www.santafe.edu/~cm

    Stochastic Optimal Prediction with Application to Averaged Euler Equations

    Optimal prediction (OP) methods compensate for a lack of resolution in the numerical solution of complex problems through the use of an invariant measure as a prior measure in the Bayesian sense. In first-order OP, unresolved information is approximated by its conditional expectation with respect to the invariant measure. In higher-order OP, unresolved information is approximated by a stochastic estimator, leading to a system of random or stochastic differential equations. We explain the ideas through a simple example, and then apply them to the solution of Averaged Euler equations in two space dimensions.
    Comment: 13 pages, 2 figure
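    The first-order closure can be sketched in a few lines: if the invariant measure is (for illustration) taken jointly Gaussian in the resolved variable x and unresolved variable y, then E[y | x] is linear in x, and the resolved equation is closed by substituting that conditional mean for y. This is a hedged toy sketch of the idea under a Gaussian assumption, not the Averaged Euler application:

```python
def conditional_mean(x, mu_x, mu_y, var_x, cov_xy):
    """E[y | x] under a joint Gaussian invariant measure:
    mu_y + (cov_xy / var_x) * (x - mu_x)."""
    return mu_y + (cov_xy / var_x) * (x - mu_x)

def first_order_op_rhs(x, f, mu_x, mu_y, var_x, cov_xy):
    """First-order optimal prediction: close the resolved equation
    dx/dt = f(x, y) by substituting y -> E[y | x]."""
    return f(x, conditional_mean(x, mu_x, mu_y, var_x, cov_xy))

# Hypothetical resolved dynamics dx/dt = -x + y, closed with E[y | x] = 0.5 x
# (zero means, unit variance in x, covariance 0.5).
rhs = first_order_op_rhs(1.0, lambda x, y: -x + y, 0.0, 0.0, 1.0, 0.5)
```

    Higher-order OP replaces the deterministic substitution with a stochastic estimator, which is what produces the random differential equations mentioned above.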

    Optimizing Quantum Models of Classical Channels: The reverse Holevo problem

    Given a classical channel---a stochastic map from inputs to outputs---the input can often be transformed to an intermediate variable that is informationally smaller than the input. The new channel accurately simulates the original but at a smaller transmission rate. Here, we examine this procedure when the intermediate variable is a quantum state. We determine when and how well quantum simulations of classical channels may improve upon the minimal rates of classical simulation. This inverts Holevo's original question of quantifying the capacity of quantum channels with classical resources. We also show that this problem is equivalent to another, involving the local generation of a distribution from common entanglement.
    Comment: 13 pages, 6 figures; http://csc.ucdavis.edu/~cmg/compmech/pubs/qfact.htm; substantially updated from v

    Identifying Functional Thermodynamics in Autonomous Maxwellian Ratchets

    We introduce a family of Maxwellian Demons for which correlations among information-bearing degrees of freedom can be calculated exactly and in compact analytical form. This allows one to precisely determine the Demons' functional thermodynamic operating regimes, where previous methods either misclassify or simply fail due to the approximations they invoke. This reveals that these Demons are more functional than previous candidates. They too behave either as engines, lifting a mass against gravity by extracting energy from a single heat reservoir, or as Landauer erasers, consuming external work to remove information from a sequence of binary symbols by decreasing their individual uncertainty. Going beyond these, our Demon exhibits a new functionality that erases bits not by simply decreasing individual-symbol uncertainty, but by increasing inter-bit correlations (that is, by adding temporal order) while increasing single-symbol uncertainty. In all cases, but especially in the new erasure regime, exactly accounting for informational correlations leads to tight bounds on Demon performance, expressed as a refined Second Law of Thermodynamics that relies on the Kolmogorov-Sinai entropy for dynamical processes and not on changes purely in system configurational entropy, as previously employed. We rigorously derive the refined Second Law under minimal assumptions, so it applies quite broadly---for Demons with and without memory and for input sequences that are correlated or not. We note that general Maxwellian Demons readily violate previously proposed alternative bounds, while the current bound still holds.
    Comment: 13 pages, 9 figures, http://csc.ucdavis.edu/~cmg/compmech/pubs/mrd.ht