
    A measure of statistical complexity based on predictive information with application to finite spin systems

    NOTICE: this is the author's version of a work that was accepted for publication in Physics Letters A. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Physics Letters A, 376 (4): 275-281, Jan 2012. DOI: 10.1016/j.physleta.2011.10.066.

    Exact Synchronization for Finite-State Sources

    We analyze how an observer synchronizes to the internal state of a finite-state information source, using the epsilon-machine causal representation. Here, we treat the case of exact synchronization, in which it is possible for the observer to synchronize completely after a finite number of observations. The more difficult case of strictly asymptotic synchronization is treated in a sequel. In both cases, we find that an observer, on average, will synchronize to the source state exponentially fast and that, as a result, the average accuracy in the observer's predictions of the source output approaches its optimal level exponentially fast as well. Additionally, we show here how to analytically calculate the synchronization rate for exact epsilon-machines and provide an efficient polynomial-time algorithm to test epsilon-machines for exactness. (9 pages, 6 figures; now includes an analytical calculation of the synchronization rate; updates and corrections added.)
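
    As an illustration of the synchronization mechanism this abstract describes, the following minimal Python sketch tracks an observer's candidate-state set for the Even Process, a standard example of an exactly synchronizable source. The machine presentation, parameters, and estimator below are illustrative assumptions, not material from the paper; the probability of remaining unsynchronized should nonetheless fall off exponentially with the number of observations, as the abstract states.

# Hypothetical sketch (not from the paper): observer synchronization to the
# Even Process epsilon-machine, a standard exactly synchronizable source.
import numpy as np

rng = np.random.default_rng(0)

# Unifilar presentation: state A emits 0 (-> A) or 1 (-> B) with probability 1/2 each;
# state B always emits 1 and returns to A. Stationary state distribution: (2/3, 1/3).
EMIT = {"A": {0: ("A", 0.5), 1: ("B", 0.5)}, "B": {1: ("A", 1.0)}}

def generate(length):
    state = "A" if rng.random() < 2 / 3 else "B"
    symbols = []
    for _ in range(length):
        s = (0 if rng.random() < 0.5 else 1) if state == "A" else 1
        symbols.append(s)
        state = EMIT[state][s][0]
    return symbols

def sync_time(symbols):
    """Track the set of machine states consistent with the observations so far;
    the observer is synchronized once that set shrinks to a single state."""
    candidates = {"A", "B"}
    for t, s in enumerate(symbols, start=1):
        candidates = {EMIT[c][s][0] for c in candidates if s in EMIT[c]}
        if len(candidates) == 1:
            return t
    return None

times = [sync_time(generate(60)) for _ in range(5000)]
for L in (1, 2, 4, 8, 16):
    p = np.mean([(t is None or t > L) for t in times])
    print(f"P(unsynchronized after {L} observations) ~ {p:.4f}")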

    Identifying Functional Thermodynamics in Autonomous Maxwellian Ratchets

    We introduce a family of Maxwellian Demons for which correlations among information-bearing degrees of freedom can be calculated exactly and in compact analytical form. This allows one to precisely determine Demon functional thermodynamic operating regimes, where previous methods either misclassify them or simply fail due to the approximations they invoke. This reveals that these Demons are more functional than previous candidates. They too behave either as engines, lifting a mass against gravity by extracting energy from a single heat reservoir, or as Landauer erasers, consuming external work to remove information from a sequence of binary symbols by decreasing their individual uncertainty. Going beyond these, our Demon exhibits a new functionality that erases bits not by simply decreasing individual-symbol uncertainty, but by increasing inter-bit correlations (that is, by adding temporal order) while increasing single-symbol uncertainty. In all cases, but especially in the new erasure regime, exactly accounting for informational correlations leads to tight bounds on Demon performance, expressed as a refined Second Law of Thermodynamics that relies on the Kolmogorov-Sinai entropy of dynamical processes rather than on changes purely in system configurational entropy, as previously employed. We rigorously derive the refined Second Law under minimal assumptions, so it applies quite broadly: to Demons with and without memory and to input sequences that are correlated or not. We note that general Maxwellian Demons readily violate previously proposed alternative bounds, while the current bound still holds. (13 pages, 9 figures; http://csc.ucdavis.edu/~cmg/compmech/pubs/mrd.ht)
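
    For orientation, the refined Second Law this abstract refers to has the following schematic form (stated from memory here, not quoted from the paper): the work a Demon can extract per symbol is bounded by the change in the entropy rate of the symbol sequence rather than by changes in single-symbol (configurational) entropy,

        \langle W \rangle \;\le\; k_B T \ln 2 \, \left( h_\mu^{\mathrm{out}} - h_\mu^{\mathrm{in}} \right),

    where h_\mu^{\mathrm{in}} and h_\mu^{\mathrm{out}} are the Shannon entropy rates (Kolmogorov-Sinai entropies) of the input and output sequences. A bound using single-symbol entropies in place of entropy rates ignores inter-symbol correlations, which is precisely what the new erasure regime described above exploits.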

    A Physics-Based Approach to Unsupervised Discovery of Coherent Structures in Spatiotemporal Systems

    Given that observational and numerical climate data are being produced at ever more prodigious rates, increasingly sophisticated and automated analysis techniques have become essential. Deep learning is quickly becoming a standard approach for such analyses and, while great progress is being made, major challenges remain. Unlike the commercial applications in which deep learning has led to surprising successes, scientific data are highly complex and typically unlabeled. Moreover, interpretability and the detection of new mechanisms are key to scientific discovery. To enhance discovery, we present a complementary physics-based, data-driven approach that exploits the causal nature of spatiotemporal data sets generated by local dynamics (e.g., hydrodynamic flows). We illustrate how novel patterns and coherent structures can be discovered in cellular automata and outline the path from them to climate data. (4 pages, 1 figure; http://csc.ucdavis.edu/~cmg/compmech/pubs/ci2017_Rupe_et_al.ht)
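
    The sketch below illustrates, on an elementary cellular automaton, the kind of lightcone-based bookkeeping such an approach rests on: past lightcones of each space-time point are grouped by their empirical statistics over the immediate future. Everything here (the rule choice, the cone depth, the crude binning) is an illustrative assumption and is far simpler than the authors' construction; it only shows the data structures involved.

# Hypothetical sketch: group past lightcones of a 1D cellular automaton by their
# empirical next-step statistics. Illustrative only; not the paper's algorithm.
import numpy as np
from collections import defaultdict

def evolve_eca(rule, width, steps, rng):
    table = [(rule >> i) & 1 for i in range(8)]          # Wolfram rule lookup table
    field = rng.integers(0, 2, size=width)
    history = [field.copy()]
    for _ in range(steps):
        left, right = np.roll(field, 1), np.roll(field, -1)
        field = np.array([table[i] for i in 4 * left + 2 * field + right])
        history.append(field.copy())
    return np.array(history)

def past_lightcone(st, t, x, depth):
    # cells that can influence site (t, x) within `depth` steps (radius-1 CA)
    cone = []
    for d in range(1, depth + 1):
        cone.extend(st[t - d, (x + o) % st.shape[1]] for o in range(-d, d + 1))
    return tuple(cone)

st = evolve_eca(110, 200, 400, np.random.default_rng(0))
depth = 2
futures = defaultdict(list)
for t in range(depth, st.shape[0] - 1):
    for x in range(st.shape[1]):
        futures[past_lightcone(st, t, x, depth)].append(st[t + 1, x])

# Two past-lightcone configurations are (crudely) assigned to the same class when
# their conditional next-cell statistics agree after coarse binning.
classes = defaultdict(list)
for cone, nxt in futures.items():
    classes[round(float(np.mean(nxt)), 1)].append(cone)
print({k: len(v) for k, v in sorted(classes.items())})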

    Shortcuts to Thermodynamic Computing: The Cost of Fast and Faithful Erasure

    Landauer's Principle states that the energy cost of information processing must exceed the product of the temperature and the change in Shannon entropy of the information-bearing degrees of freedom. However, this lower bound is achievable only for quasistatic, near-equilibrium computations, that is, only over infinite time. In practice, information processing takes place in finite time, resulting in dissipation and potentially unreliable logical outcomes. For overdamped Langevin dynamics, we show that counterdiabatic potentials can be crafted to guide systems rapidly and accurately along desired computational paths, providing shortcuts that allow for the precise design of finite-time computations. Such shortcuts require additional work, beyond Landauer's bound, that is irretrievably dissipated into the environment. We show that this dissipated work is proportional to the computation rate as well as to the square of the information-storing system's length scale. As a paradigmatic example, we design shortcuts to erase a bit of information metastably stored in a double-well potential. Though dissipated work generally increases with erasure fidelity, we show that it is possible to perform perfect erasure in finite time with finite work. We also show that the robustness of information storage affects the energetic cost of erasure: specifically, the dissipated work scales as the information lifetime of the bistable system. Our analysis exposes a rich and nuanced relationship between work, speed, the size of the information-bearing degrees of freedom, storage robustness, and the difference between initial and final informational statistics. (19 pages, 7 figures; http://csc.ucdavis.edu/~cmg/compmech/pubs/scte.ht)
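
    To make the setting concrete, the following sketch simulates finite-time erasure of a bit stored in a quartic double well under overdamped Langevin dynamics and accumulates the stochastic-thermodynamic work, for comparison against the Landauer bound kT ln 2. The potential, the naive tilt protocol, and all parameters are assumptions made for illustration; this is not the counterdiabatic (shortcut) construction the paper develops, only the basic work-accounting setup it improves upon.

# Hypothetical sketch: finite-time bit erasure in a double well, overdamped
# Langevin dynamics with Euler-Maruyama integration. Illustrative parameters only.
import numpy as np

rng = np.random.default_rng(1)
kT, dt, tau = 1.0, 1e-3, 20.0                  # temperature, time step, protocol duration
steps = int(tau / dt)

def dVdx(x, lam):                              # V(x, lam) = x^4 - 2 x^2 + lam * x
    return 4.0 * x**3 - 4.0 * x + lam

def erase(n_traj=2000, lam_max=4.0):
    # an unbiased stored bit: trajectories start in either well with probability 1/2
    x = np.where(rng.random(n_traj) < 0.5, -1.0, 1.0)
    work = np.zeros(n_traj)
    lam = 0.0
    for k in range(steps):
        lam_new = lam_max * np.sin(np.pi * (k + 1) / steps)   # raise, then remove, a tilt
        work += x * (lam_new - lam)            # dW = (dV/dlam) dlam evaluated at current x
        lam = lam_new
        x += -dVdx(x, lam) * dt + np.sqrt(2.0 * kT * dt) * rng.normal(size=n_traj)
    return x, work

x_final, W = erase()
print(f"erasure fidelity ~ {np.mean(x_final < 0):.3f}")       # fraction ending in the target well
print(f"mean work ~ {W.mean():.3f} kT vs Landauer bound {np.log(2):.3f} kT")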

    Water and the National Welfare—Programs in Search of a Policy

    It is no secret to residents of the western states that water is a matter of primary public concern. Land and water policies are deeply embedded in the region, and the imprint of federal water projects on the economic geography of the West is plain to see. It is increasingly clear, however, that no coherent national policy, past or present, has emerged from the massive federal effort in the field. There is no lack of interest, planning, and expenditure on the supply and quality of water, and much progress has been made in the definition and measurement of the factors that determine an efficient water system. But sound principles are still honored as much in the breach as in the observance, and we still speak with a thousand voices on any water problem of real magnitude. The time is at hand when the plethora of overlapping and frequently quarrelsome federal agencies concerned with the development and allocation of water supplies and the protection of water quality must be subjected to the test of clearly formulated national objectives and of conceptually sound and consistent means of achieving them.

    Kneese, Allen V., The Economics of Regional Water Quality Management


    Management of the North Pacific Fisheries: Economic Objectives and Issues

    In this paper, we attempt to narrow the areas of conflict by specifying more precisely the objectives of fishery utilization (and, inferentially, of fisheries management) in the North Pacific, and by analyzing the extent to which the optimal combination of regulatory measures in a theoretical framework must be modified to accommodate the technological, administrative, and political complexities that beset an international fishery. The basic bioeconomic theory of an ocean fishery is modified to show its application to a typical case involving interdependent exploited species and international differences in the market prices of both inputs and end products. The analysis is then cast in terms of the specific situation in the North Pacific. Alternative concepts of international regulation are examined from the standpoint of their economic repercussions, and recommendations are formulated for a long-run management program designed to yield continuing economic benefits as well as physical protection of the resources. Attention is centered on the Northeast Pacific, where the four major fishing powers are all actively engaged and in direct competition. The emphasis throughout is on what should be attempted rather than on what can be accomplished under present institutional and legal arrangements.
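
    For readers outside fisheries economics, the "basic bioeconomic theory" the abstract builds on is conventionally the single-species Gordon-Schaefer surplus-production model (a textbook statement given here for orientation, not the paper's multi-species, multi-nation extension):

        \dot{x} = r x\left(1 - \tfrac{x}{K}\right) - qEx, \qquad
        Y(E) = qKE\left(1 - \tfrac{qE}{r}\right), \qquad
        \pi(E) = p\,Y(E) - cE,

    where x is stock biomass, r and K are the intrinsic growth rate and carrying capacity, E is fishing effort, q is catchability, Y(E) is the sustainable yield at the effort-conditioned equilibrium, and \pi(E) is the resource rent for price p and unit cost of effort c. The paper's modifications concern interdependent exploited species and internationally differing values of prices and costs.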

    Reductions of Hidden Information Sources

    In all but special circumstances, measurements of time-dependent processes reflect internal structures and correlations only indirectly. Building predictive models of such hidden information sources requires discovering, in some way, the internal states and mechanisms. Unfortunately, there are often many possible models that are observationally equivalent. Here we show that the situation is not as arbitrary as one would think. We show that generators of hidden stochastic processes can be reduced to a minimal form and compare this reduced representation to that provided by computational mechanics, the epsilon-machine. On the way to developing deeper, measure-theoretic foundations for the latter, we introduce a new two-step reduction process. The first step (internal-event reduction) produces the smallest observationally equivalent sigma-algebra and the second (internal-state reduction) removes sigma-algebra components that are redundant for optimal prediction. For several classes of stochastic dynamical systems these reductions produce representations that are equivalent to epsilon-machines. (12 pages, 4 figures; 30 citations; updates at http://www.santafe.edu/~cm)
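
    To suggest what a state reduction can look like in elementary terms, the sketch below merges states of a labeled hidden Markov model by partition refinement (probabilistic bisimulation): states are grouped when, for every symbol, they send the same probability mass into the same blocks. This is an assumed, much cruder stand-in for the paper's measure-theoretic two-step reduction and, for non-unifilar generators, only a sufficient condition for observational equivalence; the example machine is invented for illustration.

# Hypothetical sketch: crude reduction of a labeled hidden Markov model by
# partition refinement. Not the paper's construction; illustration only.
import numpy as np

def reduce_states(T):
    """T[s][i, j] = P(emit symbol s and move from state i to state j).
    Returns a partition of the states into blocks of bisimilar states."""
    n = T[0].shape[0]
    partition = [frozenset(range(n))]             # one block; split until stable
    while True:
        refined = []
        for block in partition:
            sigs = {}
            for i in block:
                # signature of state i: per symbol, the mass sent into each current block
                sig = tuple(
                    tuple(round(float(T[s][i, list(tgt)].sum()), 9) for tgt in partition)
                    for s in range(len(T))
                )
                sigs.setdefault(sig, set()).add(i)
            refined.extend(frozenset(b) for b in sigs.values())
        if len(refined) == len(partition):        # refinement only ever splits blocks
            return refined
        partition = refined

# Invented 3-state example: states 1 and 2 generate statistically identical futures
# and should collapse into a single block, while state 0 remains distinct.
T0 = np.array([[0.0, 0.4, 0.4],                   # transitions emitting symbol 0
               [0.0, 0.0, 0.5],
               [0.0, 0.5, 0.0]])
T1 = np.array([[0.2, 0.0, 0.0],                   # transitions emitting symbol 1
               [0.5, 0.0, 0.0],
               [0.5, 0.0, 0.0]])
print(reduce_states([T0, T1]))                    # expect blocks {0} and {1, 2}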
