
    Demon Dynamics: Deterministic Chaos, the Szilard Map, and the Intelligence of Thermodynamic Systems

    We introduce a deterministic chaotic system---the Szilard Map---that encapsulates the measurement, control, and erasure protocol by which Maxwellian Demons extract work from a heat reservoir. Implementing the Demon's control function in a dynamical embodiment, our construction symmetrizes Demon and thermodynamic system, allowing one to explore their functionality and recover the fundamental trade-off between the thermodynamic costs of dissipation due to measurement and due to erasure. The map's degree of chaos---captured by the Kolmogorov-Sinai entropy---is the rate of energy extraction from the heat bath. Moreover, an engine's statistical complexity quantifies the minimum necessary system memory for it to function. In this way, dynamical instability in the control protocol plays an essential and constructive role in intelligent thermodynamic systems.

    Comment: 5 pages, 3 figures, supplementary materials; http://csc.ucdavis.edu/~cmg/compmech/pubs/dds.ht
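
    The abstract identifies the map's degree of chaos, its Kolmogorov-Sinai entropy, with the rate of energy extraction from the heat bath. For a one-dimensional map the KS entropy equals the positive Lyapunov exponent, which can be estimated by time-averaging log|f'(x)| along an orbit. The Szilard Map itself is not reproduced in this listing, so as a stand-in the sketch below (function name `lyapunov_logistic` is our own) uses the fully chaotic logistic map f(x) = 4x(1-x), whose exact exponent is ln 2:

```python
import math

def lyapunov_logistic(x0=0.1234, n_transient=1000, n_iter=200_000):
    """Estimate the Lyapunov exponent of f(x) = 4x(1-x) by
    time-averaging log|f'(x)| = log|4(1 - 2x)| along an orbit.
    For a one-dimensional map this equals the Kolmogorov-Sinai
    entropy rate (in nats per iteration)."""
    x = x0
    for _ in range(n_transient):          # discard the transient
        x = 4.0 * x * (1.0 - x)
    total = 0.0
    for _ in range(n_iter):
        total += math.log(abs(4.0 * (1.0 - 2.0 * x)))
        x = 4.0 * x * (1.0 - x)
    return total / n_iter

# The exact value for this map is ln 2, roughly 0.693 nats/iteration.
```

    In the paper's framing, such a positive exponent would set the per-step rate at which the engine can extract energy; the logistic map here only illustrates the numerical estimate, not the Szilard Map's protocol.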

    Memoryless Thermodynamics? A Reply

    We reply to arXiv:1508.00203 `Comment on "Identifying Functional Thermodynamics in Autonomous Maxwellian Ratchets" (arXiv:1507.01537v2)'.

    Comment: 4 pages; http://csc.ucdavis.edu/~cmg/compmech/pubs/MerhavReply.ht

    Correlation-powered Information Engines and the Thermodynamics of Self-Correction

    Information engines can use structured environments as a resource to generate work by randomizing ordered inputs and leveraging the increased Shannon entropy to transfer energy from a thermal reservoir to a work reservoir. We give a broadly applicable expression for the work production of an information engine, generally modeled as a memoryful channel that communicates inputs to outputs as it interacts with an evolving environment. The expression establishes that an information engine must have more than one memory state in order to leverage input environment correlations. To emphasize this functioning, we designed an information engine powered solely by temporal correlations and not by statistical biases, as employed by previous engines. Key to this is the engine's ability to synchronize---the engine automatically returns to a desired dynamical phase when thrown into an unwanted, dissipative phase by corruptions in the input---that is, by unanticipated environmental fluctuations. This self-correcting mechanism is robust up to a critical level of corruption, beyond which the system fails to act as an engine. We give explicit analytical expressions for both work and critical corruption level and summarize engine performance via a thermodynamic-function phase diagram over engine control parameters. The results reveal a new thermodynamic mechanism based on nonergodicity that underlies error correction as it operates to support resilient engineered and biological systems.

    Comment: 22 pages, 13 figures; http://csc.ucdavis.edu/~cmg/compmech/pubs/tos.ht
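
    The engine above generates work by raising the Shannon entropy of its input tape. The paper's general work expression is not given in the abstract; a hedged back-of-envelope version for the memoryless, uncorrelated-input special case bounds the work per symbol by k_B T ln 2 times the entropy gain in bits (the function names below are our own, not the paper's):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def shannon_entropy(p):
    """Binary Shannon entropy in bits of a bit with bias p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def max_work_per_symbol(p_in, p_out, temperature=300.0):
    """Upper bound (J/symbol) on extractable work when an engine maps
    i.i.d. input bits of bias p_in to output bias p_out:
        W <= k_B * T * ln(2) * (H(p_out) - H(p_in)).
    This ignores inter-symbol correlations, which the paper shows can
    themselves serve as fuel for a multi-memory-state engine."""
    dH = shannon_entropy(p_out) - shannon_entropy(p_in)
    return K_B * temperature * math.log(2) * dH

# Randomizing a 90%-zeros tape into fair coin flips yields a
# positive bound; the reverse (writing order) costs work.
w = max_work_per_symbol(0.9, 0.5)
```

    The paper's point is precisely that this single-symbol bookkeeping misses temporal correlations; an engine with only one memory state is stuck with it, while a multi-state engine can do better.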

    Accurate calculation of the solutions to the Thomas-Fermi equations

    We obtain highly accurate solutions to the Thomas-Fermi equations for atoms and atoms in very strong magnetic fields. We apply the Padé-Hankel method, numerical integration, power series with Padé and Hermite-Padé approximants, and Chebyshev polynomials. Both the slope at the origin and the location of the right boundary in the magnetic-field case are given with unprecedented accuracy.

    Identifying Functional Thermodynamics in Autonomous Maxwellian Ratchets

    We introduce a family of Maxwellian Demons for which correlations among information bearing degrees of freedom can be calculated exactly and in compact analytical form. This allows one to precisely determine Demon functional thermodynamic operating regimes, when previous methods either misclassify or simply fail due to approximations they invoke. This reveals that these Demons are more functional than previous candidates. They too behave either as engines, lifting a mass against gravity by extracting energy from a single heat reservoir, or as Landauer erasers, consuming external work to remove information from a sequence of binary symbols by decreasing their individual uncertainty. Going beyond these, our Demon exhibits a new functionality that erases bits not by simply decreasing individual-symbol uncertainty, but by increasing inter-bit correlations (that is, by adding temporal order) while increasing single-symbol uncertainty. In all cases, but especially in the new erasure regime, exactly accounting for informational correlations leads to tight bounds on Demon performance, expressed as a refined Second Law of Thermodynamics that relies on the Kolmogorov-Sinai entropy for dynamical processes and not on changes purely in system configurational entropy, as previously employed. We rigorously derive the refined Second Law under minimal assumptions and so it applies quite broadly---for Demons with and without memory and input sequences that are correlated or not. We note that general Maxwellian Demons readily violate previously proposed, alternative such bounds, while the current bound still holds.

    Comment: 13 pages, 9 figures, http://csc.ucdavis.edu/~cmg/compmech/pubs/mrd.ht
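
    The refined Second Law above rests on the Kolmogorov-Sinai (Shannon) entropy rate of the symbol process rather than on single-symbol configurational entropy. For a Markov input process the entropy rate has a closed form, h = -Σ_i π_i Σ_j T_ij log T_ij, which makes the gap between the two quantities easy to see. A minimal sketch (the function name is ours, and a two-state chain is chosen only for concreteness):

```python
import math

def entropy_rate(T):
    """Shannon entropy rate (bits/symbol) of a 2-state Markov chain
    with transition matrix T[i][j] = Pr(next = j | current = i).
    For a Markov process this equals the Kolmogorov-Sinai entropy
    used in the refined Second Law, unlike the single-symbol
    (configurational) entropy, which ignores correlations."""
    # stationary distribution of a 2-state chain, in closed form
    p01, p10 = T[0][1], T[1][0]
    pi0 = p10 / (p01 + p10)
    pi = (pi0, 1.0 - pi0)
    h = 0.0
    for i in range(2):
        for j in range(2):
            if T[i][j] > 0.0:
                h -= pi[i] * T[i][j] * math.log2(T[i][j])
    return h

# A strongly correlated bit sequence that flips with probability 0.1:
h_rate = entropy_rate([[0.9, 0.1], [0.1, 0.9]])
# The single-symbol entropy here is 1 bit, so (1 - h_rate) bits/symbol
# of temporal structure is invisible to a configurational-entropy bound.
```

    A bound built on the single-symbol entropy would overstate the randomness of such an input, which is one way a Demon can appear to violate it while the KS-entropy bound still holds.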

    Bubble Growth in Superfluid 3-He: The Dynamics of the Curved A-B Interface

    We study the hydrodynamics of the A-B interface with finite curvature. The interface tension is shown to enhance both the transition velocity and the amplitudes of second sound. In addition, the magnetic signals emitted by the growing bubble are calculated, and the interaction between many growing bubbles is considered.

    Comment: 20 pages, 3 figures, LaTeX, ITP-UH 11/9

    Above and Beyond the Landauer Bound: Thermodynamics of Modularity

    Information processing typically occurs via the composition of modular units, such as universal logic gates. The benefit of modular information processing, in contrast to globally integrated information processing, is that complex global computations are more easily and flexibly implemented via a series of simpler, localized information processing operations which only control and change local degrees of freedom. We show that, despite these benefits, there are unavoidable thermodynamic costs to modularity---costs that arise directly from the operation of localized processing and that go beyond Landauer's dissipation bound for erasing information. Integrated computations can achieve Landauer's bound, however, when they globally coordinate the control of all of an information reservoir's degrees of freedom. Unfortunately, global correlations among the information-bearing degrees of freedom are easily lost by modular implementations. This is costly since such correlations are a thermodynamic fuel. We quantify the minimum irretrievable dissipation of modular computations in terms of the difference between the change in global nonequilibrium free energy, which captures these global correlations, and the local (marginal) change in nonequilibrium free energy, which bounds modular work production. This modularity dissipation is proportional to the amount of additional work required to perform the computational task modularly. It has immediate consequences for physically embedded transducers, known as information ratchets. We show how to circumvent modularity dissipation by designing internal ratchet states that capture the global correlations and patterns in the ratchet's information reservoir. Designed in this way, information ratchets match the optimum thermodynamic efficiency of globally integrated computations.

    Comment: 17 pages, 9 figures; http://csc.ucdavis.edu/~cmg/compmech/pubs/idolip.ht
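
    The abstract quantifies modularity dissipation as the gap between global and marginal nonequilibrium free-energy changes. For the simplest case of erasing a pair of correlated bits one at a time, a hedged back-of-envelope reading of that gap is k_B T ln 2 times the mutual information the marginal (modular) eraser cannot see; this is our illustration under that assumption, not the paper's exact formula:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy_bits(dist):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in dist if p > 0.0)

def modularity_dissipation(joint, temperature=300.0):
    """Lower bound (in joules) on the extra dissipation from erasing
    two correlated bits one at a time instead of jointly:
        k_B * T * ln(2) * I(X;Y),
    the mutual information that a modular eraser, acting only on
    marginals, irretrievably discards. Back-of-envelope version of
    the paper's global-vs-marginal free-energy difference."""
    px = [joint[0][0] + joint[0][1], joint[1][0] + joint[1][1]]
    py = [joint[0][0] + joint[1][0], joint[0][1] + joint[1][1]]
    flat = [p for row in joint for p in row]
    mutual_info = entropy_bits(px) + entropy_bits(py) - entropy_bits(flat)
    return K_B * temperature * math.log(2) * mutual_info

# Perfectly correlated fair bits: I(X;Y) = 1 bit is lost to modularity,
# one full Landauer unit of extra dissipation per pair.
cost = modularity_dissipation([[0.5, 0.0], [0.0, 0.5]])
```

    For independent bits the mutual information vanishes and modular erasure is as good as global erasure, matching the abstract's claim that the cost is tied specifically to lost global correlations.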