Stochastic thermodynamics of computation
One of the major resource requirements of computers - ranging from biological
cells to human brains to high-performance (engineered) computers - is the
energy used to run them. These costs of performing a computation have long been
a focus of research in physics, going back to the early work of Landauer. One
of the most prominent aspects of computers is that they are inherently
nonequilibrium systems. However, the early research was done when
nonequilibrium statistical physics was in its infancy, which meant the work was
formulated in terms of equilibrium statistical physics. Since then there have
been major breakthroughs in nonequilibrium statistical physics, which are
allowing us to investigate the myriad aspects of the relationship between
statistical physics and computation, extending well beyond the issue of how
much work is required to erase a bit. In this paper I review some of this
recent work on the "stochastic thermodynamics of computation". After reviewing
the salient parts of information theory, computer science theory, and
stochastic thermodynamics, I summarize what has been learned about the entropic
costs of performing a broad range of computations, extending from bit erasure
to loop-free circuits to logically reversible circuits to information ratchets
to Turing machines. These results reveal new, challenging engineering problems
for how to design computers to have minimal thermodynamic costs. They also
allow us to start to combine computer science theory and stochastic
thermodynamics at a foundational level, thereby expanding both.
Comment: 111 pages, no figures. arXiv admin note: text overlap with arXiv:1901.0038
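For concreteness, the bit-erasure result that anchors this line of work is
Landauer's bound, together with its standard generalization to an arbitrary
change of the distribution over the computer's states (a sketch in conventional
notation, not taken from the paper):
\[
Q \ge k_B T \ln 2 \quad \text{(erasing one uniformly random bit)},
\qquad
Q \ge k_B T \left[ S(p_{\mathrm{init}}) - S(p_{\mathrm{final}}) \right],
\]
where Q is the heat released to a reservoir at temperature T, S is the Shannon
entropy (in nats), and p_init and p_final are the distributions over the
computer's states before and after the computation.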
Information-theoretic bound on the energy cost of stochastic simulation
Physical systems are often simulated using a stochastic computation where
different final states result from identical initial states. Here, we derive
the minimum energy cost of simulating a complex data set of a general physical
system with a stochastic computation. We show that the cost is proportional to
the difference between two information-theoretic measures of complexity of the
data - the statistical complexity and the predictive information. We show that
this difference equals the amount of information erased during the computation.
Finally,
we illustrate the physics of information by implementing the stochastic
computation as a Gedankenexperiment of a Szilard-type engine. The results
create a new link between thermodynamics, information theory, and complexity.
Comment: 5 pages, 1 figure
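Written in the usual symbols for these quantities (my notation; the abstract
only names them), the bound plausibly reads
\[
W_{\min} = k_B T \ln 2 \, \left( C_\mu - E \right),
\]
with C_\mu the statistical complexity of the data set and E its predictive
information (excess entropy); the difference C_\mu - E is the information the
simulator must store but never outputs, i.e. the information erased during the
computation.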
The free energy requirements of biological organisms; implications for evolution
Recent advances in nonequilibrium statistical physics have provided
unprecedented insight into the thermodynamics of dynamic processes. The author
recently used these advances to extend Landauer's semi-formal reasoning
concerning the thermodynamics of bit erasure to derive the minimal free energy
required to implement an arbitrary computation. Here, I extend this analysis,
deriving the minimal free energy required by an organism to run a given
(stochastic) map π from its sensor inputs to its actuator outputs. I use this
result to calculate the input-output map π of an organism that optimally trades
off the free energy needed to run π with the phenotypic fitness that results
from implementing π. I end with a general discussion
of the limits imposed on the rate of the terrestrial biosphere's information
processing by the flux of sunlight on the Earth.
Comment: 19 pages, 0 figures, presented at the 2015 NIMBioS workshop on
"Information and entropy in biological systems".
Modeling of biomolecular machines in non-equilibrium steady states
Numerical computations have become a pillar of all modern quantitative
sciences. Any computation involves modeling, even if this step is often left
implicit, and any model has to neglect details while still being
physically accurate. Equilibrium statistical mechanics guides both the
development of models and numerical methods for dynamics obeying detailed
balance. For systems driven away from thermal equilibrium such a universal
theoretical framework is missing. For a restricted class of driven systems
governed by Markov dynamics and local detailed balance, stochastic
thermodynamics has evolved to fill this gap and to provide fundamental
constraints and guiding principles. The next step is to advance stochastic
thermodynamics from simple model systems to complex systems with tens of
thousands or even millions of degrees of freedom. Biomolecules operating in the
presence of chemical gradients and mechanical forces are a prime example of this
challenge. In this Perspective, we give an introduction to isothermal
stochastic thermodynamics geared towards the systematic multiscale modeling of
the conformational dynamics of biomolecular and synthetic machines, and we
outline some of the open challenges.
Comment: Comments are welcome
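For orientation, the two ingredients named above take a standard form for
Markov jump dynamics with rates w_{ij} from state j to state i (generic
textbook expressions, not this Perspective's specific models). Local detailed
balance ties the rate asymmetry to the entropy change Δs^res_{ij} in the
reservoir, and the total entropy production rate is non-negative:
\[
\ln \frac{w_{ij}}{w_{ji}} = \frac{\Delta s^{\mathrm{res}}_{ij}}{k_B},
\qquad
\dot{\Sigma} = \frac{k_B}{2} \sum_{i \neq j}
\left( w_{ij} p_j - w_{ji} p_i \right)
\ln \frac{w_{ij} p_j}{w_{ji} p_i} \ge 0,
\]
with \dot{\Sigma} vanishing only when detailed balance holds, i.e. in thermal
equilibrium.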
Efficient Quantum Work Reservoirs at the Nanoscale
When reformulated as a resource theory, thermodynamics can analyze system
behaviors in the single-shot regime. There, the work required to implement
state transitions is bounded by α-Rényi divergences, so the operations it
identifies as efficient differ from those of stochastic thermodynamics. Thus, a
detailed understanding of the difference between stochastic thermodynamics and
resource-theoretic thermodynamics is needed. To this end, we study
reversibility in the single-shot regime, generalizing the two-level work
reservoirs used there to multi-level work reservoirs, which achieve
reversibility for any transition in the single-shot regime. Building on this, we
systematically explore multi-level work reservoirs in the nondissipation regime
with and without catalysts. The resource-theoretic results show that two-level
work reservoirs undershoot Landauer's bound, misleadingly implying energy
dissipation during computation. In contrast, we demonstrate that multi-level
work reservoirs achieve Landauer's bound and produce zero entropy.
Comment: 17 pages, 5 figures, 6 tables;
https://csc.ucdavis.edu/~cmg/compmech/pubs/eqwratn.ht
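For reference, the α-Rényi divergences invoked here are, in standard notation
(not taken from the paper),
\[
D_\alpha(\rho \,\|\, \sigma) = \frac{1}{\alpha - 1}
\ln \mathrm{Tr} \left( \rho^{\alpha} \sigma^{1-\alpha} \right),
\]
and, at least for states diagonal in the energy eigenbasis, the single-shot
second laws demand that the whole family of free energies
F_\alpha(\rho) = k_B T D_\alpha(\rho \| \gamma), with γ the thermal state, be
non-increasing under a transition, rather than just the single average free
energy that governs stochastic thermodynamics.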
Thermodynamics of stochastic Turing machines
In analogy to Brownian computers, we explicitly show how to construct
stochastic models that mimic the behaviour of a general-purpose computer (a
Turing machine). Our models are discrete state systems obeying a Markovian
master equation, which are logically reversible and have a well-defined and
consistent thermodynamic interpretation. The resulting master equation, which
describes a simple one-step process on an enormously large state space, allows
us to thoroughly investigate the thermodynamics of computation for this
situation. In particular, in the stationary regime the master equation is well
approximated by a simple one-dimensional Fokker-Planck equation. We then
show that the entropy production rate at steady state can be made arbitrarily
small, but the total (integrated) entropy production is finite and grows
logarithmically with the number of computational steps.
Comment: 13 pages incl. appendix, 3 figures and 1 table, slightly changed
version as published in PR
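For orientation, a one-step (birth-death) master equation over the number of
completed computational steps n, and its one-dimensional Fokker-Planck
approximation, have the generic forms (generic forms only, not the paper's
specific rates):
\[
\dot{p}_n = w^{+}_{n-1} p_{n-1} + w^{-}_{n+1} p_{n+1}
- \left( w^{+}_{n} + w^{-}_{n} \right) p_n,
\qquad
\partial_t p(x,t) = -\partial_x \left[ v(x)\, p \right]
+ \partial_x^2 \left[ D(x)\, p \right],
\]
with drift and diffusion inherited from the hopping rates, v ≈ w⁺ − w⁻ and
D ≈ (w⁺ + w⁻)/2 when the rates vary slowly with n.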
Second law, entropy production, and reversibility in thermodynamics of information
We present a pedagogical review of the fundamental concepts in thermodynamics
of information, by focusing on the second law of thermodynamics and the entropy
production. In particular, we discuss the relationship among thermodynamic
reversibility, logical reversibility, and heat emission in the context of the
Landauer principle, and clarify that these three concepts are fundamentally
distinct from one another. We also discuss thermodynamics of measurement and
feedback control by Maxwell's demon. We clarify that the demon and the second
law are indeed consistent in the measurement and the feedback processes
individually, once the mutual information is included in the entropy production.
Comment: 43 pages, 10 figures. As a chapter of: G. Snider et al. (eds.),
"Energy Limits in Computation: A Review of Landauer's Principle, Theory and
Experiments".
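The mutual-information bookkeeping in the last sentence takes the standard
Sagawa-Ueda form (standard notation, not quoted from the chapter). With
feedback based on a measurement that acquires mutual information I about the
system,
\[
W_{\mathrm{ext}} \le -\Delta F + k_B T\, I,
\qquad
W_{\mathrm{meas}} + W_{\mathrm{eras}} \ge k_B T\, I,
\]
so the demon can extract up to k_B T I of extra work during feedback, but
acquiring and then resetting its memory costs at least as much, and the
conventional second law is recovered over the full cycle.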