Integral Fluctuation Relations for Entropy Production at Stopping Times
A stopping time is the first time at which a trajectory of a stochastic
process satisfies a specific criterion. In this paper, we use martingale theory
to derive the integral fluctuation relation for the stochastic entropy production $S_{\rm tot}$ in a
stationary physical system at stochastic stopping times $\mathcal{T}$. This fluctuation
relation implies the law $\langle S_{\rm tot}(\mathcal{T})\rangle \geq 0$, which states
that it is not possible to reduce entropy on average, even by stopping a
stochastic process at a stopping time, and which we call the second law of
thermodynamics at stopping times. This law implies bounds on the average amount
of heat and work a system can extract from its environment when stopped at a
random time. Furthermore, the integral fluctuation relation implies that
certain fluctuations of entropy production are universal or are bounded by
universal functions. These universal properties descend from the integral
fluctuation relation by selecting appropriate stopping times: for example, when
$\mathcal{T}$ is a first-passage time for entropy production, we obtain a bound on
the statistics of negative records of entropy production. We illustrate these
results on simple models of nonequilibrium systems described by Langevin
equations and reveal two interesting phenomena. First, we demonstrate that
isothermal mesoscopic systems can, on average, extract heat from their
environment when stopped at a cleverly chosen moment, and that the second law at
stopping times provides a bound on the average extracted heat. Second, we
demonstrate that the average efficiency at stopping times of an autonomous
stochastic heat engine, such as Feynman's ratchet, can be larger than the
Carnot efficiency, and that the second law of thermodynamics at stopping times
provides a bound on this average efficiency.

Comment: 37 pages, 6 figures
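The relation can be checked numerically. Below is a minimal sketch (ours, not the paper's code) for a drifted Brownian particle, where the entropy production is $S_{\rm tot} = (v/D)(x_t - x_0)$ and the stopping time is the first exit from an interval, capped at a finite horizon; all parameter values and the stopping criterion are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
v, D, dt = 1.0, 1.0, 1e-3          # drift, diffusion constant, time step (k_B = 1)
n_traj, n_steps = 20000, 4000      # ensemble size; horizon t_max = n_steps * dt
L = 1.0                            # stop when |x| first reaches L (illustrative criterion)

# Overdamped dynamics dx = v dt + sqrt(2 D) dW; in the steady state the total
# entropy production along a trajectory is S_tot = (v / D) * (x_t - x_0).
x = np.zeros(n_traj)
stopped = np.zeros(n_traj, dtype=bool)
s_stop = np.zeros(n_traj)
for _ in range(n_steps):
    act = ~stopped
    x[act] += v * dt + np.sqrt(2 * D * dt) * rng.standard_normal(act.sum())
    hit = act & (np.abs(x) >= L)
    s_stop[hit] = (v / D) * x[hit]
    stopped |= hit
s_stop[~stopped] = (v / D) * x[~stopped]   # capped stopping time: min(T_exit, t_max)

print(np.mean(np.exp(-s_stop)))  # integral fluctuation relation: close to 1
print(np.mean(s_stop))           # second law at stopping times: non-negative
```

Because $e^{-S_{\rm tot}}$ is a martingale for this process, the optional stopping theorem is what forces the first average to 1 even at the trajectory-dependent exit time.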
Measuring information-transfer delays
In complex networks such as gene networks, traffic systems or brain circuits it is important to understand how long it takes for the different parts of the network to effectively influence one another. In the brain, for example, axonal delays between brain areas can amount to several tens of milliseconds, adding an intrinsic component to any timing-based processing of information. Inferring neural interaction delays is thus needed to interpret the information transfer revealed by any analysis of directed interactions across brain structures. However, a robust estimation of interaction delays from neural activity faces several challenges if modeling assumptions on interaction mechanisms are wrong or cannot be made. Here, we propose a robust estimator for neuronal interaction delays rooted in an information-theoretic framework, which allows a model-free exploration of interactions. In particular, we extend transfer entropy to account for delayed source-target interactions, while crucially retaining the conditioning on the embedded target state at the immediately previous time step. We prove that this particular extension is indeed guaranteed to identify interaction delays between two coupled systems and is the only relevant option in keeping with Wiener’s principle of causality. We demonstrate the performance of our approach in detecting interaction delays on finite data by numerical simulations of stochastic and deterministic processes, as well as on local field potential recordings. We also show the ability of the extended transfer entropy to detect the presence of multiple delays, as well as feedback loops. While evaluated on neuroscience data, we expect the estimator to be useful in other fields dealing with network dynamics.
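The core idea can be sketched in a few lines (our toy implementation, not the authors' toolbox): scan a candidate delay $u$ in the delayed transfer entropy $TE_{X\to Y}(u) = I(Y_t; X_{t-u} \mid Y_{t-1})$ and take the maximizing $u$ as the delay estimate. Binary toy data and plug-in probability estimates; all names and parameters are illustrative.

```python
import numpy as np
from collections import Counter

def cmi(a, b, c):
    """Plug-in conditional mutual information I(A;B|C), in bits, for discrete arrays."""
    n = len(a)
    pabc, pac, pbc, pc = Counter(zip(a, b, c)), Counter(zip(a, c)), Counter(zip(b, c)), Counter(c)
    return sum((k / n) * np.log2(k * pc[ci] / (pac[(ai, ci)] * pbc[(bi, ci)]))
               for (ai, bi, ci), k in pabc.items())

def delayed_te(x, y, u, max_u):
    """TE_{X->Y}(u) = I(y_t ; x_{t-u} | y_{t-1}): delayed source state,
    still conditioned on the target's immediately preceding state."""
    t = np.arange(max_u, len(y))
    return cmi(y[t], x[t - u], y[t - 1])

# Toy system: Y copies X with a 3-step delay and 10% flip noise.
rng = np.random.default_rng(0)
n, true_delay = 20000, 3
x = rng.integers(0, 2, n)
y = np.roll(x, true_delay) ^ (rng.random(n) < 0.1)
y[:true_delay] = rng.integers(0, 2, true_delay)

te = {u: delayed_te(x, y, u, max_u=6) for u in range(1, 7)}
best_u = max(te, key=te.get)
print(best_u)   # recovers the interaction delay: 3
```

Real-valued signals would additionally need state-space embedding and a continuous estimator (e.g. nearest-neighbor based), but the scan-and-argmax structure is the same.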
The Value of Information for Populations in Varying Environments
The notion of information pervades informal descriptions of biological
systems, but formal treatments face the problem of defining a quantitative
measure of information rooted in a concept of fitness, which is itself an
elusive notion. Here, we present a model of population dynamics where this
problem is amenable to a mathematical analysis. In the limit where any
information about future environmental variations is common to the members of
the population, our model is equivalent to known models of financial
investment. In this case, the population can be interpreted as a portfolio of
financial assets and previous analyses have shown that a key quantity of
Shannon's communication theory, the mutual information, sets a fundamental
limit on the value of information. We show that this bound can be violated when
accounting for features that are irrelevant in finance but inherent to
biological systems, such as the stochasticity present at the individual level.
This leads us to generalize the measures of uncertainty and information usually
encountered in information theory.
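In the common-information limit the abstract refers to, the classical Kelly result makes the bound concrete: with log-optimal proportional betting, the growth-rate gain from a side-information channel equals the mutual information $I(S;Y)$. A small numerical sketch (ours) for a two-state environment with even odds and a noisy binary cue; all numbers are illustrative.

```python
import numpy as np

q = 0.2                                   # cue error probability
joint = np.array([[(1 - q) / 2, q / 2],   # p(s, y): equiprobable state s,
                  [q / 2, (1 - q) / 2]])  # cue y flips s with probability q
p_s = joint.sum(axis=1)
p_y = joint.sum(axis=0)

# Log-optimal growth rate at even (2-for-1) odds: bet fractions b(s|y) = p(s|y).
g_info = 1 + sum(joint[s, y] * np.log2(joint[s, y] / p_y[y])
                 for s in range(2) for y in range(2))        # = 1 - H(S|Y)
g_blind = 1 + sum(p_s[s] * np.log2(p_s[s]) for s in range(2))  # = 1 - H(S) = 0

mi = sum(joint[s, y] * np.log2(joint[s, y] / (p_s[s] * p_y[y]))
         for s in range(2) for y in range(2))

print(g_info - g_blind, mi)   # growth-rate gain equals I(S;Y)
```

It is precisely this equality that the abstract reports can fail once individual-level stochasticity is introduced, motivating the generalized measures.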
The Zeroth Law of Thermodynamics and Volume-Preserving Conservative Dynamics with Equilibrium Stochastic Damping
We propose a mathematical formulation of the zeroth law of thermodynamics and
develop a stochastic dynamical theory, with a consistent irreversible
thermodynamics, for systems possessing sustained conservative stationary
current in phase space while in equilibrium with a heat bath. The theory
generalizes underdamped mechanical equilibrium, combining phase-volume-preserving
conservative dynamics with equilibrium stochastic damping. The zeroth law implies
a stationary distribution of Boltzmann-Gibbs form. We find an orthogonality
relation as a hallmark of the system. A stochastic thermodynamics based on time
reversal is formulated, identifying an entropy production, a generalized "heat",
an "internal energy", and a "free energy" that never increases; the entropy
follows a balance equation relating these quantities. Our formulation is shown to
be consistent with an earlier theory of P. Ao. Its contradistinctions to other
theories (potential-flux decomposition, stochastic Hamiltonian systems with even
and odd variables, the Klein-Kramers equation, Freidlin-Wentzell theory, and
GENERIC) are discussed.

Comment: 25 pages
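The classical special case that the theory generalizes is easy to check numerically: underdamped Langevin dynamics combines a phase-volume-preserving Hamiltonian flow with stochastic damping obeying fluctuation-dissipation, and relaxes to the Gibbs distribution. A sketch (ours, not the paper's formalism) for a harmonic oscillator, with illustrative parameters and units $m = k_B = 1$:

```python
import numpy as np

rng = np.random.default_rng(0)
gamma, T, k, dt = 1.0, 1.0, 1.0, 5e-3   # damping, temperature, spring constant, step
n_part, n_steps = 5000, 8000            # ensemble size; total time = 40 relaxation units

x = np.zeros(n_part)
v = np.zeros(n_part)
for _ in range(n_steps):
    # volume-preserving (Hamiltonian) part plus equilibrium stochastic damping
    force = -k * x - gamma * v
    x += v * dt
    v += force * dt + np.sqrt(2 * gamma * T * dt) * rng.standard_normal(n_part)

# Gibbs equilibrium predicts equipartition: <v^2> = T and <x^2> = T / k
print(np.mean(v**2), np.mean(x**2))
```

The paper's point is that the same zeroth-law structure survives when the conservative part is a general volume-preserving flow rather than a Hamiltonian one.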
A sub-resolution multiphase interstellar medium model of star formation and SNe energy feedback
We present a new multi-phase sub-resolution model for star formation and
feedback in SPH numerical simulations of galaxy formation. Our model, called
MUPPI (MUlti-Phase Particle Integrator), describes each gas particle as a
multi-phase system, with cold and hot gas phases, coexisting in pressure
equilibrium, and a stellar component. Cooling of the hot tenuous gas phase
feeds the cold gas phase. Stars are formed out of molecular gas with a given
efficiency, which scales with the dynamical time of the cold phase. Our
prescription for star formation is not based on imposing the Schmidt-Kennicutt
relation, which is instead naturally produced by MUPPI. Energy from supernova
explosions is deposited partly into the hot phase of the gas particles, and
partly into that of neighboring particles. Mass and energy flows among the
different phases of each particle are described by a set of ordinary
differential equations which we explicitly integrate for each gas particle,
instead of relying on equilibrium solutions. This system of equations also
includes the response of the multi-phase structure to energy changes associated
with the thermodynamics of the gas. We apply our model to two isolated disk
galaxy simulations and two spherical cooling flows. MUPPI is able to reproduce
the Schmidt-Kennicutt relation for disc galaxies. It also reproduces the basic
properties of the inter-stellar medium in disc galaxies, the surface densities
of cold and molecular gas, of stars and of star formation rate, the vertical
velocity dispersion of cold clouds and the flows connected to the galactic
fountains. Quite remarkably, MUPPI also provides efficient stellar feedback
without the need to include a scheme of kinetic energy feedback. [abridged]

Comment: 23 pages, 26 figures, MNRAS accepted
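The per-particle system of ordinary differential equations can be illustrated with a deliberately simplified toy (our sketch, not MUPPI's actual equations or coefficients): hot gas cools onto a cold phase, cold gas forms stars on a dynamical timescale, and the explicit integration conserves total mass by construction.

```python
import numpy as np

# Toy per-particle flows (arbitrary units; timescales and efficiency are illustrative)
t_cool, t_dyn, eff = 2.0, 1.0, 0.02
m_hot, m_cold, m_star = 1.0, 0.0, 0.0
dt, n_steps = 1e-3, 20000

for _ in range(n_steps):
    cooling = m_hot / t_cool       # hot tenuous phase condenses onto the cold phase
    sfr = eff * m_cold / t_dyn     # star formation scales with the cold-phase dynamical time
    m_hot += -cooling * dt
    m_cold += (cooling - sfr) * dt
    m_star += sfr * dt

total = m_hot + m_cold + m_star
print(total, m_star)   # total mass conserved; stellar mass grows monotonically
```

The real model couples such flows to pressure equilibrium between the phases and to supernova energy injection, which is why it is integrated explicitly per particle rather than solved at equilibrium.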