
    Integral Fluctuation Relations for Entropy Production at Stopping Times

    A stopping time $T$ is the first time when a trajectory of a stochastic process satisfies a specific criterion. In this paper, we use martingale theory to derive the integral fluctuation relation $\langle e^{-S_{\rm tot}(T)}\rangle=1$ for the stochastic entropy production $S_{\rm tot}$ in a stationary physical system at stochastic stopping times $T$. This fluctuation relation implies the law $\langle S_{\rm tot}(T)\rangle\geq 0$, which states that it is not possible to reduce entropy on average, even by stopping a stochastic process at a stopping time, and which we call the second law of thermodynamics at stopping times. This law implies bounds on the average amount of heat and work a system can extract from its environment when stopped at a random time. Furthermore, the integral fluctuation relation implies that certain fluctuations of entropy production are universal or are bounded by universal functions. These universal properties descend from the integral fluctuation relation by selecting appropriate stopping times: for example, when $T$ is a first-passage time for entropy production, we obtain a bound on the statistics of negative records of entropy production. We illustrate these results on simple models of nonequilibrium systems described by Langevin equations and reveal two interesting phenomena. First, we demonstrate that isothermal mesoscopic systems can on average extract heat from their environment when stopped at a cleverly chosen moment, and that the second law at stopping times provides a bound on the average extracted heat. Second, we demonstrate that the average efficiency at stopping times of an autonomous stochastic heat engine, such as Feynman's ratchet, can be larger than the Carnot efficiency, and that the second law of thermodynamics at stopping times provides a bound on this average efficiency. Comment: 37 pages, 6 figures
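The fluctuation relation can be checked numerically on a minimal example that is not taken from the paper: a Brownian particle with constant drift $v$ and diffusion $D$, for which the total entropy production is $S_{\rm tot}(t) = v\,x(t)/D$ (in units of $k_B$). The threshold value and all parameters below are illustrative assumptions; the stopping time is the first threshold crossing of $x$, capped at $t_{\max}$.

```python
# Monte Carlo check of <exp(-S_tot(T))> = 1 at a stopping time, for a drifted
# Brownian particle (toy model, not the paper's examples). S_tot = v*x/D.
import numpy as np

rng = np.random.default_rng(0)
v, D, dt, t_max = 1.0, 1.0, 0.01, 2.0
n_traj, n_steps = 200_000, int(t_max / dt)
threshold = 1.5  # stop when x first exceeds this value (or at t_max)

x = np.zeros(n_traj)
stopped = np.zeros(n_traj, dtype=bool)
s_at_T = np.zeros(n_traj)
for _ in range(n_steps):
    active = ~stopped
    x[active] += v * dt + np.sqrt(2 * D * dt) * rng.standard_normal(active.sum())
    crossed = active & (x >= threshold)
    s_at_T[crossed] = v * x[crossed] / D   # record entropy production at T
    stopped |= crossed
s_at_T[~stopped] = v * x[~stopped] / D     # trajectories stopped by the cap t_max

print(np.mean(np.exp(-s_at_T)))  # close to 1: integral fluctuation relation
print(np.mean(s_at_T))           # non-negative: second law at stopping times
```

Because the Gaussian increments make $e^{-v x/D}$ an exact (discrete-time) martingale, the sample average of $e^{-S_{\rm tot}(T)}$ converges to 1 for this bounded stopping time, while $\langle S_{\rm tot}(T)\rangle$ stays non-negative.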

    Measuring information-transfer delays

    In complex networks such as gene networks, traffic systems or brain circuits it is important to understand how long it takes for the different parts of the network to effectively influence one another. In the brain, for example, axonal delays between brain areas can amount to several tens of milliseconds, adding an intrinsic component to any timing-based processing of information. Inferring neural interaction delays is thus needed to interpret the information transfer revealed by any analysis of directed interactions across brain structures. However, a robust estimation of interaction delays from neural activity faces several challenges if modeling assumptions on interaction mechanisms are wrong or cannot be made. Here, we propose a robust estimator for neuronal interaction delays rooted in an information-theoretic framework, which allows a model-free exploration of interactions. In particular, we extend transfer entropy to account for delayed source-target interactions, while crucially retaining the conditioning on the embedded target state at the immediately previous time step. We prove that this particular extension is indeed guaranteed to identify interaction delays between two coupled systems and is the only relevant option in keeping with Wiener's principle of causality. We demonstrate the performance of our approach in detecting interaction delays on finite data by numerical simulations of stochastic and deterministic processes, as well as on local field potential recordings. We also show the ability of the extended transfer entropy to detect the presence of multiple delays, as well as feedback loops. While evaluated on neuroscience data, we expect the estimator to be useful in other fields dealing with network dynamics.
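The delay-scanning idea can be sketched on synthetic data. The following is a simplified plug-in estimator on binarized signals, an illustrative assumption rather than the authors' estimator: for each candidate delay $u$ it computes $TE_u = I(Y_t ; X_{t-u} \mid Y_{t-1})$, retaining the conditioning on the previous target state, and the maximizing $u$ recovers the true coupling delay.

```python
# Toy delay scan with a histogram (plug-in) conditional-mutual-information
# estimate on binarized signals; illustrative, not the authors' estimator.
import numpy as np

rng = np.random.default_rng(1)
true_delay, n = 3, 50_000
x = rng.standard_normal(n)
y = np.zeros(n)
y[true_delay:] = 0.8 * x[:-true_delay] + 0.4 * rng.standard_normal(n - true_delay)

xb, yb = (x > 0).astype(int), (y > 0).astype(int)  # crude 2-state symbolization

def delayed_te(u):
    """Plug-in estimate of I(y_t ; x_{t-u} | y_{t-1}) from binary histograms."""
    t = np.arange(u if u > 1 else 1, n)
    triples = yb[t] * 4 + yb[t - 1] * 2 + xb[t - u]   # joint state in {0..7}
    p = np.bincount(triples, minlength=8).reshape(2, 2, 2) / len(t)
    te = 0.0
    for yt in range(2):
        for yp in range(2):
            for xs in range(2):
                pj = p[yt, yp, xs]
                if pj > 0:  # p(a,b,c) log [p(a,b,c) p(c) / (p(a,c) p(b,c))]
                    te += pj * np.log2(pj * p[:, yp, :].sum()
                                       / (p[:, yp, xs].sum() * p[yt, yp, :].sum()))
    return te

tes = {u: delayed_te(u) for u in range(1, 7)}
best = max(tes, key=tes.get)
print(best)  # the scan recovers true_delay = 3
```

With the strong unidirectional coupling assumed here, $TE_u$ is near zero at all lags except the true one; real neural data would of course require embedding dimensions larger than one and bias correction.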

    The Value of Information for Populations in Varying Environments

    The notion of information pervades informal descriptions of biological systems, but formal treatments face the problem of defining a quantitative measure of information rooted in a concept of fitness, which is itself an elusive notion. Here, we present a model of population dynamics where this problem is amenable to a mathematical analysis. In the limit where any information about future environmental variations is common to the members of the population, our model is equivalent to known models of financial investment. In this case, the population can be interpreted as a portfolio of financial assets and previous analyses have shown that a key quantity of Shannon's communication theory, the mutual information, sets a fundamental limit on the value of information. We show that this bound can be violated when accounting for features that are irrelevant in finance but inherent to biological systems, such as the stochasticity present at the individual level. This leads us to generalize the measures of uncertainty and information usually encountered in information theory.
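The financial limit the abstract refers to is the classic Kelly horse-race result: with log-optimal proportional betting at fair odds, the growth-rate gain from a side-information cue equals the mutual information between cue and outcome. The two-state environment and noisy cue below are illustrative assumptions, not the paper's model.

```python
# Kelly horse race with two environment states and a noisy cue Y: the increase
# in log-growth rate from betting the posterior equals I(E;Y). Toy illustration
# of the finance-side bound, not the paper's generalized model.
import numpy as np

p = np.array([0.7, 0.3])                          # P(E = e): environment prior
q = 0.1                                           # cue Y = E, flipped w.p. q
joint = np.array([[p[0] * (1 - q), p[0] * q],
                  [p[1] * q, p[1] * (1 - q)]])    # P(E = e, Y = y)

def H(d):
    d = d[d > 0]
    return -np.sum(d * np.log2(d))

# Growth rate W(b) = sum_e P(e) log2(2 * b_e) at fair even odds (payout 2x),
# maximized by betting the (conditional) probabilities themselves.
W0 = np.sum(p * np.log2(2 * p))                   # no side information: bet prior
py = joint.sum(axis=0)                            # P(Y = y)
post = joint / py                                 # P(E = e | Y = y), column per y
W1 = sum(py[y] * np.sum(post[:, y] * np.log2(2 * post[:, y])) for y in (0, 1))

I = H(p) - sum(py[y] * H(post[:, y]) for y in (0, 1))
print(W1 - W0, I)  # growth-rate gain equals the mutual information I(E;Y)
```

The equality $W_1 - W_0 = H(E) - H(E\mid Y) = I(E;Y)$ holds exactly in this common-information limit; the paper's point is that individual-level stochasticity can break this bound.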

    The Zeroth Law of Thermodynamics and Volume-Preserving Conservative Dynamics with Equilibrium Stochastic Damping

    We propose a mathematical formulation of the zeroth law of thermodynamics and develop a stochastic dynamical theory, with a consistent irreversible thermodynamics, for systems possessing a sustained conservative stationary current in phase space while in equilibrium with a heat bath. The theory generalizes underdamped mechanical equilibrium: $dx = g\,dt + \{-D\nabla\phi\,dt + \sqrt{2D}\,dB(t)\}$, with $\nabla\cdot g = 0$ and $\{\cdots\}$ respectively representing phase-volume preserving dynamics and stochastic damping. The zeroth law implies the stationary distribution $u^{ss}(x) = e^{-\phi(x)}$. We find the orthogonality $\nabla\phi\cdot g = 0$ as a hallmark of the system. Stochastic thermodynamics based on the time reversal $(t,\phi,g)\rightarrow(-t,\phi,-g)$ is formulated: entropy production $e_p^{\#}(t) = -dF(t)/dt$; generalized "heat" $h_d^{\#}(t) = -dU(t)/dt$, with $U(t) = \int_{\mathbb{R}^n}\phi(x)u(x,t)\,dx$ being the "internal energy"; and the "free energy" $F(t) = U(t) + \int_{\mathbb{R}^n} u(x,t)\ln u(x,t)\,dx$ never increases. Entropy follows $\frac{dS}{dt} = e_p^{\#} - h_d^{\#}$. Our formulation is shown to be consistent with an earlier theory of P. Ao. Its contradistinctions to other theories (potential-flux decomposition, stochastic Hamiltonian systems with even and odd variables, the Klein-Kramers equation, Freidlin-Wentzell theory, and GENERIC) are discussed. Comment: 25 pages
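A concrete two-dimensional choice satisfying both structural conditions is $\phi = (x^2+y^2)/2$ with the rotation $g = \omega(-y, x)$, for which $\nabla\cdot g = 0$ and $\nabla\phi\cdot g = 0$; this example is ours, not taken from the paper. Sampling the SDE should then reproduce the stationary density $u^{ss} \propto e^{-\phi}$, i.e. unit variance per coordinate, independent of $D$ and $\omega$.

```python
# Euler-Maruyama sampling of dx = g dt - D grad(phi) dt + sqrt(2D) dB with
# phi = |z|^2/2 and divergence-free g = omega*(-y, x) orthogonal to grad(phi).
# Illustrative 2-D example; stationary density is exp(-phi), variance 1 per axis.
import numpy as np

rng = np.random.default_rng(2)
D, omega, dt, n_steps, burn = 0.5, 1.0, 0.01, 400_000, 50_000
z = np.zeros(2)
samples = np.empty((n_steps - burn, 2))
for i in range(n_steps):
    xc, yc = z
    g = omega * np.array([-yc, xc])   # conservative, phase-volume preserving part
    grad_phi = z                      # gradient of phi = |z|^2 / 2
    z = z + (g - D * grad_phi) * dt + np.sqrt(2 * D * dt) * rng.standard_normal(2)
    if i >= burn:
        samples[i - burn] = z
print(samples.var(axis=0))  # both components close to 1, matching exp(-phi)
```

The rotation $g$ drives a sustained stationary current in phase space, yet the one-time statistics remain Boltzmann-like, which is exactly the situation the proposed zeroth-law formulation addresses.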

    A sub-resolution multiphase interstellar medium model of star formation and SNe energy feedback

    We present a new multi-phase sub-resolution model for star formation and feedback in SPH numerical simulations of galaxy formation. Our model, called MUPPI (MUlti-Phase Particle Integrator), describes each gas particle as a multi-phase system, with cold and hot gas phases coexisting in pressure equilibrium, plus a stellar component. Cooling of the hot tenuous gas phase feeds the cold gas phase. Stars are formed out of molecular gas with a given efficiency, which scales with the dynamical time of the cold phase. Our prescription for star formation is not based on imposing the Schmidt-Kennicutt relation, which is instead naturally produced by MUPPI. Energy from supernova explosions is deposited partly into the hot phase of the gas particle itself and partly into that of neighboring particles. Mass and energy flows among the different phases of each particle are described by a set of ordinary differential equations, which we explicitly integrate for each gas particle instead of relying on equilibrium solutions. This system of equations also includes the response of the multi-phase structure to energy changes associated with the thermodynamics of the gas. We apply our model to two isolated disc galaxy simulations and two spherical cooling flows. MUPPI is able to reproduce the Schmidt-Kennicutt relation for disc galaxies. It also reproduces the basic properties of the interstellar medium in disc galaxies: the surface densities of cold and molecular gas, of stars and of star formation rate, the vertical velocity dispersion of cold clouds, and the flows connected to galactic fountains. Quite remarkably, MUPPI also provides efficient stellar feedback without the need to include a scheme of kinetic energy feedback. [abridged] Comment: 23 pages, 26 figures, MNRAS accepted
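The explicit per-particle ODE integration can be sketched with a toy phase-exchange system. All coefficients, timescales and variable names below are illustrative assumptions in the spirit of the description, not the published MUPPI equations: hot gas cools onto the cold phase, cold gas forms stars on a dynamical timescale, and supernova feedback evaporates cold gas back to the hot phase.

```python
# Toy explicit (forward-Euler) integration of a hot/cold/stars mass-exchange
# ODE system; coefficients are illustrative, not the MUPPI model's values.
def integrate(m_hot, m_cold, m_star, t_cool=0.1, t_dyn=0.02,
              f_star=0.02, f_evap=0.1, dt=1e-4, t_end=1.0):
    """Integrate phase-exchange ODEs for one particle's internal state."""
    for _ in range(int(t_end / dt)):
        cool = m_hot / t_cool            # hot -> cold cooling flow
        sfr = f_star * m_cold / t_dyn    # cold -> stars, on the dynamical time
        evap = f_evap * sfr              # SN feedback: cold -> hot evaporation
        m_hot += (-cool + evap) * dt
        m_cold += (cool - sfr - evap) * dt
        m_star += sfr * dt
    return m_hot, m_cold, m_star

m = integrate(1.0, 0.0, 0.0)             # start as a purely hot particle
print(m, sum(m))                         # total mass is conserved by design
```

Explicit integration of such stiff-ish exchange terms, rather than assuming the phases are in equilibrium, is the design point the abstract emphasizes; a production code would use an adaptive stepper rather than fixed-step Euler.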