
    Stochastic time-evolution, information geometry and the Cramer-Rao Bound

    We investigate the connection between the time-evolution of averages of stochastic quantities and the Fisher information and its induced statistical length. As a consequence of the Cramer-Rao bound, we find that the rate of change of the average of any observable is bounded from above by its variance times the temporal Fisher information. From this bound, we obtain a speed limit on the evolution of stochastic observables: changing the average of an observable requires a minimum amount of time, given by the squared change in the average divided by the fluctuations of the observable times the thermodynamic cost of the transformation. In particular, for relaxation dynamics, which do not depend on time explicitly, we show that the Fisher information is a monotonically decreasing function of time and that this minimal required time is determined by the initial preparation of the system. We further show that the monotonicity of the Fisher information can be used to detect hidden variables in the system, and we demonstrate our findings for simple examples of continuous and discrete random processes. Comment: 25 pages, 4 figures
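    To make the quantities above concrete, a Cramér-Rao-type bound of this kind, together with the resulting speed limit, can be written schematically as follows; the notation (observable A, time-dependent distribution p(x,t), temporal Fisher information I_F(t), duration T) is ours, and the exact form used in the paper should be taken from the full text.

    \[
    \left(\frac{\mathrm{d}\langle A \rangle_t}{\mathrm{d}t}\right)^{2} \;\leq\; \operatorname{Var}_t(A)\, I_F(t),
    \qquad
    I_F(t) = \int \mathrm{d}x\; p(x,t)\,\bigl(\partial_t \ln p(x,t)\bigr)^{2},
    \]
    \[
    T \;\gtrsim\; \frac{\bigl(\langle A \rangle_T - \langle A \rangle_0\bigr)^{2}}{\overline{\operatorname{Var}(A)}\;\overline{I_F}},
    \]
    where the overlines denote time averages over [0, T]. In this schematic reading, the time-averaged Fisher information plays the role of the thermodynamic cost mentioned in the abstract, and for relaxation dynamics its monotone decay ties the minimal required time to the initial preparation of the system.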

    Upper bounds on entropy production in diffusive dynamics

    Based on a variational expression for the steady-state entropy production rate in overdamped Langevin dynamics, we derive concrete upper bounds on the entropy production rate in various physical settings. For particles in a thermal environment and driven by non-conservative forces, we show that the entropy production rate can be bounded from above by considering only the statistics of the driven particles. We use this finding to argue that the presence of non-driven, passive degrees of freedom generally leads to decreased dissipation. Another upper bound can be obtained in terms of only the variance of the non-conservative force, which leads to a universal upper bound for particles driven by a constant force applied in a certain region of space. Extending our results to systems attached to multiple heat baths or with spatially varying temperature and/or mobility, we show that the temperature difference between the heat baths, or the temperature gradient, can be used to bound the entropy production rate from above. We show that most of these results extend in a straightforward way to underdamped Langevin dynamics and demonstrate them in three concrete examples. Comment: 16 pages, 4 figures
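    For orientation, the baseline quantity here is the standard steady-state entropy production rate of overdamped Langevin dynamics with mobility \mu and temperature T (in units of k_B); the variance bound mentioned in the abstract then takes, schematically, the form in the second line. The symbols (steady-state density p_s, steady-state current J_s, local mean velocity \nu_s, non-conservative force f_nc) and the prefactor of the schematic bound are our assumptions, not a verbatim statement of the paper's result.

    \[
    \sigma \;=\; \frac{1}{T} \int \mathrm{d}x\; \frac{\nu_{\mathrm{s}}(x)^{2}}{\mu}\, p_{\mathrm{s}}(x),
    \qquad
    \nu_{\mathrm{s}}(x) = \frac{J_{\mathrm{s}}(x)}{p_{\mathrm{s}}(x)},
    \]
    \[
    \sigma \;\lesssim\; \frac{\mu}{T}\, \operatorname{Var}_{p_{\mathrm{s}}}\!\bigl(f_{\mathrm{nc}}\bigr) \qquad \text{(schematic)}.
    \]
    The first line is the standard expression to which the variational formula and the derived bounds refer; the second line only indicates the structure of the universal bound described above, namely a variance of the driving force divided by a thermal energy scale.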

    Thermodynamic constraints on the power spectral density in and out of equilibrium

    The power spectral density of an observable quantifies the amount of fluctuations at a given frequency and can reveal the influence of different timescales on the observable's dynamics. Here, we show that the spectral density in a continuous-time Markov process can be bounded both from below and from above by expressions involving two constants that depend on the observable and the properties of the system. In equilibrium, we identify these constants with the low- and high-frequency limits of the spectral density, respectively; thus, the spectrum at any frequency is bounded by the short- and long-time behavior of the observable. Out of equilibrium, on the other hand, the constants can no longer be identified with the limiting behavior of the spectrum, which allows for peaks that correspond to oscillations in the dynamics. We show that the height of these peaks is related to dissipation, allowing one to infer the degree to which the system is out of equilibrium from the measured spectrum. Comment: 13 pages, 4 figures
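    To fix the notation, the power spectral density of a stationary observable A can be defined via the Wiener-Khinchin relation below; the bound described in the abstract then has the schematic two-sided form in the second line, in which the functions g_± and the constants C_1, C_2 are placeholders for the paper's explicit expressions rather than quantities taken from it.

    \[
    S_A(\omega) \;=\; \int_{-\infty}^{\infty} \mathrm{d}t\; e^{\mathrm{i}\omega t}\, \bigl\langle \delta A(t)\, \delta A(0) \bigr\rangle_{\mathrm{s}},
    \qquad
    \delta A = A - \langle A \rangle_{\mathrm{s}},
    \]
    \[
    g_{-}(\omega; C_1, C_2) \;\leq\; S_A(\omega) \;\leq\; g_{+}(\omega; C_1, C_2).
    \]
    According to the abstract, in equilibrium C_1 and C_2 coincide with the low- and high-frequency limits of S_A(\omega), so the entire spectrum is constrained by the short- and long-time behavior of the observable; out of equilibrium this identification fails, peaks can appear, and their height is related to the dissipation.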