    Thermodynamic cost of external control

    Artificial molecular machines are often driven by the periodic variation of an external parameter. This external control exerts work on the system, part of which can be extracted as output if the system runs against an applied load. Usually, the thermodynamic cost of the process that generates the external control is ignored. Here, we derive a refined second law for such small machines that includes this cost, which is generated, for example, by the free energy consumption of a chemical reaction that modifies the energy landscape of the machine. In the limit of irreversible control, this refined second law reduces to the standard one. Beyond this ideal limiting case, our analysis shows that, due to a new entropic term, unexpected regimes can occur: the control work can be smaller than the extracted work, and the work required to generate the control can be smaller than this control work. Our general inequalities are illustrated by a paradigmatic three-state system.
    Comment: 11 pages, 3 figures
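
    As a concrete illustration of control work in such a setup, here is a minimal simulation sketch (not the paper's model): a three-state Markov jump process whose energy levels are cyclically shifted by an external protocol, with the protocol's work accumulated whenever the energies change while the state stays fixed. The time step, rates, energies, and period are all assumed values.

        import numpy as np

        rng = np.random.default_rng(1)
        dt, steps, period = 1e-3, 30000, 1.0   # assumed discretization and protocol period

        def energies(t):
            # Assumed protocol: cyclically shift which of the three states is lowest.
            shift = int(3 * (t % period) / period)
            return np.roll(np.array([0.0, 1.0, 2.0]), shift)

        def rates(E):
            # Detailed-balance jump rates k[i, j] for i -> j (units of k_B T = 1).
            k = np.exp(-(E[None, :] - E[:, None]) / 2.0)
            np.fill_diagonal(k, 0.0)
            return k

        state, work, E = 0, 0.0, energies(0.0)
        for n in range(1, steps + 1):
            Enew = energies(n * dt)
            work += Enew[state] - E[state]   # control work: energy change at fixed state
            E = Enew
            p = rates(E)[state] * dt         # jump probabilities within one time step
            if rng.random() < p.sum():
                state = rng.choice(3, p=p / p.sum())
        print(f"average control work per period: {work * period / (steps * dt):.3f}")

    The refined second law discussed above would additionally charge the free energy consumed by whatever mechanism generates energies(t); that bookkeeping lies outside this sketch.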

    Coherence of Biochemical Oscillations is Bounded by Driving Force and Network Topology

    Biochemical oscillations are prevalent in living organisms. Systems with a small number of constituents cannot sustain coherent oscillations indefinitely because of fluctuations in the period of oscillation. We show that the number of coherent oscillations, which quantifies the precision of the oscillator, is universally bounded by the thermodynamic force that drives the system out of equilibrium and by the topology of the underlying biochemical network of states. Our results are valid for arbitrary Markov processes, which are commonly used to model biochemical reactions. We apply our results to a model for a single KaiC protein and to an activator-inhibitor model that consists of several molecules. From a mathematical perspective, based on strong numerical evidence, we conjecture a universal constraint relating the imaginary and real parts of the first non-trivial eigenvalue of a stochastic matrix.
    Comment: 12 pages, 13 figures
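
    The conjectured constraint can be probed numerically. The following sketch (illustrative, with assumed size and rates) builds the generator of a uniform unicyclic network and extracts the ratio of imaginary to real part of its first non-trivial eigenvalue, the quantity that controls the number of coherent oscillations.

        import numpy as np

        N, kp, km = 8, 2.0, 0.5   # assumed number of states and forward/backward rates

        # Generator L[i, j] = rate for jump j -> i; columns sum to zero.
        L = np.zeros((N, N))
        for j in range(N):
            L[(j + 1) % N, j] = kp
            L[(j - 1) % N, j] = km
            L[j, j] = -(kp + km)

        ev = np.linalg.eigvals(L)
        # First non-trivial eigenvalue: largest real part excluding the stationary zero mode.
        lam = max((e for e in ev if e.real < -1e-9), key=lambda e: e.real)
        print(f"|Im/Re| = {abs(lam.imag / lam.real):.3f}")

    For this uniform cycle the eigenvalues are known in closed form, and in the fully irreversible limit km -> 0 the ratio tends to cot(pi/N), illustrating how the network topology caps the attainable coherence.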

    Dispersion of the time spent in a state: General expression for unicyclic model and dissipation-less precision

    We compare the relation between dispersion and dissipation for two random variables that can be used to characterize the precision of a Brownian clock. The first random variable is the current between states. In this case, a given precision requires a minimal energetic cost determined by the known thermodynamic uncertainty relation. We introduce a second random variable, a certain linear combination of two random variables, each of which is the time a stochastic trajectory spends in a state. Whereas the first moment of this random variable equals the average probability current, its dispersion generally differs from the dispersion associated with the current. Remarkably, for this second random variable a given precision can be achieved with arbitrarily low energy dissipation, in contrast to the thermodynamic uncertainty relation for the current. As our main technical achievement, we provide an exact expression for the dispersion of the time a stochastic trajectory spends in a cluster of states for a general unicyclic network.
    Comment: 17 pages, 1 figure
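
    The contrast between the two random variables can be checked by simulation. The sketch below (assumed rates and observation time, not the paper's exact construction) runs a Gillespie simulation of a uniform unicyclic network and estimates the scaled dispersion of the empirical current across one link and of a linear combination of the residence times in two neighboring states, chosen so that both variables share the same mean.

        import numpy as np

        rng = np.random.default_rng(2)
        N, kp, km, T, reps = 4, 2.0, 0.5, 200.0, 500   # assumed parameters

        def trajectory():
            t, state, net = 0.0, 0, 0
            tau = np.zeros(N)                        # residence time per state
            while t < T:
                dwell = rng.exponential(1.0 / (kp + km))
                tau[state] += min(dwell, T - t)
                t += dwell
                if t >= T:
                    break
                if rng.random() < kp / (kp + km):    # forward jump
                    net += state == 0                # crossing link 0 -> 1
                    state = (state + 1) % N
                else:                                # backward jump
                    net -= state == 1                # crossing link 1 -> 0
                    state = (state - 1) % N
            return net / T, (kp * tau[0] - km * tau[1]) / T

        data = np.array([trajectory() for _ in range(reps)])
        for name, col in zip(["current", "residence-time combination"], data.T):
            print(f"{name}: mean {col.mean():.3f}, scaled dispersion {T * col.var():.3f}")

    Both estimators have the same mean, the average probability current, while their dispersions differ, which is the effect described in the abstract.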

    Skewness and Kurtosis in Statistical Kinetics

    We obtain lower and upper bounds on the skewness and kurtosis of the cycle completion time of unicyclic enzymatic reaction schemes. Analogous to the well-known lower bound on the randomness parameter, the lower bounds on skewness and kurtosis are related to the number of intermediate states in the underlying chemical reaction network. Our results demonstrate that evaluating these higher-order moments with single-molecule data can yield information about the enzymatic scheme that is not contained in the randomness parameter.
    Comment: 5+3 pages, 4 figures
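
    In the fully irreversible limit, where each of the N steps is an exponential waiting time and the cycle completion time is their sum, the relevant cumulants are available in closed form, which makes the bounds easy to inspect. The sketch below uses assumed rates; uniform rates (the Erlang case) give skewness 2/sqrt(N) and excess kurtosis 6/N, the minimal values in this limit.

        import numpy as np

        rates = np.array([1.0, 2.0, 4.0, 8.0])   # assumed rates of N irreversible steps
        N = len(rates)
        x = 1.0 / rates                           # mean waiting time of each step

        # Cumulants of a sum of independent exponentials: kappa_n = (n-1)! * sum(x**n)
        var = np.sum(x**2)
        skewness = 2 * np.sum(x**3) / var**1.5
        excess_kurtosis = 6 * np.sum(x**4) / var**2

        print(f"skewness {skewness:.3f} (uniform-rate minimum {2 / np.sqrt(N):.3f})")
        print(f"excess kurtosis {excess_kurtosis:.3f} (uniform-rate minimum {6 / N:.3f})")

    With reversible steps, as in the general schemes treated above, the completion-time statistics no longer reduce to a sum of independent exponentials and the bounds require the full analysis.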

    Sensory capacity: an information theoretical measure of the performance of a sensor

    For a general sensory system following an external stochastic signal, we introduce the sensory capacity. This quantity characterizes the performance of a sensor: the sensory capacity is maximal if the instantaneous state of the sensor carries as much information about the signal as the whole time series of the sensor. We show that adding a memory to the sensor increases the sensory capacity; this increase quantifies the improvement the memory brings to the sensor. Our results are obtained within the framework of stochastic thermodynamics of bipartite systems, which allows for the definition of an efficiency relating the rate at which the sensor learns about the signal to the energy dissipated by the sensor, given by the thermodynamic entropy production. We demonstrate a general tradeoff between sensory capacity and efficiency: if the sensory capacity equals its maximal value 1, the efficiency must be less than 1/2. As a physical realization of a sensor, we consider a two-component cellular network estimating a fluctuating external ligand concentration as the signal. This model leads to coupled linear Langevin equations that allow us to obtain explicit analytical results.
    Comment: 15 pages, 7 figures
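
    The final sentence points to a concrete calculation: for coupled linear Langevin equations the stationary state is Gaussian, so the instantaneous mutual information between sensor and signal follows from the stationary covariance. Below is a minimal sketch under assumed dynamics and parameters; it computes only this instantaneous information, whereas the sensory capacity would also require the information rate of the full sensor time series (e.g., from a filtering calculation), which is omitted here.

        import numpy as np
        from scipy.linalg import solve_continuous_lyapunov

        a, b, Ds, Dx = 1.0, 2.0, 1.0, 0.5   # assumed relaxation rates and noise strengths

        # Assumed dynamics: ds = -a s dt + sqrt(2 Ds) dW_s,  dx = b (s - x) dt + sqrt(2 Dx) dW_x
        A = np.array([[-a, 0.0],
                      [b, -b]])
        D = np.diag([2 * Ds, 2 * Dx])

        # Stationary covariance C solves the Lyapunov equation A C + C A^T + D = 0.
        C = solve_continuous_lyapunov(A, -D)

        rho2 = C[0, 1] ** 2 / (C[0, 0] * C[1, 1])
        print(f"instantaneous mutual information: {-0.5 * np.log(1 - rho2):.3f} nats")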