    Sensitivity function and entropy increase rates for z-logistic map family at the edge of chaos

    It is well known that, for chaotic systems, the production of the relevant entropy (Boltzmann-Gibbs) is linear in time and the system has strong (exponential) sensitivity to initial conditions. In recent years, various numerical results have indicated that essentially the same type of behavior emerges at the edge of chaos if a specific generalization of the entropy and of the exponential are used. In this work, we contribute to this scenario by numerically analysing some generalized nonextensive entropies and their related exponential definitions using the $z$-logistic map family. We also corroborate our findings by testing them at accumulation points of different cycles.
    Comment: 9 pages, 2 figures
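    A minimal numerical sketch of the sensitivity analysis described above, assuming the standard map form $x_{n+1} = 1 - a|x_n|^z$ with $z = 2$ and the commonly quoted Feigenbaum accumulation point $a_c \approx 1.401155189$ (standard literature values, not taken from this abstract). At the edge of chaos the sensitivity function $\xi(t)$ is expected to grow as a $q$-exponential, i.e. as a power law rather than exponentially:

```python
import numpy as np

def z_logistic(x, a, z=2.0):
    """One step of the z-logistic map x -> 1 - a * |x|**z."""
    return 1.0 - a * np.abs(x) ** z

def sensitivity(a, z=2.0, x0=0.1, dx0=1e-10, steps=100):
    """xi(t) = |delta x(t)| / |delta x(0)| for two nearby trajectories."""
    x, y = x0, x0 + dx0
    xi = np.empty(steps)
    for t in range(steps):
        x = z_logistic(x, a, z)
        y = z_logistic(y, a, z)
        xi[t] = abs(y - x) / dx0
    return xi

a_c = 1.401155189092  # Feigenbaum point (edge of chaos) for z = 2
xi = sensitivity(a_c)
for t in (1, 10, 100):
    # the envelope of xi(t) grows as a power law, not exponentially
    print(t, xi[t - 1])
```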

    Statistics of Infima and Stopping Times of Entropy Production and Applications to Active Molecular Processes

    We study the statistics of infima, stopping times and passage probabilities of entropy production in nonequilibrium steady states, and show that they are universal. We consider two examples of stopping times: first-passage times of entropy production and waiting times of stochastic processes, i.e. the times at which a system first reaches a given state. Our main results are: (i) the distribution of the global infimum of entropy production is exponential with mean equal to minus Boltzmann's constant; (ii) we find the exact expressions for the passage probabilities of entropy production to reach a given value; (iii) we derive a fluctuation theorem for stopping-time distributions of entropy production. These results have interesting implications for stochastic processes that can be discussed in simple colloidal systems and in active molecular processes. In particular, we show that the timing and statistics of discrete chemical transitions of molecular processes, such as the steps of molecular motors, are governed by the statistics of entropy production. We also show that the extreme-value statistics of active molecular processes are governed by entropy production; for example, the infimum of entropy production of a motor can be related to the maximal excursion of the motor against the direction of an external force. Using this relation, we make predictions for the distribution of the maximum backtrack depth of RNA polymerases, which follows from our universal results for entropy-production infima.
    Comment: 30 pages, 13 figures
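    A minimal sketch of result (i), assuming a standard drift-diffusion model of a driven colloidal particle (our choice of model, not one taken from the paper): for a particle with drift $v$ and diffusion coefficient $D$, the total entropy production in units of $k_B$ is $S(t) = (v/D)X(t)$ with $X(t) = vt + \sqrt{2D}\,W(t)$, and the global infimum of $S(t)$ should be exponentially distributed with mean $-1$ (i.e. $-k_B$):

```python
import numpy as np

rng = np.random.default_rng(0)
v, D, dt, steps, trials = 1.0, 1.0, 1e-3, 20_000, 2_000

infima = np.empty(trials)
for k in range(trials):
    # drifted Brownian increments for the particle position X(t)
    dX = v * dt + np.sqrt(2 * D * dt) * rng.standard_normal(steps)
    S = (v / D) * np.cumsum(dX)       # entropy production, units of k_B
    infima[k] = min(0.0, S.min())     # global infimum (S(0) = 0)

print("mean infimum:", infima.mean())  # should approach -1, i.e. -k_B
```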

    Thermostating by Deterministic Scattering: Heat and Shear Flow

    We apply a recently proposed thermostating mechanism to an interacting many-particle system where the bulk particles move according to Hamiltonian dynamics. At the boundaries the system is thermalized by deterministic and time-reversible scattering. We show how this scattering mechanism can be related to stochastic boundary conditions. We subsequently simulate nonequilibrium steady states associated with thermal conduction and shear flow for a hard disk fluid. The bulk behavior of the model is studied by comparing the transport coefficients obtained from computer simulations to theoretical results. Furthermore, thermodynamic entropy production and exponential phase-space contraction rates in the stationary nonequilibrium states are calculated, showing that in general these quantities do not agree.
    Comment: 16 pages (RevTeX) with 9 figures (PostScript)
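    A minimal sketch of the stochastic-boundary picture that the abstract relates the deterministic scattering mechanism to (an illustrative analogue only; the paper's thermostat is deterministic and time-reversible, which this sketch does not reproduce). At a thermal wall, the re-emitted velocity is drawn from the wall's flux distribution, independently of the incoming velocity:

```python
import numpy as np

rng = np.random.default_rng(1)
kT, m = 1.0, 1.0  # wall temperature (k_B * T) and particle mass

def thermal_wall_velocity(normal):
    """Outgoing velocity for a particle re-emitted by a 2-d thermal wall:
    flux-weighted Maxwell (Rayleigh) normal component, Gaussian (Maxwell)
    tangential component; independent of the incoming velocity."""
    normal = np.asarray(normal, dtype=float)
    v_n = np.sqrt(2.0 * kT / m * rng.exponential())  # Rayleigh speed
    v_t = np.sqrt(kT / m) * rng.standard_normal()    # Maxwell component
    tangent = np.array([-normal[1], normal[0]])
    return v_n * normal + v_t * tangent

# a particle re-emitted from the bottom wall (inward normal (0, 1)):
print(thermal_wall_velocity((0.0, 1.0)))
```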

    On nonlinear compression costs: when Shannon meets Rényi

    Shannon entropy is the shortest average codeword length a lossless compressor can achieve by encoding i.i.d. symbols. However, there are cases in which the objective is to minimize the exponential average codeword length, i.e. when the cost of encoding/decoding scales exponentially with the length of codewords. The optimum is reached by all strategies that map each symbol $x_i$, generated with probability $p_i$, into a codeword of length $\ell^{(q)}_D(i) = -\log_D \frac{p_i^q}{\sum_{j=1}^N p_j^q}$. This leads to the minimum exponential average codeword length, which equals the Rényi, rather than Shannon, entropy of the source distribution. We generalize the established Arithmetic Coding (AC) compressor to this framework. We show analytically that our generalized algorithm provides an exponential average length arbitrarily close to the Rényi entropy if the symbols to encode are i.i.d. We then apply our algorithm to both simulated (i.i.d. generated) and real (a piece of Wikipedia text) datasets. While, as expected, the application to i.i.d. data confirms our analytical results, we also find that, when applied to the real dataset (composed of highly correlated symbols), our algorithm still significantly reduces the exponential average codeword length with respect to the classical `Shannonian' one. Moreover, we provide another justification for the use of the exponential average: we show that by minimizing the exponential average length it is possible to minimize the probability that codewords exceed a certain threshold length. This relation relies on the connection between the exponential average and the cumulant generating function of the source distribution, which is in turn related to the probability of large deviations. We test and confirm our results again on both simulated and real datasets.
    Comment: 22 pages, 9 figures
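    A minimal sketch of the length assignment quoted above, assuming the standard (Campbell) identification between the exponential average with parameter $t$ and the order $q = 1/(1+t)$, which is a textbook result rather than something stated in this abstract. With real-valued lengths $\ell^{(q)}_D(i)$, the exponential average equals the Rényi entropy $H_q$ exactly; integer-length codes add at most one symbol of overhead, as usual:

```python
import numpy as np

def optimal_lengths(p, q, D=2):
    """Real-valued lengths l_q(i) = -log_D(p_i^q / sum_j p_j^q)."""
    w = p ** q
    return -np.log(w / w.sum()) / np.log(D)

def exponential_average(p, lengths, q, D=2):
    """Campbell's exponential mean (1/t) log_D sum_i p_i D^(t l_i),
    with t = (1 - q) / q, so q must lie in (0, 1) here."""
    t = (1.0 - q) / q
    return np.log((p * D ** (t * lengths)).sum()) / (t * np.log(D))

def renyi_entropy(p, q, D=2):
    return np.log((p ** q).sum()) / ((1.0 - q) * np.log(D))

p = np.array([0.5, 0.25, 0.15, 0.1])
q = 0.7
L = optimal_lengths(p, q)
print(exponential_average(p, L, q))  # equals ...
print(renyi_entropy(p, q))           # ... the Renyi entropy H_q
```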

    Time scales and exponential trends to equilibrium: Gaussian model problems

    We review results on the exponential convergence of multidimensional Ornstein-Uhlenbeck processes and discuss related notions of characteristic timescales with concrete model systems. We focus, on the one hand, on exit time distributions and provide explicit expressions for the exponential rate of the distribution in the small noise limit. On the other hand, we consider relaxation timescales of the process to its equilibrium measured in terms of relative entropy, and discuss the connection with exit probabilities. Along these lines, we study examples which illustrate specific properties of the relaxation and discuss the possibility of deriving a simulation-based, empirical definition of slow and fast degrees of freedom which builds upon a partitioning of the relative entropy functional in conjunction with the observed relaxation behaviour.
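    A minimal sketch of the relative-entropy relaxation described above, using standard one-dimensional OU facts (our own parameterization, not formulas quoted from the paper): for $dX = -\theta X\,dt + \sigma\,dW$ started from a Gaussian $N(m_0, s_0^2)$, the law stays Gaussian, and its relative entropy to the stationary measure $N(0, \sigma^2/2\theta)$ decays exponentially, exposing the relaxation timescale:

```python
import numpy as np

theta, sigma = 1.0, 1.0   # OU drift rate and noise strength
m0, s0 = 2.0, 0.3         # initial Gaussian law N(m0, s0^2)
var_inf = sigma**2 / (2 * theta)  # stationary variance

def kl_to_stationary(t):
    """KL( law of X(t) || stationary measure ) for the 1-d OU process."""
    m = m0 * np.exp(-theta * t)
    var = s0**2 * np.exp(-2 * theta * t) + var_inf * (1 - np.exp(-2 * theta * t))
    # KL( N(m, var) || N(0, var_inf) ) for scalar Gaussians
    return 0.5 * (np.log(var_inf / var) + (var + m**2) / var_inf - 1.0)

for t in (0.0, 1.0, 2.0, 4.0):
    print(t, kl_to_stationary(t))  # log-KL decreases roughly linearly in t
```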