
    Information Equation of State

    Landauer's principle is applied to information in the universe. Once stars began forming, the increasing proportion of matter at high stellar temperatures compensated for the expanding universe to provide a near-constant information energy density. The information equation of state was close to the dark energy value, w = -1, for a wide range of redshifts, 10 > z > 0.8, over half of cosmic time. A reasonable universe information bit content of only 10^87 bits is sufficient for information energy to account for all dark energy. A time-varying equation of state with a direct link between dark energy and matter, and linked to star formation in particular, is clearly relevant to the cosmic coincidence problem. In answering the "Why now?" question we wonder "What next?", as we expect the information equation of state to tend towards w = 0 in the future. Comment: 10 pages, 2 figures
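
As a back-of-the-envelope check of the abstract's claim, the sketch below combines Landauer's bound, E = k_B T ln 2 per bit, with assumed round numbers (a 10^7 K stellar interior temperature and textbook dark-energy density and Hubble radius; none of these inputs are taken from the paper) to compare the quoted 10^87-bit information energy with the dark energy in a Hubble volume:

```python
import math

# Assumed round-number inputs (illustrative, not from the paper):
K_B = 1.380649e-23   # Boltzmann constant, J/K
T_STELLAR = 1e7      # assumed typical stellar interior temperature, K
N_BITS = 1e87        # universe bit content quoted in the abstract

# Landauer energy per bit, E = k_B * T * ln 2, and the total:
e_bit = K_B * T_STELLAR * math.log(2)
info_energy = N_BITS * e_bit                 # total information energy, J

# Dark energy in a Hubble volume, with assumed round numbers:
rho_de = 6e-10                               # dark energy density, J/m^3
r_hubble = 1.3e26                            # Hubble radius, m
dark_energy = rho_de * (4 / 3) * math.pi * r_hubble ** 3

ratio = info_energy / dark_energy            # order-of-magnitude comparison
```

With these crude inputs the two totals agree to within roughly an order of magnitude or two, which is the level at which the abstract's claim operates.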

    Conditional entropy of ordinal patterns

    In this paper we investigate a quantity called the conditional entropy of ordinal patterns, akin to permutation entropy. The conditional entropy of ordinal patterns describes the average diversity of the ordinal patterns succeeding a given ordinal pattern. We observe that this quantity provides a good estimate of the Kolmogorov-Sinai entropy in many cases. In particular, the conditional entropy of ordinal patterns of a finite order coincides with the Kolmogorov-Sinai entropy for periodic dynamics and for Markov shifts over a binary alphabet. Finally, the conditional entropy of ordinal patterns is computationally simple and thus can be readily applied to real-world data.
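
As a minimal sketch (hypothetical function names, not the authors' implementation), the conditional entropy of ordinal patterns can be estimated from a scalar time series as H(current, next) - H(current) over the empirical pattern frequencies:

```python
from collections import Counter
import math

def ordinal_patterns(x, order):
    """Ordinal pattern of each length-`order` window: the permutation
    of indices that sorts the window's values (ties broken by position)."""
    return [tuple(sorted(range(order), key=lambda i: x[t + i]))
            for t in range(len(x) - order + 1)]

def conditional_entropy_op(x, order):
    """Conditional entropy of ordinal patterns, H(next | current),
    estimated as H(current, next) - H(current) from relative frequencies."""
    pats = ordinal_patterns(x, order)
    pairs = list(zip(pats, pats[1:]))

    def H(items):
        n = len(items)
        return -sum(c / n * math.log2(c / n) for c in Counter(items).values())

    return H(pairs) - H(pats[:-1])
```

For a periodic sequence every pattern has a unique successor, so the estimate is zero, consistent with the coincidence with Kolmogorov-Sinai entropy for periodic dynamics noted above.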

    Typical and extreme entropies of long-lived isolated quantum systems

    In this paper, we investigate and compare two well-developed definitions of entropy relevant for describing the dynamics of isolated quantum systems: bipartite entanglement entropy and observational entropy. In a model system of interacting particles in a one-dimensional lattice, we numerically solve for the full quantum behavior of the system. We characterize the fluctuations, and find the maximal, minimal, and typical entropy of each type that the system can eventually attain through its evolution. While both entropies are low for some "special" configurations and high for more "generic" ones, there are several fundamental differences in their behavior. Observational entropy behaves in accord with classical Boltzmann entropy (e.g. equilibrium is a condition of near-maximal entropy and uniformly distributed particles, and minimal entropy is a very compact configuration). Entanglement entropy is rather different: minimal entropy "empties out" one partition while maximal entropy apportions the particles between the partitions, and neither is typical. Beyond these qualitative results, we characterize both entropies and their fluctuations in some detail as they depend on temperature, particle number, and box size. Comment: Additional comments are made in the caption of figure 10(a). Equation 7 and a brief description are added in relation to the figure.
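
As a hedged illustration of the two quantities being compared (textbook definitions, not the paper's lattice model), both entropies reduce to simple sums once the state is summarized: entanglement entropy by the Schmidt coefficients of a pure state, observational entropy by macrostate probabilities and volumes:

```python
import math

def entanglement_entropy(schmidt_coeffs):
    """Bipartite entanglement entropy of a pure state from its Schmidt
    coefficients c_i: S = -sum_i p_i log2 p_i with p_i = |c_i|^2."""
    probs = [abs(c) ** 2 for c in schmidt_coeffs]
    assert abs(sum(probs) - 1.0) < 1e-9, "state must be normalized"
    return -sum(p * math.log2(p) for p in probs if p > 0)

def observational_entropy(probs, volumes):
    """Observational entropy for a coarse-graining into macrostates i
    with occupation probabilities p_i and state-space volumes V_i:
    S_obs = sum_i p_i log2(V_i / p_i)."""
    return sum(p * math.log2(v / p) for p, v in zip(probs, volumes) if p > 0)

# Product state |00>: a single Schmidt coefficient, zero entanglement.
# Bell state (|00> + |11>)/sqrt(2): maximal entanglement of one bit.
```

A product state has a single Schmidt coefficient and zero entanglement entropy, while observational entropy is maximized when macrostates are occupied in proportion to their volumes, mirroring the Boltzmann-like behavior described in the abstract.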

    Entropy dimension of measure preserving systems

    The notion of metric entropy dimension is introduced to measure the complexity of entropy-zero dynamical systems. For measure preserving systems, we define entropy dimension via the dimension of entropy generating sequences. This combinatorial approach provides new insight into the analysis of entropy-zero systems. We also define the dimension set of a system to investigate the structure of the randomness of the factors of a system. The notion of a uniform dimension in the class of entropy-zero systems is introduced as a generalization of a K-system in the positive-entropy case. We investigate joinings among entropy-zero systems and prove a disjointness property among them using the dimension sets. Given a topological system, we compare topological entropy dimension with metric entropy dimension.

    Max-relative entropy of coherence: an operational coherence measure

    The operational characterization of quantum coherence is the cornerstone in the development of the resource theory of coherence. We introduce a new coherence quantifier based on max-relative entropy. We prove that max-relative entropy of coherence is directly related to the maximum overlap with maximally coherent states under a particular class of operations, which provides an operational interpretation of max-relative entropy of coherence. Moreover, we show that, for any coherent state, there are examples of subchannel discrimination problems such that this coherent state allows for a higher probability of successfully discriminating subchannels than that of all incoherent states. This advantage of coherent states in subchannel discrimination can be exactly characterized by the max-relative entropy of coherence. By introducing a suitable smooth max-relative entropy of coherence, we prove that the smooth max-relative entropy of coherence provides a lower bound on the one-shot coherence cost, and that the max-relative entropy of coherence is equivalent to the relative entropy of coherence in the asymptotic limit. Similarly to max-relative entropy of coherence, min-relative entropy of coherence has also been investigated. We show that the min-relative entropy of coherence provides an upper bound on one-shot coherence distillation, and that in the asymptotic limit the min-relative entropy of coherence is equivalent to the relative entropy of coherence. Comment: v2. 5+6.5 pages, no figure, close to the published version. v1. 5.5+6 pages, no figure
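
For orientation, the quantities named in the abstract have standard definitions in the coherence literature (quoted here as background, not from the paper itself; the incoherent set I and the dephasing map Δ are taken with respect to a fixed reference basis):

```latex
% Max-relative entropy of a state \rho with respect to \sigma:
D_{\max}(\rho \,\|\, \sigma) = \log \min \{ \lambda : \rho \le \lambda \sigma \}

% Max-relative entropy of coherence: minimize over incoherent states
C_{\max}(\rho) = \min_{\sigma \in \mathcal{I}} D_{\max}(\rho \,\|\, \sigma)

% Relative entropy of coherence, the asymptotic benchmark, with
% \Delta the full dephasing map in the reference basis:
C_{r}(\rho) = S(\Delta(\rho)) - S(\rho)
```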

    A note on the connection between nonextensive entropy and h-derivative

    In order to treat the major part of entropy measures in a unified way, we introduce a two-parameter non-extensive entropic form based on the h-derivative, which generalizes the conventional Newton-Leibniz calculus. This new entropy, S_{h,h'}, is proved to describe non-extensive systems and to recover several well-known non-extensive entropic expressions, such as the Tsallis entropy, the Abe entropy, the Shafee entropy, the Kaniadakis entropy, and even the classical Boltzmann-Gibbs one. As a generalized entropy, its corresponding properties are also analyzed. Comment: 6 pages, 1 figure
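
As one concrete limiting case mentioned above (the standard Tsallis form, quoted for orientation rather than taken from the paper), a one-parameter non-extensive entropy and its Boltzmann-Gibbs limit read:

```latex
% Tsallis entropy for a probability distribution {p_i}:
S_q = \frac{1 - \sum_i p_i^{\,q}}{q - 1}

% The Boltzmann--Gibbs entropy is recovered in the limit q \to 1:
\lim_{q \to 1} S_q = -\sum_i p_i \ln p_i
```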

    Generalised exponential families and associated entropy functions

    A generalised notion of exponential families is introduced. It is based on the variational principle borrowed from statistical physics. It is shown that inequivalent generalised entropy functions lead to distinct generalised exponential families. The well-known result that the inequality of Cramér and Rao becomes an equality in the case of an exponential family can be generalised. However, this requires the introduction of escort probabilities. Comment: 20 pages
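
For context, the classical result being generalised can be stated as follows (the standard one-parameter exponential family in natural form; the generalised, escort-probability version is in the paper itself):

```latex
% One-parameter exponential family with sufficient statistic T:
p_\theta(x) = c(x)\, e^{\theta T(x) - F(\theta)}

% Mean and variance of T follow from the log-partition function F,
% and the variance equals the Fisher information:
\mathbb{E}_\theta[T] = F'(\theta), \qquad
\operatorname{Var}_\theta(T) = F''(\theta) = I(\theta)

% Hence T, as an unbiased estimator of \mu(\theta) = F'(\theta),
% saturates the Cramér--Rao bound:
\operatorname{Var}_\theta(T) = \frac{\mu'(\theta)^2}{I(\theta)}
```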

    Entropy production in systems with long range interactions

    On a fine-grained scale the Gibbs entropy of an isolated system remains constant throughout its dynamical evolution. This is a consequence of Liouville's theorem for Hamiltonian systems and appears to contradict the second law of thermodynamics. In reality, however, there is no problem, since the thermodynamic entropy should be associated with the Boltzmann entropy, which for non-equilibrium systems is different from the Gibbs entropy. The Boltzmann entropy accounts for the microstates which are not accessible from a given initial condition, but are compatible with a given macrostate. In a sense the Boltzmann entropy is a coarse-grained version of the Gibbs entropy and will not decrease during the dynamical evolution of a macroscopic system. In this paper we will explore the entropy production for systems with long range interactions. Unlike for short range systems, in the thermodynamic limit, the probability density function for these systems decouples into a product of one particle distribution functions and the coarse-grained entropy can be calculated explicitly. We find that the characteristic time for the entropy production scales with the number of particles as N^α, with α > 0, so that in the thermodynamic limit entropy production takes an infinite amount of time.
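
The fine-grained statement in the opening sentences can be written out explicitly (standard textbook definitions, included here for orientation):

```latex
% Fine-grained Gibbs entropy of the N-particle distribution f over
% phase space \Gamma:
S_G(t) = -k_B \int f(\Gamma, t) \ln f(\Gamma, t)\, d\Gamma

% Liouville's theorem, df/dt = 0 along the Hamiltonian flow, implies
% dS_G/dt = 0: the fine-grained entropy is a constant of the motion,
% so any entropy production must come from coarse-graining f.
```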