
    Entropy accumulation

    We ask whether entropy accumulates, in the sense that the operationally relevant total uncertainty about an $n$-partite system $A = (A_1, \ldots, A_n)$ corresponds to the sum of the entropies of its parts $A_i$. The Asymptotic Equipartition Property implies that this is indeed the case to first order in $n$, under the assumption that the parts $A_i$ are identical and independent of each other. Here we show that entropy accumulation occurs more generally, i.e., without an independence assumption, provided one quantifies the uncertainty about the individual systems $A_i$ by the von Neumann entropy of suitably chosen conditional states. The analysis of a large system can hence be reduced to the study of its parts. This is relevant for applications: in device-independent cryptography, for instance, the approach yields essentially optimal security bounds valid for general attacks, as shown by Arnon-Friedman et al.
    Comment: 44 pages; expandable to 48 pages
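    Schematically, and suppressing details such as smoothing parameters and the min-tradeoff functions that appear in the precise statement, the accumulation bound described in this abstract has the form:

    ```latex
    % Schematic form of the entropy accumulation bound; the constant c
    % depends on the protocol but not on n, and each right-hand term is a
    % von Neumann entropy of a suitably chosen conditional state.
    \[
      H_{\min}^{\varepsilon}\!\left(A_1 \ldots A_n \,\middle|\, E\right)
      \;\gtrsim\; \sum_{i=1}^{n} H\!\left(A_i \,\middle|\, \text{side info}\right)
      \;-\; c\sqrt{n}
    \]
    ```

    For i.i.d. parts, the Asymptotic Equipartition Property already gives the first-order term; the point of the theorem is that the sum survives without the independence assumption.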

    Orders of accumulation of entropy

    For a continuous map $T$ of a compact metrizable space $X$ with finite topological entropy, the order of accumulation of entropy of $T$ is a countable ordinal that arises in the context of entropy structure and symbolic extensions. We show that every countable ordinal is realized as the order of accumulation of some dynamical system. Our proof relies on functional analysis of metrizable Choquet simplices and a realization theorem of Downarowicz and Serafin. Further, if $M$ is a metrizable Choquet simplex, we bound the ordinals that appear as the order of accumulation of entropy of a dynamical system whose simplex of invariant measures is affinely homeomorphic to $M$. These bounds are given in terms of the Cantor-Bendixson rank of $\overline{\mathrm{ex}(M)}$, the closure of the extreme points of $M$, and the relative Cantor-Bendixson rank of $\overline{\mathrm{ex}(M)}$ with respect to $\mathrm{ex}(M)$. We also address the optimality of these bounds.
    Comment: 48 pages

    Renormalized entropy for one dimensional discrete maps: periodic and quasi-periodic route to chaos and their robustness

    We apply renormalized entropy as a complexity measure to the logistic and sine-circle maps. For the logistic map, renormalized entropy decreases until the accumulation point and increases from the accumulation point up to the most chaotic state, signalling an increasing and then decreasing degree of order, in all the investigated periodic windows, namely period-2, 3, and 5, thereby demonstrating the robustness of this complexity measure. This behavior is consistent with the underlying dynamics, since bifurcations occur before the accumulation point, after which band-merging, the opposite of bifurcation, takes place. In addition to precisely detecting the accumulation points in all these windows, we show that renormalized entropy can detect the self-similar windows in the chaotic regime through abrupt changes in its values. For the sine-circle map, we observe that renormalized entropy also detects the quasi-periodic regimes, exhibiting oscillatory behavior particularly in these regimes. Moreover, the oscillatory regime of the renormalized entropy corresponds to a larger interval of the nonlinearity parameter of the sine-circle map as the frequency-ratio parameter approaches the critical value at which the winding ratio attains the golden mean.
    Comment: 14 pages, 7 figures
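    As an illustration of the kind of numerics involved, the sketch below tracks how an entropy of the logistic map's orbit changes across the period-doubling accumulation point $r^* \approx 3.5699$. It uses a plain Shannon entropy of the orbit histogram as a hypothetical stand-in, not the renormalized entropy of the paper:

    ```python
    import numpy as np

    def logistic_orbit(r, x0=0.4, n_transient=1000, n_keep=2000):
        """Iterate x_{k+1} = r * x_k * (1 - x_k), discarding transients."""
        x = x0
        for _ in range(n_transient):
            x = r * x * (1 - x)
        orbit = np.empty(n_keep)
        for k in range(n_keep):
            x = r * x * (1 - x)
            orbit[k] = x
        return orbit

    def orbit_entropy(orbit, bins=100):
        """Shannon entropy of the orbit's histogram over [0, 1].

        This is only a crude proxy for a complexity measure; the paper's
        renormalized entropy is defined differently.
        """
        counts, _ = np.histogram(orbit, bins=bins, range=(0.0, 1.0))
        p = counts / counts.sum()
        p = p[p > 0]
        return float(-(p * np.log(p)).sum())

    # Entropy is low in the periodic windows and rises in the chaotic
    # regime past the accumulation point r* ~ 3.5699.
    for r in (3.2, 3.5, 3.57, 3.8):
        print(r, orbit_entropy(logistic_orbit(r)))
    ```

    A period-2 orbit populates only two histogram bins, so its entropy stays near $\ln 2$, while a chaotic orbit spreads over many bins and yields a much larger value.
    
    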

    Generalised entropy accumulation

    Consider a sequential process in which each step outputs a system $A_i$ and updates a side information register $E$. We prove that if this process satisfies a natural "non-signalling" condition between past outputs and future side information, the min-entropy of the outputs $A_1, \dots, A_n$ conditioned on the side information $E$ at the end of the process can be bounded from below by a sum of von Neumann entropies associated with the individual steps. This generalises the entropy accumulation theorem (EAT), which deals with a more restrictive model of side information: there, past side information cannot be updated in subsequent rounds, and newly generated side information has to satisfy a Markov condition. Due to its more general model of side information, our generalised EAT can be applied more easily and to a broader range of cryptographic protocols. As examples, we give the first multi-round security proof for blind randomness expansion and a simplified analysis of the E91 QKD protocol. The proof of our generalised EAT relies on a new variant of Uhlmann's theorem and new chain rules for the Rényi divergence and entropy, which might be of independent interest.
    Comment: 42 pages; v2 expands the introduction but does not change any results; in FOCS 202
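    Schematically, writing $E_i$ for the state of the (updatable) side information register after step $i$, an assumed notation here, and suppressing smoothing and the exact choice of conditioning states, the bound takes the form:

    ```latex
    % Schematic form of the generalised EAT bound: the side information
    % register may be updated in every round, and the infimum runs over
    % the states compatible with each step of the process.
    \[
      H_{\min}^{\varepsilon}\!\left(A_1 \ldots A_n \,\middle|\, E_n\right)
      \;\gtrsim\; \sum_{i=1}^{n} \inf\, H\!\left(A_i \,\middle|\, E_i\right)
      \;-\; O\!\left(\sqrt{n}\right)
    \]
    ```

    The non-signalling condition replaces the Markov condition of the original EAT: past outputs must carry no information into future side information beyond what the process itself generates.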

    Sensitivity function and entropy increase rates for z-logistic map family at the edge of chaos

    It is well known that, for chaotic systems, the production of the relevant (Boltzmann-Gibbs) entropy is always linear in time and the system has strong (exponential) sensitivity to initial conditions. In recent years, various numerical results have indicated that essentially the same type of behavior emerges at the edge of chaos if a specific generalization of the entropy and of the exponential function is used. In this work, we contribute to this scenario by numerically analysing some generalized nonextensive entropies and their related exponential definitions using the $z$-logistic map family. We also corroborate our findings by testing them at accumulation points of different cycles.
    Comment: 9 pages, 2 figures
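    For reference, the standard nonextensive (Tsallis) entropy and its companion $q$-exponential, the usual candidates for the "specific generalization" of entropy and exponential mentioned above, can be sketched as follows; this is a minimal illustration of the definitions, not the paper's code:

    ```python
    import numpy as np

    def q_exponential(x, q):
        """Generalized exponential e_q(x) = [1 + (1-q) x]^(1/(1-q)).

        Recovers the ordinary exp(x) in the limit q -> 1.
        """
        if abs(q - 1.0) < 1e-12:
            return np.exp(x)
        return (1.0 + (1.0 - q) * x) ** (1.0 / (1.0 - q))

    def tsallis_entropy(p, q):
        """Nonextensive entropy S_q = (1 - sum_i p_i^q) / (q - 1).

        Recovers the Boltzmann-Gibbs (Shannon) entropy -sum p ln p
        in the limit q -> 1.
        """
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        if abs(q - 1.0) < 1e-12:
            return float(-(p * np.log(p)).sum())
        return float((1.0 - (p ** q).sum()) / (q - 1.0))
    ```

    At the edge of chaos the sensitivity to initial conditions is typically fitted by such a $q$-exponential rather than an ordinary exponential, with an entropic index $q \neq 1$ characteristic of the map family.
    
    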