    Entropy of Independent Experiments, Revisited

    The weak law of large numbers implies that, under mild assumptions on the source, the Rényi entropy per produced symbol converges (in probability) to the Shannon entropy rate. This paper quantifies the speed of this convergence for sources with independent (but not necessarily identically distributed) outputs, generalizing and improving the result of Holenstein and Renner (IEEE Trans. Inform. Theory, 2011). (a) We characterize the sources with \emph{slowest convergence} (for a given entropy): their outputs are mixtures of a uniform distribution and a unit mass. (b) Based on this characterization, we establish faster convergence in \emph{high-entropy} regimes. We discuss how these improved bounds may be used to better quantify the security of outputs of random number generators. In turn, the characterization of the "worst" distributions can be used to derive sharp "extremal" inequalities between Rényi and Shannon entropy. The main technique is \emph{non-convex programming}, used to characterize distributions with possibly large exponential moments under certain entropy constraints.
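
    As an illustrative sketch (not taken from the paper itself), the extremal mixture in (a) can be written down explicitly, assuming an alphabet of size $N$, a distinguished point $x_0$, and a mixture weight $\lambda \in [0,1]$; its Shannon and Rényi entropies then admit the closed forms below.

    % Assumed notation for illustration only: mixture of the uniform
    % distribution on N symbols and a unit mass at x_0, with weight \lambda.
    \[
      P(x) =
      \begin{cases}
        (1-\lambda) + \dfrac{\lambda}{N}, & x = x_0,\\[4pt]
        \dfrac{\lambda}{N}, & x \neq x_0,
      \end{cases}
    \]
    % Shannon entropy and Rényi entropy of order \alpha \neq 1:
    \[
      H(P) = -\sum_x P(x)\log P(x), \qquad
      H_\alpha(P) = \frac{1}{1-\alpha}\log\!\Bigl(
        \bigl((1-\lambda)+\tfrac{\lambda}{N}\bigr)^{\alpha}
        + (N-1)\bigl(\tfrac{\lambda}{N}\bigr)^{\alpha}\Bigr).
    \]

    Varying $\lambda$ interpolates between a deterministic output ($\lambda = 0$) and a uniform one ($\lambda = 1$), which is what lets this family exhibit the largest gap between $H_\alpha$ and $H$ at a fixed Shannon entropy.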