    Quantum Rényi and f-divergences from integral representations

    Smooth Csiszár f-divergences can be expressed as integrals over so-called hockey stick divergences. This motivates a natural quantum generalization in terms of quantum hockey stick divergences, which we explore here. Using this recipe, the Kullback-Leibler divergence generalises to the Umegaki relative entropy, in the integral form recently found by Frenkel. We find that the Rényi divergences defined via our new quantum f-divergences are not additive in general, but that their regularisations surprisingly yield the Petz Rényi divergence for α < 1 and the sandwiched Rényi divergence for α > 1, unifying these two important families of quantum Rényi divergences. Moreover, we find that the contraction coefficients for the new quantum f-divergences collapse for all f that are operator convex, mimicking the classical behaviour and resolving some long-standing conjectures by Lesniewski and Ruskai. We derive various inequalities, including new reverse Pinsker inequalities with applications in differential privacy, and also explore various other applications of the new divergences.
    Comment: 44 pages. v2: improved results on reverse Pinsker inequalities, plus minor clarifications.
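
    As an illustration of the classical recipe behind the abstract above: with the hockey stick divergence E_γ(P‖Q) = Σ_i (p_i − γ q_i)_+, the Kullback-Leibler divergence admits the integral form D(P‖Q) = ∫_1^∞ [ E_γ(P‖Q)/γ + E_γ(Q‖P)/γ² ] dγ (in nats); the quantum versions replace probability vectors by density operators and scalar positive parts by operator positive parts. The sketch below checks the classical identity numerically; the two distributions are arbitrary choices for illustration, not taken from the paper.

```python
import numpy as np
from scipy.integrate import quad

def hockey_stick(p, q, gamma):
    """Classical hockey stick divergence E_gamma(P||Q) = sum_i (p_i - gamma * q_i)_+."""
    return np.sum(np.maximum(p - gamma * q, 0.0))

def kl(p, q):
    """Kullback-Leibler divergence in nats (strictly positive inputs assumed)."""
    return np.sum(p * np.log(p / q))

# Two arbitrary strictly positive distributions, chosen only for illustration.
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.4, 0.4])

# Integral representation over hockey stick divergences:
# D(P||Q) = int_1^inf [ E_g(P||Q)/g + E_g(Q||P)/g^2 ] dg.
integrand = lambda g: hockey_stick(p, q, g) / g + hockey_stick(q, p, g) / g**2
approx, _ = quad(integrand, 1.0, np.inf)

print(f"KL computed directly:  {kl(p, q):.6f}")
print(f"KL via hockey sticks:  {approx:.6f}")
```

    For a general smooth convex f, the weights 1/γ and 1/γ² are replaced by f''(γ) and γ⁻³ f''(1/γ); this classical decomposition is the kind of integral representation that the quantum construction in the paper starts from.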

    f-Divergence Inequalities via Functional Domination

    This paper considers the derivation of f-divergence inequalities via the approach of functional domination. Bounds on an f-divergence based on one or several other f-divergences are introduced, dealing with pairs of probability measures defined on arbitrary alphabets. In addition, a variety of bounds are shown to hold under boundedness assumptions on the relative information. The journal version, which includes further approaches for the derivation of f-divergence inequalities together with proofs, is available on the arXiv at https://arxiv.org/abs/1508.00335 and has been published in the IEEE Transactions on Information Theory, vol. 62, no. 11, pp. 5973-6006, November 2016.
    Comment: A conference paper, 5 pages. To be presented at the 2016 ICSEE International Conference on the Science of Electrical Engineering, Nov. 16-18, Eilat, Israel.
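
    A minimal sketch of the functional-domination idea referred to above: if f and g are convex with f(1) = g(1) = 0 and f(t) ≤ c·g(t) for all t > 0, then D_f(P‖Q) ≤ c·D_g(P‖Q) for every pair of distributions. The instance below uses f(t) = (√t − 1)² and g(t) = |t − 1| with c = 1, which yields the familiar bound "squared Hellinger distance ≤ Σ_i |p_i − q_i|"; this particular pair is chosen for illustration and is not necessarily one of the examples worked out in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def f_divergence(p, q, f):
    """D_f(P||Q) = sum_i q_i * f(p_i / q_i), for strictly positive p and q."""
    return np.sum(q * f(p / q))

# Pointwise domination: (sqrt(t) - 1)^2 <= |t - 1| for all t >= 0,
# hence D_f <= D_g for the corresponding f-divergences.
f = lambda t: (np.sqrt(t) - 1.0) ** 2   # squared Hellinger distance
g = lambda t: np.abs(t - 1.0)           # sum_i |p_i - q_i| (= 2 * total variation)

for _ in range(5):
    p = rng.dirichlet(np.ones(6))
    q = rng.dirichlet(np.ones(6))
    Df = f_divergence(p, q, f)
    Dg = f_divergence(p, q, g)
    print(f"H^2(P,Q) = {Df:.4f}  <=  {Dg:.4f} = sum |p_i - q_i|   ({Df <= Dg})")
```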

    On the equivalence of modes of convergence for log-concave measures

    An important theme in recent work in asymptotic geometric analysis is that many classical implications between different types of geometric or functional inequalities can be reversed in the presence of convexity assumptions. In this note, we explore the extent to which different notions of distance between probability measures are comparable for log-concave distributions. Our results imply that weak convergence of isotropic log-concave distributions is equivalent to convergence in total variation, and is further equivalent to convergence in relative entropy when the limit measure is Gaussian.
    Comment: v3: Minor tweak in exposition. To appear in the GAFA Seminar Notes.

    On the Jensen-Shannon divergence and the variation distance for categorical probability distributions

    We establish a decomposition of the Jensen-Shannon divergence into a linear combination of a scaled Jeffreys' divergence and a reversed Jensen-Shannon divergence. Upper and lower bounds for the Jensen-Shannon divergence are then found in terms of the squared (total) variation distance. The derivations rely upon the Pinsker inequality and the reverse Pinsker inequality. We use these bounds to prove the asymptotic equivalence of the maximum likelihood estimate and the minimum Jensen-Shannon divergence estimate, as well as the asymptotic consistency of the minimum Jensen-Shannon divergence estimate. These are key properties for likelihood-free simulator-based inference.
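
    The paper's sharpened constants are not reproduced here, but the flavour of such bounds can be checked numerically. A generic sandwich of the same type is TV(P,Q)²/2 ≤ JSD(P,Q) ≤ ln(2)·TV(P,Q), with the Jensen-Shannon divergence in nats and TV normalised to [0, 1]: the lower bound follows from Pinsker's inequality applied to each term of the mixture, and the upper bound from pointwise domination of the Jensen-Shannon generator by (ln 2)·|t − 1|/2. The sketch below verifies this sandwich on randomly drawn categorical distributions.

```python
import numpy as np

rng = np.random.default_rng(1)

def kl(p, q):
    """Kullback-Leibler divergence in nats (strictly positive inputs assumed)."""
    return np.sum(p * np.log(p / q))

def jsd(p, q):
    """Jensen-Shannon divergence in nats: average KL to the midpoint mixture."""
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def tv(p, q):
    """Total variation distance, normalised to lie in [0, 1]."""
    return 0.5 * np.sum(np.abs(p - q))

for _ in range(5):
    p = rng.dirichlet(np.ones(8))
    q = rng.dirichlet(np.ones(8))
    d, t = jsd(p, q), tv(p, q)
    # Check the generic sandwich TV^2/2 <= JSD <= ln(2) * TV.
    print(f"{t**2 / 2:.4f} <= {d:.4f} <= {np.log(2) * t:.4f}")
```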
