
f-divergence Inequalities

Abstract

This paper develops systematic approaches to obtain f-divergence inequalities, dealing with pairs of probability measures defined on arbitrary alphabets. Functional domination is one such approach, where special emphasis is placed on finding the best possible constant upper bounding a ratio of f-divergences. Another approach used for the derivation of bounds among f-divergences relies on moment inequalities and the logarithmic-convexity property, which results in tight bounds on the relative entropy and Bhattacharyya distance in terms of χ² divergences. A rich variety of bounds are shown to hold under boundedness assumptions on the relative information. Special attention is devoted to the total variation distance and its relation to the relative information and relative entropy, including "reverse Pinsker inequalities," as well as to the E_γ divergence, which generalizes the total variation distance. Pinsker's inequality is extended to this type of f-divergence, a result which leads to an inequality linking the relative entropy and the relative information spectrum. Integral expressions of the Rényi divergence in terms of the relative information spectrum are derived, leading to bounds on the Rényi divergence in terms of either the variational distance or the relative entropy.

Comment: IEEE Trans. on Information Theory, vol. 62, no. 11, pp. 5973–6006, November 2016. This manuscript is identical to the journal paper, apart from some additional material which includes Sections III-C and IV-F, and three technical proofs.
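To make two of the relations mentioned above concrete, the following minimal Python sketch (not taken from the paper; the distributions P and Q are arbitrary illustrative choices) numerically checks Pinsker's inequality, D(P‖Q) ≥ 2·TV(P,Q)², and the standard χ²-based bound D(P‖Q) ≤ log(1 + χ²(P‖Q)), for finite-alphabet distributions with strictly positive Q; all logarithms are natural.

import numpy as np

def kl_divergence(p, q):
    # Relative entropy D(P||Q) in nats; assumes q > 0 wherever p > 0.
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def total_variation(p, q):
    # Total variation distance TV(P,Q) = (1/2) * sum |p - q|.
    return 0.5 * float(np.sum(np.abs(p - q)))

def chi_squared(p, q):
    # chi^2 divergence chi^2(P||Q) = sum (p - q)^2 / q; assumes q > 0.
    return float(np.sum((p - q) ** 2 / q))

# Arbitrary illustrative distributions (hypothetical, not from the paper).
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

d = kl_divergence(p, q)
tv = total_variation(p, q)
chi2 = chi_squared(p, q)

# Pinsker's inequality: D(P||Q) >= 2 * TV(P,Q)^2 (in nats).
assert d >= 2 * tv ** 2
# chi^2 upper bound, via Jensen's inequality: D(P||Q) <= log(1 + chi^2(P||Q)).
assert d <= np.log1p(chi2)
print(f"D = {d:.4f} nats, TV = {tv:.4f}, chi^2 = {chi2:.4f}")

The second bound follows from Jensen's inequality applied to the logarithm, since the expectation of dP/dQ under P equals 1 + χ²(P‖Q); both inequalities hold on general alphabets, though the sketch only verifies them on a finite example.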
