
    The entropy of sums and Ruzsa's divergence on abelian groups

    Motivated by a series of recently discovered inequalities for the sum and difference of discrete or continuous random variables [3], [5], [9], [10], we argue that the most natural, general form of these results is in terms of a special case of mutual information, which we call the Ruzsa divergence between two probability distributions. It can be defined for arbitrary pairs of random variables taking values in any discrete (countable) set, on R^n, or in fact on any locally compact Hausdorff abelian group. We study the basic properties of the Ruzsa divergence and derive numerous consequences. In particular, we show that many of the inequalities in [3], [5], [9], [10] can be stated and proved in a unified way, extending their validity to the present general setting. For example, consequences of the basic properties of the Ruzsa divergence developed here include the fact that the entropies of the sum and the difference of two independent random vectors severely constrain each other, as well as entropy analogues of a number of results in additive combinatorics. Although the setting is quite general, the results are already of interest (and new) in the case of random vectors in R^n. For instance, another consequence in R^n is an entropic analogue (in the setting of log-concave distributions) of the Rogers-Shephard inequality for convex bodies. © 2013 IEEE
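    As an illustrative sketch (not taken from the paper), the abstract's claim that the entropies of the sum and the difference of two independent random variables constrain each other can be checked numerically for small discrete distributions. The example distribution below is hypothetical; the code simply convolves two independent integer-valued distributions under + and − and compares the Shannon entropies H(X+Y) and H(X−Y).

    ```python
    # Sketch: compute H(X+Y) and H(X-Y) for independent discrete X, Y.
    # The distributions used here are an arbitrary illustration, not
    # an example from the paper.
    import itertools
    import math

    def entropy(dist):
        """Shannon entropy in bits of a dict mapping value -> probability."""
        return -sum(p * math.log2(p) for p in dist.values() if p > 0)

    def combine(px, py, op):
        """Distribution of op(X, Y) for independent X ~ px and Y ~ py."""
        out = {}
        for (x, p), (y, q) in itertools.product(px.items(), py.items()):
            z = op(x, y)
            out[z] = out.get(z, 0.0) + p * q
        return out

    # Hypothetical i.i.d. example on {0, 1, 2}.
    px = {0: 0.5, 1: 0.3, 2: 0.2}
    py = dict(px)

    h_sum = entropy(combine(px, py, lambda x, y: x + y))
    h_diff = entropy(combine(px, py, lambda x, y: x - y))
    print(h_sum, h_diff)
    ```

    Both entropies are at least H(X) (adding an independent variable cannot decrease entropy), and varying the input distribution shows the two quantities moving together, in line with the mutual-constraint phenomenon the abstract describes.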