
    Infinite Divisibility of Information

    We study an information analogue of infinitely divisible probability distributions, where the i.i.d. sum is replaced by the joint distribution of an i.i.d. sequence. A random variable $X$ is called informationally infinitely divisible if, for any $n \ge 1$, there exists an i.i.d. sequence of random variables $Z_1, \ldots, Z_n$ that contains the same information as $X$, i.e., there exists an injective function $f$ such that $X = f(Z_1, \ldots, Z_n)$. While no discrete random variable is informationally infinitely divisible, we show that any discrete random variable $X$ has a bounded multiplicative gap to infinite divisibility: if we remove the injectivity requirement on $f$, then there exist i.i.d. $Z_1, \ldots, Z_n$ and an $f$ satisfying $X = f(Z_1, \ldots, Z_n)$, and the entropy satisfies $H(X)/n \le H(Z_1) \le 1.59\,H(X)/n + 2.43$. We also study a new class of discrete probability distributions, called spectral infinitely divisible distributions, for which the multiplicative gap $1.59$ can be removed. Furthermore, we study the case where $X = (Y_1, \ldots, Y_m)$ is itself an i.i.d. sequence, $m \ge 2$, for which the multiplicative gap $1.59$ can be replaced by $1 + 5\sqrt{(\log m)/m}$. This means that as $m$ increases, $(Y_1, \ldots, Y_m)$ becomes closer to being spectral infinitely divisible, in a uniform manner. This can be regarded as an information analogue of Kolmogorov's uniform theorem. Applications of our results include independent component analysis, distributed storage with a secrecy constraint, and distributed random number generation.
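
    As a quick numerical illustration of the entropy band above (a minimal sketch of my own, not code from the paper; the Bernoulli source and all function names are assumptions), the following Python snippet evaluates the bounds $H(X)/n \le H(Z_1) \le 1.59\,H(X)/n + 2.43$ for a concrete choice of $X$ and $n$:

        # Hypothetical illustration: evaluate the paper's stated entropy band
        # for a Bernoulli(p) random variable X split into n i.i.d. pieces.
        import math

        def binary_entropy(p: float) -> float:
            """Shannon entropy of a Bernoulli(p) variable, in bits."""
            if p in (0.0, 1.0):
                return 0.0
            return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

        p, n = 0.3, 4
        h_x = binary_entropy(p)
        lower = h_x / n                  # trivial lower bound on H(Z_1)
        upper = 1.59 * h_x / n + 2.43    # the paper's multiplicative-gap upper bound
        print(f"H(X) = {h_x:.4f} bits; for n = {n}: {lower:.4f} <= H(Z_1) <= {upper:.4f}")

    Note that the additive term $2.43$ dominates when $H(X)/n$ is small, so the band is most informative for high-entropy $X$ or small $n$.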

    On fine properties of mixtures with respect to concentration of measure and Sobolev type inequalities

    Mixtures are convex combinations of laws. Despite this simple definition, a mixture can be far more subtle than its mixed components. For instance, mixing Gaussian laws may produce a potential with multiple deep wells. We study in the present work fine properties of mixtures with respect to concentration of measure and Sobolev type functional inequalities. We provide sharp Laplace bounds for Lipschitz functions in the case of generic mixtures, involving a transportation cost diameter of the mixed family. Additionally, our analysis of Sobolev type inequalities for two-component mixtures reveals natural relations with some kind of band isoperimetry and support constrained interpolation via mass transportation. We show that the Poincaré constant of a two-component mixture may remain bounded as the mixture proportion goes to 0 or 1 while the logarithmic Sobolev constant may surprisingly blow up. This counter-intuitive result is not reducible to support disconnections, and appears as a reminiscence of the variance-entropy comparison on the two-point space. As far as mixtures are concerned, the logarithmic Sobolev inequality is less stable than the Poincaré inequality and the sub-Gaussian concentration for Lipschitz functions. We illustrate our results on a gallery of concrete two-component mixtures. This work leads to many open questions.
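
    To make the opening remark about multiple deep wells concrete (a sketch of my own, not from the paper; the means, weights, and grid are illustrative assumptions), the snippet below evaluates the potential $V = -\log(\mathrm{density})$ of a two-component Gaussian mixture, which develops two wells when the component means are well separated:

        # Hypothetical illustration: the potential of a two-component Gaussian
        # mixture has minima near each mean and a barrier between them.
        import numpy as np

        def mixture_potential(x, p=0.5, m1=-3.0, m2=3.0, sigma=1.0):
            """-log density of p*N(m1, sigma^2) + (1-p)*N(m2, sigma^2)."""
            norm = 1.0 / (sigma * np.sqrt(2 * np.pi))
            d1 = norm * np.exp(-(x - m1) ** 2 / (2 * sigma ** 2))
            d2 = norm * np.exp(-(x - m2) ** 2 / (2 * sigma ** 2))
            return -np.log(p * d1 + (1 - p) * d2)

        xs = np.linspace(-6.0, 6.0, 13)
        for x, v in zip(xs, mixture_potential(xs)):
            print(f"V({x:+.1f}) = {v:.3f}")  # wells near x = -3 and x = +3, barrier near 0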

    Entropy, compound Poisson approximation, log-Sobolev inequalities and measure concentration

    The problem of approximating the distribution of a sum $S_n = \sum_{i=1}^{n} Y_i$ of $n$ discrete random variables $Y_i$ by a Poisson or a compound Poisson distribution arises naturally in many classical and current applications, such as statistical genetics, dynamical systems, the recurrence properties of Markov processes and reliability theory. Using information-theoretic ideas and techniques, we derive a family of new bounds for compound Poisson approximation. We take an approach similar to that of Kontoyiannis, Harremoës and Johnson (2003), and we generalize some of their Poisson approximation bounds to the compound Poisson case. Partly motivated by these results, we derive a new logarithmic Sobolev inequality for the compound Poisson measure and use it to prove measure-concentration bounds for a large class of discrete distributions.
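
    The simplest instance of this approximation problem replaces a sum of i.i.d. Bernoulli variables by a Poisson law. The sketch below (an illustration under my own choices, not the paper's compound Poisson bounds) computes the total variation distance between $\mathrm{Bin}(n, \lambda/n)$ and $\mathrm{Poisson}(\lambda)$ for $\lambda = 1$, showing the approximation improving as $n$ grows:

        # Hypothetical illustration: total variation distance between a
        # Bernoulli sum Bin(n, p) and its Poisson approximation Poisson(n*p).
        from math import exp

        def tv_binomial_poisson(n: int, p: float) -> float:
            """0.5 * sum_k |P(Bin=k) - P(Poisson=k)| over k = 0..n;
            the Poisson tail beyond n is negligible for small n*p."""
            lam = n * p
            binom_pk = (1.0 - p) ** n   # P(Bin = 0)
            pois_pk = exp(-lam)         # P(Poisson = 0)
            total = abs(binom_pk - pois_pk)
            for k in range(1, n + 1):
                binom_pk *= (n - k + 1) / k * p / (1.0 - p)  # binomial pmf recurrence
                pois_pk *= lam / k                           # Poisson pmf recurrence
                total += abs(binom_pk - pois_pk)
            return 0.5 * total

        for n in (10, 100, 1000):
            print(f"n = {n:4d}: d_TV = {tv_binomial_poisson(n, 1.0 / n):.6f}")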