Infinite Divisibility of Information
We study an information analogue of infinitely divisible probability
distributions, where the i.i.d. sum is replaced by the joint distribution of an
i.i.d. sequence. A random variable $X$ is called informationally infinitely
divisible if, for any $n \ge 1$, there exists an i.i.d. sequence of random
variables $Z_1, \ldots, Z_n$ that contains the same information as $X$, i.e.,
there exists an injective function $f$ such that $X = f(Z_1, \ldots, Z_n)$.
While no discrete random variable is informationally infinitely divisible, we
show that any discrete random variable $X$ has a bounded multiplicative gap to
infinite divisibility, that is, if we remove the injectivity requirement on
$f$, then there exist i.i.d. $Z_1, \ldots, Z_n$ and a function $f$ satisfying
$X = f(Z_1, \ldots, Z_n)$, and the entropy satisfies
$H(X)/n \le H(Z_1) \le c \cdot H(X)/n + d$ for absolute constants $c$ and $d$.
We also study a new class of discrete probability distributions, called
spectral infinitely divisible distributions, for which the multiplicative gap
$c$ can be removed. Furthermore, we study the case where $X$ is itself an
i.i.d. sequence of length $m$, for which the multiplicative gap can be replaced
by a factor, depending on $m$, that tends to $1$. This means that as $m$
increases, $X$ becomes closer to being spectral infinitely divisible in a
uniform manner. This can be regarded as an information analogue of Kolmogorov's
uniform theorem. Applications of our result include independent component
analysis, distributed storage with a secrecy constraint, and distributed random
number generation.
Comment: 22 pages
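As a reading aid, using only the standard fact that an injective map preserves entropy, the definition above forces
\[
H(X) \;=\; H\bigl(f(Z_1, \ldots, Z_n)\bigr) \;=\; H(Z_1, \ldots, Z_n) \;=\; n \, H(Z_1),
\]
so each piece $Z_i$ would have entropy exactly $H(X)/n$; dropping the injectivity requirement relaxes this to the multiplicative and additive gap stated above.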
On fine properties of mixtures with respect to concentration of measure and Sobolev type inequalities
Mixtures are convex combinations of laws. Despite this simple definition, a
mixture can be far more subtle than its mixed components. For instance, mixing
Gaussian laws may produce a potential with multiple deep wells. We study in the
present work fine properties of mixtures with respect to concentration of
measure and Sobolev type functional inequalities. We provide sharp Laplace
bounds for Lipschitz functions in the case of generic mixtures, involving a
transportation cost diameter of the mixed family. Additionally, our analysis of
Sobolev type inequalities for two-component mixtures reveals natural relations
with some kind of band isoperimetry and support constrained interpolation via
mass transportation. We show that the Poincaré constant of a two-component
mixture may remain bounded as the mixture proportion goes to 0 or 1 while the
logarithmic Sobolev constant may surprisingly blow up. This counter-intuitive
result is not reducible to support disconnections, and is reminiscent of the
variance-entropy comparison on the two-point space. As far
as mixtures are concerned, the logarithmic Sobolev inequality is less stable
than the Poincaré inequality and the sub-Gaussian concentration for Lipschitz
functions. We illustrate our results on a gallery of concrete two-component
mixtures. This work leads to many open questions.
Comment: Corrections. To appear in Annales de l'Institut Henri Poincaré (AIHP)
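For reference, with the most common normalizations (the constants used in the paper may differ by fixed factors), a probability measure $\mu$ on $\mathbb{R}^d$ satisfies a Poincaré inequality with constant $C_P$ if, for all smooth $f$,
\[
\operatorname{Var}_\mu(f) \;\le\; C_P \int |\nabla f|^2 \, d\mu,
\]
and a logarithmic Sobolev inequality with constant $C_{LS}$ if
\[
\operatorname{Ent}_\mu(f^2) \;\le\; C_{LS} \int |\nabla f|^2 \, d\mu,
\qquad \text{where } \operatorname{Ent}_\mu(g) = \int g \log g \, d\mu - \Bigl(\int g \, d\mu\Bigr) \log\Bigl(\int g \, d\mu\Bigr).
\]
The statement above is that, for a two-component mixture $\mu = p\,\mu_0 + (1-p)\,\mu_1$, the constant $C_P$ may remain bounded as $p \to 0$ or $p \to 1$ while $C_{LS}$ blows up.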
Entropy, compound Poisson approximation, log-Sobolev inequalities and measure concentration
The problem of approximating the distribution of a sum $S_n = \sum_{i=1}^{n} Y_i$ of $n$ discrete random variables $Y_i$ by a Poisson or a compound Poisson distribution arises naturally in many classical and current applications, such as statistical genetics, dynamical systems, the recurrence properties of Markov processes and reliability theory. Using information-theoretic ideas and techniques, we derive a family of new bounds for compound Poisson approximation. We take an approach similar to that of Kontoyiannis, Harremoës and Johnson (2003), and we generalize some of their Poisson approximation bounds to the compound Poisson case. Partly motivated by these results, we derive a new logarithmic Sobolev inequality for the compound Poisson measure and use it to prove measure-concentration bounds for a large class of discrete distributions.
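For context, the compound Poisson law in question is the standard one (the notation here is ours, not necessarily the paper's): $\mathrm{CP}(\lambda, Q)$, with rate $\lambda > 0$ and compounding distribution $Q$, is the law of
\[
\sum_{i=1}^{N} X_i, \qquad N \sim \mathrm{Poisson}(\lambda), \qquad X_1, X_2, \ldots \ \text{i.i.d.} \sim Q \ \text{independent of } N,
\]
which reduces to the $\mathrm{Poisson}(\lambda)$ distribution when $Q$ is the point mass at $1$.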