Sumset and Inverse Sumset Inequalities for Differential Entropy and Mutual Information
The sumset and inverse sumset theories of Freiman, Pl\"{u}nnecke and Ruzsa give bounds connecting the cardinality of the sumset $A+B = \{a+b : a \in A,\, b \in B\}$ of two discrete sets $A, B$ to the cardinalities (or the finer structure) of the original sets $A, B$. For example, the sum-difference bound of Ruzsa states that $|A-B|\,|A|\,|B| \le |A+B|^3$, where the difference set $A-B = \{a-b : a \in A,\, b \in B\}$. Interpreting the differential entropy $h(X)$ of a continuous random variable $X$ as (the logarithm of) the size of the effective support of $X$, the main contribution of this paper is a series of natural information-theoretic analogs for these results. For example, the Ruzsa sum-difference bound becomes the new inequality $h(X-Y) + h(X) + h(Y) \le 3h(X+Y)$ for any pair of independent continuous random variables $X$ and $Y$.
Our results include differential-entropy versions of Ruzsa's triangle
inequality, the Pl\"{u}nnecke-Ruzsa inequality, and the
Balog-Szemer\'{e}di-Gowers lemma. Also we give a differential entropy version
of the Freiman-Green-Ruzsa inverse-sumset theorem, which can be seen as a
quantitative converse to the entropy power inequality. Versions of most of
these results for the discrete entropy were recently proved by Tao, relying heavily on a strong, functional form of the submodularity property of the discrete entropy. Since differential entropy is {\em not} functionally submodular, many of the corresponding discrete proofs fail in the continuous case, often requiring substantially new proof strategies. We find that the basic property that naturally replaces discrete functional submodularity is the data processing property of mutual information.
Comment: 23 pages
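For orientation, one concrete instance of the sumset-to-entropy dictionary described above is Ruzsa's triangle inequality for finite sets and its natural differential-entropy analog, obtained by replacing log-cardinalities of difference sets with entropies of differences of independent random variables (a sketch of the correspondence; precise hypotheses are as stated in the paper):
\[
|A - C| \;\le\; \frac{|A-B|\,|B-C|}{|B|}
\qquad\longleftrightarrow\qquad
h(X - Z) \;\le\; h(X - Y) + h(Y - Z) - h(Y),
\]
for finite sets $A, B, C$ and independent continuous random variables $X, Y, Z$.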
Two remarks on generalized entropy power inequalities
This note contributes to the understanding of generalized entropy power
inequalities. Our main goal is to construct a counter-example regarding
monotonicity and entropy comparison of weighted sums of independent identically
distributed log-concave random variables. We also present a complex analogue of
a recent dependent entropy power inequality of Hao and Jog, and give a very simple proof.
Comment: arXiv:1811.00345 is split into 2 papers, with this being one of them
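As background (a standard fact recalled here for orientation, not a result of this note): the generalized entropy power inequalities in question descend from Shannon's classical entropy power inequality, which for independent $\mathbb{R}^d$-valued random vectors $X$ and $Y$ with densities reads
\[
N(X + Y) \;\ge\; N(X) + N(Y),
\qquad\text{where}\quad
N(X) = \frac{1}{2\pi e}\, e^{2h(X)/d}.
\]
Monotonicity-type generalizations compare such entropy powers along weighted sums of i.i.d. summands, which is the setting of the counter-example constructed here.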
Conditional R\'enyi entropy and the relationships between R\'enyi capacities
The analogues of Arimoto's definition of conditional R\'enyi entropy and
R\'enyi mutual information are explored for abstract alphabets. These
quantities, although dependent on the reference measure, have some useful
properties similar to those known in the discrete setting. In addition to
laying out some such basic properties and the relations to R\'enyi divergences,
the relationships between the families of mutual informations defined by
Sibson, Augustin-Csisz\'ar, and Lapidoth-Pfister, as well as the corresponding
capacities, are explored.
Comment: 17 pages, 1 figure
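As a reminder of the discrete prototypes behind these definitions (the paper itself works on abstract alphabets relative to a reference measure), the R\'enyi entropy of order $\alpha \in (0,1) \cup (1,\infty)$ and Arimoto's conditional version are commonly written as
\[
H_\alpha(X) = \frac{1}{1-\alpha} \log \sum_x p(x)^\alpha,
\qquad
H_\alpha^{\mathrm{A}}(X \mid Y) = \frac{\alpha}{1-\alpha} \log \sum_y \Bigl( \sum_x p(x,y)^\alpha \Bigr)^{1/\alpha},
\]
both of which recover the Shannon quantities as $\alpha \to 1$.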
Countably Infinite Multilevel Source Polarization for Non-Stationary Erasure Distributions
Polar transforms are central operations in the study of polar codes. This
paper examines polar transforms for non-stationary memoryless sources on
possibly infinite source alphabets. This is the first attempt at source polarization analysis over infinite alphabets. The source alphabet is taken to be a Polish group, and we study the Ar{\i}kan-style two-by-two polar transform defined over the group. Defining erasure distributions in terms of the normal subgroup structure, we give recursive formulas for the polar transform of our proposed erasure distributions. As a result, the recursive formulas
lead to concrete examples of multilevel source polarization with countably
infinite levels when the group is locally cyclic. We derive this result via
elementary techniques in lattice theory.
Comment: 12 pages, 1 figure; a short version has been accepted by the 2019 IEEE International Symposium on Information Theory (ISIT2019)
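For context, the Ar{\i}kan-style two-by-two transform and the textbook erasure recursion it induces in the binary, single-level case are (a standard fact included for orientation; the paper's formulas generalize this to group-valued sources, normal-subgroup-based erasure distributions, and non-stationary parameters)
\[
(U_1, U_2) = (X_1 + X_2,\; X_2),
\qquad
\epsilon^- = \epsilon_1 + \epsilon_2 - \epsilon_1 \epsilon_2,
\qquad
\epsilon^+ = \epsilon_1 \epsilon_2,
\]
where $\epsilon_1, \epsilon_2$ are the erasure probabilities of the two inputs; in the stationary case $\epsilon_1 = \epsilon_2 = \epsilon$ this reduces to the familiar pair $2\epsilon - \epsilon^2$ and $\epsilon^2$.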
Information Inequalities via Ideas from Additive Combinatorics
Ruzsa's equivalence theorem provided a framework for converting certain
families of inequalities in additive combinatorics to entropic inequalities
(which sometimes did not possess stand-alone entropic proofs). In this work, we
first establish formal equivalences between some families (different from
Ruzsa) of inequalities in additive combinatorics and entropic ones. As a first
step toward furthering these equivalences, we establish an information-theoretic characterization of the magnification ratio that could also be of independent interest.
Comment: 15 pages. The authors were made aware that some of the results had been obtained earlier; the revised version acknowledges and references this work. A conference version of this paper was published in the proceedings of IEEE ISIT 2023.
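For reference, one standard definition of the magnification ratio (stated here for orientation in the bipartite-graph form used in Pl\"{u}nnecke's method; the exact setting characterized in the paper may differ) is
\[
\mu(G) \;=\; \min_{\emptyset \neq S \subseteq U} \frac{|N(S)|}{|S|},
\]
where $G$ is a bipartite graph with parts $U$ and $V$, and $N(S) \subseteq V$ denotes the neighborhood of $S$.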