Entropy and set cardinality inequalities for partition-determined functions
A new notion of partition-determined functions is introduced, and several
basic inequalities are developed for the entropy of such functions of
independent random variables, as well as for cardinalities of compound sets
obtained using these functions. Here a compound set means a set obtained by
varying each argument of a function of several variables over a set associated
with that argument, where all the sets are subsets of an appropriate algebraic
structure so that the function is well defined. On the one hand, the entropy
inequalities developed for partition-determined functions imply entropic
analogues of general inequalities of Pl\"unnecke-Ruzsa type. On the other hand,
the cardinality inequalities developed for compound sets imply several
inequalities for sumsets, including for instance a generalization of
inequalities proved by Gyarmati, Matolcsi and Ruzsa (2010). We also provide
partial progress towards a conjecture of Ruzsa (2007) for sumsets in nonabelian
groups. All proofs are elementary and rely on properly developing certain
information-theoretic inequalities.

Comment: 26 pages. v2: Revised version incorporating referee feedback plus inclusion of some additional corollaries and discussion. v3: Final version with minor corrections. To appear in Random Structures and Algorithms
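To make the flavor of these entropic Pl\"unnecke-Ruzsa analogues concrete (a standard example from this circle of ideas, not quoted from the paper): for independent discrete random variables $X, Y, Z$, the entropy form of Ruzsa's triangle inequality states
\[ H(X - Z) + H(Y) \leq H(X - Y) + H(Y - Z), \]
which mirrors the sumset inequality $|A - C|\,|B| \leq |A - B|\,|B - C|$ under the dictionary $\log|\cdot| \leftrightarrow H(\cdot)$.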
Entropy methods for sumset inequalities
In this thesis we present several analogies between sumset inequalities and entropy inequalities. We offer an overview of the different results and techniques that have been developed during the last ten years, starting with a seminal paper by Ruzsa, and also studied by authors such as Bollobás, Madiman, or Tao. After an introduction to the tools from sumset theory and entropy theory, we present and prove many sumset inequalities and their entropy analogues, with a particular emphasis on Plünnecke-type results. Functional submodularity is used to prove many of these, as well as an analogue of the Balog-Szemerédi-Gowers theorem. Partition-determined functions are used to obtain many sumset inequalities analogous to some new entropic results. Their use is generalized to other contexts, such as that of projections or polynomial compound sets. Furthermore, we present a generalization of a tool introduced by Ruzsa by extending it to a much more general setting than that of sumsets. We show how it can be used to obtain many entropy inequalities in a direct and unified way, and we extend its use to more general compound sets. Finally, we show how this device may help in finding new expanders.
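A minimal instance of the sumset-entropy dictionary the thesis builds on (an elementary observation, stated here for orientation): if $X$ and $Y$ are independent and uniformly distributed on finite sets $A$ and $B$, then
\[ H(X) = \log|A|, \qquad H(X + Y) \leq \log|A + B|, \]
since $X + Y$ is supported on $A + B$; entropy inequalities for sums of independent random variables thus translate into cardinality bounds for sumsets.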
Sumset and Inverse Sumset Inequalities for Differential Entropy and Mutual Information
The sumset and inverse sumset theories of Freiman, Pl\"{u}nnecke and Ruzsa,
give bounds connecting the cardinality of the sumset $A+B$ of two discrete sets $A, B$, to the cardinalities (or the finer structure) of the original sets $A, B$. For example, the sum-difference bound of Ruzsa states that, $|A+B|\,|A|\,|B| \leq |A-B|^3$, where the difference set $A-B = \{a - b : a \in A, \, b \in B\}$. Interpreting the differential entropy $h(X)$ of a continuous random variable $X$ as (the logarithm of) the size of the effective support of $X$, the main contribution of this paper is a series of natural information-theoretic analogs for these results. For example, the Ruzsa sum-difference bound becomes the new inequality, $h(X+Y) + h(X) + h(Y) \leq 3h(X-Y)$, for any pair of independent continuous random variables $X$ and $Y$.
Our results include differential-entropy versions of Ruzsa's triangle
inequality, the Pl\"{u}nnecke-Ruzsa inequality, and the
Balog-Szemer\'{e}di-Gowers lemma. Also we give a differential entropy version
of the Freiman-Green-Ruzsa inverse-sumset theorem, which can be seen as a
quantitative converse to the entropy power inequality. Versions of most of
these results for the discrete entropy $H$ were recently proved by Tao, relying heavily on a strong, functional form of the submodularity property of $H$. Since differential entropy is {\em not} functionally submodular, in the continuous case many of the corresponding discrete proofs fail, in many cases requiring substantially new proof strategies. We find that the basic property that naturally replaces the discrete functional submodularity is the data processing property of mutual information.

Comment: 23 pages
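For reference, the data processing property invoked above (a standard fact, not specific to this paper) states that mutual information cannot increase under processing: if $X \to Y \to Z$ is a Markov chain, then
\[ I(X; Z) \leq I(X; Y), \]
and in particular $I(X; f(Y)) \leq I(X; Y)$ for any measurable function $f$. Unlike the functional submodularity of $H$, this holds verbatim for continuous random variables, which is why it can serve as the replacement tool in the differential-entropy setting.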
The convexification effect of Minkowski summation
Let us define for a compact set $A \subset \mathbb{R}^n$ the sequence $A(k) = \left\{ \frac{a_1 + \cdots + a_k}{k} : a_1, \ldots, a_k \in A \right\} = \frac{1}{k}(\underbrace{A + \cdots + A}_{k\ \text{times}})$. It was independently proved by Shapley, Folkman and Starr (1969) and by Emerson and Greenleaf (1969) that $A(k)$ approaches the convex hull of $A$ in the Hausdorff distance induced by the Euclidean norm as $k$ goes to $\infty$. We explore in this survey how exactly $A(k)$ approaches the convex hull of $A$, and more generally, how a Minkowski sum of possibly different
compact sets approaches convexity, as measured by various indices of
non-convexity. The non-convexity indices considered include the Hausdorff
distance induced by any norm on $\mathbb{R}^n$, the volume deficit (the
difference of volumes), a non-convexity index introduced by Schneider (1975),
and the effective standard deviation or inner radius. After first clarifying
the interrelationships between these various indices of non-convexity, which
were previously either unknown or scattered in the literature, we show that the
volume deficit of $A(k)$ does not monotonically decrease to 0 in dimension 12
or above, thus falsifying a conjecture of Bobkov et al. (2011), even though
their conjecture is proved to be true in dimension 1 and for certain sets $A$
with special structure. On the other hand, Schneider's index possesses a strong
monotonicity property along the sequence $A(k)$, and both the Hausdorff
distance and effective standard deviation are eventually monotone (once $k$ exceeds $n$). Along the way, we obtain new inequalities for the volume of the
Minkowski sum of compact sets, falsify a conjecture of Dyn and Farkhi (2004),
demonstrate applications of our results to combinatorial discrepancy theory,
and suggest some questions worthy of further investigation.

Comment: 60 pages, 7 figures. v2: Title changed. v3: Added Section 7.2 resolving the Dyn-Farkhi conjecture
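A one-dimensional toy case illustrating the convexification phenomenon (our illustration, not taken from the survey): for $A = \{0, 1\} \subset \mathbb{R}$ one has
\[ A(k) = \left\{0, \tfrac{1}{k}, \tfrac{2}{k}, \ldots, 1\right\}, \]
whose Hausdorff distance from $\mathrm{conv}(A) = [0, 1]$ equals $1/(2k)$, so $A(k)$ approaches the convex hull at rate $O(1/k)$, consistent with the Shapley-Folkman-Starr theorem.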
Conditional R\'enyi entropy and the relationships between R\'enyi capacities
The analogues of Arimoto's definition of conditional R\'enyi entropy and
R\'enyi mutual information are explored for abstract alphabets. These
quantities, although dependent on the reference measure, have some useful
properties similar to those known in the discrete setting. In addition to
laying out some such basic properties and the relations to R\'enyi divergences,
the relationships between the families of mutual informations defined by
Sibson, Augustin-Csisz\'ar, and Lapidoth-Pfister, as well as the corresponding
capacities, are explored.

Comment: 17 pages, 1 figure
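For orientation, the discrete-alphabet definition being generalized (stated as background, not quoted from the paper) is Arimoto's conditional R\'enyi entropy of order $\alpha \in (0, 1) \cup (1, \infty)$:
\[ H_\alpha(X \mid Y) = \frac{\alpha}{1 - \alpha} \log \sum_y p_Y(y) \Big( \sum_x p_{X|Y}(x \mid y)^\alpha \Big)^{1/\alpha}, \]
from which a R\'enyi mutual information may be defined as $I_\alpha(X; Y) = H_\alpha(X) - H_\alpha(X \mid Y)$; the paper studies how such quantities extend to abstract alphabets, where a reference measure enters.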