43 research outputs found

    Entropy and set cardinality inequalities for partition-determined functions

    A new notion of partition-determined functions is introduced, and several basic inequalities are developed for the entropy of such functions of independent random variables, as well as for cardinalities of compound sets obtained using these functions. Here a compound set means a set obtained by varying each argument of a function of several variables over a set associated with that argument, where all the sets are subsets of an appropriate algebraic structure so that the function is well defined. On the one hand, the entropy inequalities developed for partition-determined functions imply entropic analogues of general inequalities of Plünnecke-Ruzsa type. On the other hand, the cardinality inequalities developed for compound sets imply several inequalities for sumsets, including for instance a generalization of inequalities proved by Gyarmati, Matolcsi and Ruzsa (2010). We also provide partial progress towards a conjecture of Ruzsa (2007) for sumsets in nonabelian groups. All proofs are elementary and rely on properly developing certain information-theoretic inequalities.
    Comment: 26 pages. v2: Revised version incorporating referee feedback plus inclusion of some additional corollaries and discussion. v3: Final version with minor corrections. To appear in Random Structures and Algorithms
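    Sumset-cardinality inequalities of this family can be sanity-checked on small examples. The sketch below (purely illustrative; the sets are arbitrary and this is not the paper's machinery) verifies a classical inequality of Plünnecke-Ruzsa type, Ruzsa's triangle inequality |A-C| |B| <= |A-B| |B-C|, on small integer sets:

```python
from itertools import product

def difference_set(A, B):
    """Compound set {a - b : a in A, b in B}."""
    return {a - b for a, b in product(A, B)}

# Small integer sets chosen arbitrarily for illustration.
A = {0, 1, 3, 7}
B = {0, 2, 5}
C = {1, 4, 6}

# Ruzsa's triangle inequality: |A - C| * |B| <= |A - B| * |B - C|.
lhs = len(difference_set(A, C)) * len(B)
rhs = len(difference_set(A, B)) * len(difference_set(B, C))
assert lhs <= rhs
```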

    Entropy methods for sumset inequalities

    In this thesis we present several analogies between sumset inequalities and entropy inequalities. We offer an overview of the different results and techniques that have been developed during the last ten years, starting with a seminal paper by Ruzsa, and also studied by authors such as Bollobás, Madiman, or Tao. After an introduction to the tools from sumset theory and entropy theory, we present and prove many sumset inequalities and their entropy analogues, with a particular emphasis on Plünnecke-type results. Functional submodularity is used to prove many of these, as well as an analogue of the Balog-Szemerédi-Gowers theorem. Partition-determined functions are used to obtain many sumset inequalities analogous to some new entropic results. Their use is generalized to other contexts, such as that of projections or polynomial compound sets. Furthermore, we present a generalization of a tool introduced by Ruzsa by extending it to a much more general setting than that of sumsets. We show how it can be used to obtain many entropy inequalities in a direct and unified way, and we extend its use to more general compound sets. Finally, we show how this device may help in finding new expanders.
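    The basic sumset/entropy analogy can be illustrated numerically: the entropy counterpart of |A + B| <= |A| |B| is H(X + Y) <= H(X) + H(Y) for independent discrete X and Y. A minimal sketch, with arbitrary example distributions (not taken from the thesis):

```python
import math
from collections import defaultdict

def shannon_entropy(dist):
    """Shannon entropy in bits of a {value: probability} distribution."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def sum_distribution(px, py):
    """Distribution of X + Y for independent X ~ px and Y ~ py."""
    pz = defaultdict(float)
    for x, p in px.items():
        for y, q in py.items():
            pz[x + y] += p * q
    return dict(pz)

# Arbitrary example distributions on small integer supports.
px = {0: 0.5, 1: 0.3, 2: 0.2}
py = {0: 0.6, 1: 0.4}
pz = sum_distribution(px, py)

# Entropy analogue of |A + B| <= |A||B| for independent X, Y:
# H(X + Y) <= H(X) + H(Y).
assert shannon_entropy(pz) <= shannon_entropy(px) + shannon_entropy(py) + 1e-12
```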

    Sumset and Inverse Sumset Inequalities for Differential Entropy and Mutual Information

    The sumset and inverse sumset theories of Freiman, Plünnecke and Ruzsa give bounds connecting the cardinality of the sumset $A+B=\{a+b\;;\;a\in A,\,b\in B\}$ of two discrete sets $A,B$ to the cardinalities (or the finer structure) of the original sets $A,B$. For example, the sum-difference bound of Ruzsa states that $|A+B|\,|A|\,|B|\leq|A-B|^3$, where the difference set $A-B=\{a-b\;;\;a\in A,\,b\in B\}$. Interpreting the differential entropy $h(X)$ of a continuous random variable $X$ as (the logarithm of) the size of the effective support of $X$, the main contribution of this paper is a series of natural information-theoretic analogs for these results. For example, the Ruzsa sum-difference bound becomes the new inequality, $h(X+Y)+h(X)+h(Y)\leq 3h(X-Y)$, for any pair of independent continuous random variables $X$ and $Y$. Our results include differential-entropy versions of Ruzsa's triangle inequality, the Plünnecke-Ruzsa inequality, and the Balog-Szemerédi-Gowers lemma. Also we give a differential entropy version of the Freiman-Green-Ruzsa inverse-sumset theorem, which can be seen as a quantitative converse to the entropy power inequality. Versions of most of these results for the discrete entropy $H(X)$ were recently proved by Tao, relying heavily on a strong, functional form of the submodularity property of $H(X)$. Since differential entropy is not functionally submodular, in the continuous case many of the corresponding discrete proofs fail, in many cases requiring substantially new proof strategies. We find that the basic property that naturally replaces the discrete functional submodularity is the data processing property of mutual information.
    Comment: 23 pages
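    For independent Gaussians the sum-difference inequality h(X+Y) + h(X) + h(Y) <= 3 h(X-Y) can be checked in closed form, since both X+Y and X-Y have variance vx + vy; a small sketch with arbitrary variances (an illustration, not the paper's proof):

```python
import math

def gaussian_entropy(var):
    """Differential entropy (nats) of a Gaussian with variance var."""
    return 0.5 * math.log(2 * math.pi * math.e * var)

# Independent X ~ N(0, vx), Y ~ N(0, vy): both X + Y and X - Y
# have variance vx + vy.
vx, vy = 1.7, 0.4   # arbitrary positive variances
h_sum = gaussian_entropy(vx + vy)   # h(X + Y)
h_diff = gaussian_entropy(vx + vy)  # h(X - Y)

# Ruzsa sum-difference analogue: h(X+Y) + h(X) + h(Y) <= 3 h(X-Y),
# which for Gaussians reduces to vx * vy <= (vx + vy)**2.
lhs = h_sum + gaussian_entropy(vx) + gaussian_entropy(vy)
assert lhs <= 3 * h_diff + 1e-12
```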

    The convexification effect of Minkowski summation

    Let us define for a compact set $A \subset \mathbb{R}^n$ the sequence $$A(k) = \left\{\frac{a_1+\cdots+a_k}{k}: a_1, \ldots, a_k\in A\right\}=\frac{1}{k}\Big(\underset{k\ {\rm times}}{\underbrace{A + \cdots + A}}\Big).$$ It was independently proved by Shapley, Folkman and Starr (1969) and by Emerson and Greenleaf (1969) that $A(k)$ approaches the convex hull of $A$ in the Hausdorff distance induced by the Euclidean norm as $k$ goes to $\infty$. We explore in this survey how exactly $A(k)$ approaches the convex hull of $A$, and more generally, how a Minkowski sum of possibly different compact sets approaches convexity, as measured by various indices of non-convexity. The non-convexity indices considered include the Hausdorff distance induced by any norm on $\mathbb{R}^n$, the volume deficit (the difference of volumes), a non-convexity index introduced by Schneider (1975), and the effective standard deviation or inner radius. After first clarifying the interrelationships between these various indices of non-convexity, which were previously either unknown or scattered in the literature, we show that the volume deficit of $A(k)$ does not monotonically decrease to 0 in dimension 12 or above, thus falsifying a conjecture of Bobkov et al. (2011), even though their conjecture is proved to be true in dimension 1 and for certain sets $A$ with special structure. On the other hand, Schneider's index possesses a strong monotonicity property along the sequence $A(k)$, and both the Hausdorff distance and effective standard deviation are eventually monotone (once $k$ exceeds $n$). Along the way, we obtain new inequalities for the volume of the Minkowski sum of compact sets, falsify a conjecture of Dyn and Farkhi (2004), demonstrate applications of our results to combinatorial discrepancy theory, and suggest some questions worthy of further investigation.
    Comment: 60 pages, 7 figures. v2: Title changed. v3: Added Section 7.2 resolving the Dyn-Farkhi conjecture
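    The convexification phenomenon is easy to see in dimension 1: for A = {0, 1}, the Minkowski average A(k) is the grid {0, 1/k, ..., 1}, whose Hausdorff distance to conv(A) = [0, 1] shrinks as k grows. A minimal sketch (the set A and the helper names are illustrative, not from the survey):

```python
from itertools import product

def minkowski_average(A, k):
    """A(k) = {(a_1 + ... + a_k)/k : a_i in A} for a finite set A of reals."""
    return {sum(tup) / k for tup in product(A, repeat=k)}

def hausdorff_dist_to_interval(S, lo, hi):
    """Hausdorff distance from a finite S, subset of [lo, hi], to [lo, hi]."""
    pts = sorted(S)
    # S lies inside [lo, hi], so the distance is the largest half-gap
    # between consecutive points (plus the end gaps).
    gaps = [pts[0] - lo] + [(b - a) / 2 for a, b in zip(pts, pts[1:])] + [hi - pts[-1]]
    return max(gaps)

A = {0.0, 1.0}          # conv(A) = [0, 1]
d2 = hausdorff_dist_to_interval(minkowski_average(A, 2), 0.0, 1.0)
d8 = hausdorff_dist_to_interval(minkowski_average(A, 8), 0.0, 1.0)
assert d8 < d2          # A(k) approaches the convex hull as k grows
```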

    Conditional Rényi entropy and the relationships between Rényi capacities

    The analogues of Arimoto's definition of conditional Rényi entropy and Rényi mutual information are explored for abstract alphabets. These quantities, although dependent on the reference measure, have some useful properties similar to those known in the discrete setting. In addition to laying out some such basic properties and the relations to Rényi divergences, the relationships between the families of mutual informations defined by Sibson, Augustin-Csiszár, and Lapidoth-Pfister, as well as the corresponding capacities, are explored.
    Comment: 17 pages, 1 figure
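    In the discrete setting the order-α Rényi entropy is H_α(p) = (1/(1-α)) log Σ_i p_i^α; it is non-increasing in α and recovers the Shannon entropy in the limit α → 1. A small sketch of these two facts on an arbitrary example distribution (illustrative only; the paper's abstract-alphabet definitions are more general):

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy (nats) of order alpha != 1 for a probability vector p."""
    return math.log(sum(q ** alpha for q in p)) / (1 - alpha)

def shannon_entropy(p):
    """Shannon entropy (nats); the alpha -> 1 limit of Rényi entropy."""
    return -sum(q * math.log(q) for q in p if q > 0)

p = [0.5, 0.25, 0.25]
# Rényi entropy is non-increasing in the order alpha ...
assert renyi_entropy(p, 0.5) >= renyi_entropy(p, 2.0)
# ... and tends to the Shannon entropy as alpha -> 1.
assert abs(renyi_entropy(p, 1.000001) - shannon_entropy(p)) < 1e-4
```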