10,600 research outputs found

    A Simple Proof of the Entropy-Power Inequality via Properties of Mutual Information

    While most useful information theoretic inequalities can be deduced from the basic properties of entropy or mutual information, Shannon's entropy power inequality (EPI) seems to be an exception: available information theoretic proofs of the EPI hinge on integral representations of differential entropy using either Fisher's information (FI) or minimum mean-square error (MMSE). In this paper, we first present a unified view of proofs via FI and MMSE, showing that they are essentially dual versions of the same proof, and then fill the gap by providing a new, simple proof of the EPI, which is based solely on the properties of mutual information and sidesteps both the FI and MMSE representations. Comment: 5 pages, accepted for presentation at the IEEE International Symposium on Information Theory 2007.
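    For reference, in one common normalization the EPI referred to here states that for independent random vectors $X$ and $Y$ in $\mathbb{R}^n$ with densities,
    \[
      N(X+Y) \;\ge\; N(X) + N(Y), \qquad N(X) \;=\; \frac{1}{2\pi e}\, e^{2h(X)/n},
    \]
    with equality if and only if $X$ and $Y$ are Gaussian with proportional covariance matrices.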

    Information Theoretic Proofs of Entropy Power Inequalities

    While most useful information theoretic inequalities can be deduced from the basic properties of entropy or mutual information, Shannon's entropy power inequality (EPI) has so far been an exception: existing information theoretic proofs of the EPI hinge on representations of differential entropy using either Fisher information or minimum mean-square error (MMSE), which are derived from de Bruijn's identity. In this paper, we first present a unified view of these proofs, showing that they share two essential ingredients: 1) a data processing argument applied to a covariance-preserving linear transformation; 2) an integration over a path of a continuous Gaussian perturbation. Using these ingredients, we develop a new and brief proof of the EPI through a mutual information inequality, which replaces Stam and Blachman's Fisher information inequality (FII) and an inequality for MMSE by Guo, Shamai and Verd\'u used in earlier proofs. The result has the advantage of being very simple in that it relies only on the basic properties of mutual information. These ideas are then generalized to various extended versions of the EPI: Zamir and Feder's generalized EPI for linear transformations of the random variables, Takano and Johnson's EPI for dependent variables, Liu and Viswanath's covariance-constrained EPI, and Costa's concavity inequality for the entropy power. Comment: submitted for publication in the IEEE Transactions on Information Theory, revised version.
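    For context, the two representations mentioned above are, in their usual scalar forms (with $Z$ a standard Gaussian independent of $X$ and entropies in nats), de Bruijn's identity and the I-MMSE relation of Guo, Shamai and Verd\'u:
    \[
      \frac{d}{dt}\, h\bigl(X+\sqrt{t}\,Z\bigr) \;=\; \tfrac12\, J\bigl(X+\sqrt{t}\,Z\bigr),
      \qquad
      \frac{d}{d\gamma}\, I\bigl(X;\sqrt{\gamma}\,X+Z\bigr) \;=\; \tfrac12\, \mathrm{mmse}\bigl(X \mid \sqrt{\gamma}\,X+Z\bigr),
    \]
    where $J$ denotes Fisher information and $\mathrm{mmse}$ the minimum mean-square error of estimating $X$ from the noisy observation.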

    A multivariate generalization of Costa's entropy power inequality

    A simple multivariate version of Costa's entropy power inequality is proved. In particular, it is shown that if independent white Gaussian noise is added to an arbitrary multivariate signal, the entropy power of the resulting random variable is a multidimensional concave function of the individual variances of the components of the signal. As a side result, we also give an expression for the Hessian matrix of the entropy and entropy power functions with respect to the variances of the signal components, which is an interesting result in its own right. Comment: Proceedings of the 2008 IEEE International Symposium on Information Theory, Toronto, ON, Canada, July 6-11, 2008.
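    Costa's original inequality, which the abstract above generalizes, can be stated as follows: if $Z$ is standard white Gaussian noise independent of $X$, then
    \[
      t \;\longmapsto\; N\bigl(X+\sqrt{t}\,Z\bigr), \qquad t \ge 0,
    \]
    is a concave function of the single variance parameter $t$; the multivariate version considers concavity jointly in several per-component variances, as described above.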

    Yet Another Proof of the Entropy Power Inequality

    Yet another simple proof of the entropy power inequality is given, which avoids both the integration over a path of Gaussian perturbation and the use of Young's inequality with sharp constant or R\'enyi entropies. The proof is based on a simple change of variables, is formally identical in one and several dimensions, and easily settles the equality case.

    Conditional R\'enyi entropy and the relationships between R\'enyi capacities

    The analogues of Arimoto's definition of conditional R\'enyi entropy and R\'enyi mutual information are explored for abstract alphabets. These quantities, although dependent on the reference measure, have some useful properties similar to those known in the discrete setting. In addition to laying out some such basic properties and the relations to R\'enyi divergences, the relationships between the families of mutual informations defined by Sibson, Augustin-Csisz\'ar, and Lapidoth-Pfister, as well as the corresponding capacities, are explored. Comment: 17 pages, 1 figure.
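    In the discrete setting that these quantities extend, Arimoto's conditional R\'enyi entropy of order $\alpha \in (0,1)\cup(1,\infty)$ is usually written as
    \[
      H_\alpha^{\mathrm{A}}(X\mid Y) \;=\; \frac{\alpha}{1-\alpha}\, \log \sum_{y} \Bigl(\sum_{x} P_{XY}(x,y)^{\alpha}\Bigr)^{1/\alpha},
    \]
    which recovers the conditional Shannon entropy as $\alpha \to 1$; the abstract-alphabet analogues studied in the paper replace the sums by integrals against a reference measure.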

    Hessian and concavity of mutual information, differential entropy, and entropy power in linear vector Gaussian channels

    Within the framework of linear vector Gaussian channels with arbitrary signaling, closed-form expressions for the Jacobian of the minimum mean square error and Fisher information matrices with respect to arbitrary parameters of the system are calculated in this paper. Capitalizing on prior research where the minimum mean square error and Fisher information matrices were linked to information-theoretic quantities through differentiation, closed-form expressions for the Hessian of the mutual information and the differential entropy are derived. These expressions are then used to assess the concavity properties of mutual information and differential entropy under different channel conditions and also to derive a multivariate version of the entropy power inequality due to Costa. Comment: 33 pages, 2 figures. A shorter version of this paper is to appear in IEEE Transactions on Information Theory.
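    A representative identity behind the differentiation results built on here (stated for the real-valued additive model $Y = X + N$ with Gaussian noise of covariance $R_N$ independent of $X$) links the Fisher information matrix of the output to the MMSE matrix:
    \[
      J(Y) \;=\; R_N^{-1} \;-\; R_N^{-1} E\, R_N^{-1}, \qquad E \;=\; \mathbb{E}\bigl[(X-\mathbb{E}[X\mid Y])(X-\mathbb{E}[X\mid Y])^{\mathrm{T}}\bigr];
    \]
    together with de Bruijn-type differentiation of differential entropy, identities of this kind connect the MMSE and Fisher information matrices to the information-theoretic quantities whose Hessians are computed in the paper.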

    Bounds on Information Combining With Quantum Side Information

    "Bounds on information combining" are entropic inequalities that determine how the information (entropy) of a set of random variables can change when these are combined in certain prescribed ways. Such bounds play an important role in classical information theory, particularly in coding and Shannon theory; entropy power inequalities are special instances of them. The arguably most elementary kind of information combining is the addition of two binary random variables (a CNOT gate), and the resulting quantities play an important role in Belief propagation and Polar coding. We investigate this problem in the setting where quantum side information is available, which has been recognized as a hard setting for entropy power inequalities. Our main technical result is a non-trivial, and close to optimal, lower bound on the combined entropy, which can be seen as an almost optimal "quantum Mrs. Gerber's Lemma". Our proof uses three main ingredients: (1) a new bound on the concavity of von Neumann entropy, which is tight in the regime of low pairwise state fidelities; (2) the quantitative improvement of strong subadditivity due to Fawzi-Renner, in which we manage to handle the minimization over recovery maps; (3) recent duality results on classical-quantum-channels due to Renes et al. We furthermore present conjectures on the optimal lower and upper bounds under quantum side information, supported by interesting analytical observations and strong numerical evidence. We finally apply our bounds to Polar coding for binary-input classical-quantum channels, and show the following three results: (A) Even non-stationary channels polarize under the polar transform. (B) The blocklength required to approach the symmetric capacity scales at most sub-exponentially in the gap to capacity. (C) Under the aforementioned lower bound conjecture, a blocklength polynomial in the gap suffices.Comment: 23 pages, 6 figures; v2: small correction

    Sumset and Inverse Sumset Inequalities for Differential Entropy and Mutual Information

    The sumset and inverse sumset theories of Freiman, Pl\"{u}nnecke and Ruzsa give bounds connecting the cardinality of the sumset $A+B=\{a+b\;;\;a\in A,\,b\in B\}$ of two discrete sets $A,B$ to the cardinalities (or the finer structure) of the original sets $A,B$. For example, the sum-difference bound of Ruzsa states that $|A+B|\,|A|\,|B|\leq|A-B|^3$, where the difference set $A-B=\{a-b\;;\;a\in A,\,b\in B\}$. Interpreting the differential entropy $h(X)$ of a continuous random variable $X$ as (the logarithm of) the size of the effective support of $X$, the main contribution of this paper is a series of natural information-theoretic analogs for these results. For example, the Ruzsa sum-difference bound becomes the new inequality $h(X+Y)+h(X)+h(Y)\leq 3h(X-Y)$ for any pair of independent continuous random variables $X$ and $Y$. Our results include differential-entropy versions of Ruzsa's triangle inequality, the Pl\"{u}nnecke-Ruzsa inequality, and the Balog-Szemer\'{e}di-Gowers lemma. We also give a differential entropy version of the Freiman-Green-Ruzsa inverse-sumset theorem, which can be seen as a quantitative converse to the entropy power inequality. Versions of most of these results for the discrete entropy $H(X)$ were recently proved by Tao, relying heavily on a strong, functional form of the submodularity property of $H(X)$. Since differential entropy is {\em not} functionally submodular, many of the corresponding discrete proofs fail in the continuous case, often requiring substantially new proof strategies. We find that the basic property that naturally replaces discrete functional submodularity is the data processing property of mutual information. Comment: 23 pages.
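    As one concrete instance of this dictionary, the differential-entropy form of Ruzsa's triangle inequality obtained in this line of work reads, for independent continuous random variables $X$, $Y$, $Z$,
    \[
      h(X-Z) \;\le\; h(X-Y) + h(Y-Z) - h(Y),
    \]
    mirroring the set-cardinality bound $|A-C|\,|B| \le |A-B|\,|B-C|$.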