
    Information Theoretic Proofs of Entropy Power Inequalities

    While most useful information theoretic inequalities can be deduced from the basic properties of entropy or mutual information, Shannon's entropy power inequality (EPI) has so far been an exception: existing information theoretic proofs of the EPI hinge on representations of differential entropy using either Fisher information or minimum mean-square error (MMSE), which are derived from de Bruijn's identity. In this paper, we first present a unified view of these proofs, showing that they share two essential ingredients: 1) a data processing argument applied to a covariance-preserving linear transformation; 2) an integration over a path of a continuous Gaussian perturbation. Using these ingredients, we develop a new and brief proof of the EPI through a mutual information inequality, which replaces Stam and Blachman's Fisher information inequality (FII) and an inequality for MMSE by Guo, Shamai and Verdú used in earlier proofs. The result has the advantage of being very simple in that it relies only on the basic properties of mutual information. These ideas are then generalized to various extended versions of the EPI: Zamir and Feder's generalized EPI for linear transformations of the random variables, Takano and Johnson's EPI for dependent variables, Liu and Viswanath's covariance-constrained EPI, and Costa's concavity inequality for the entropy power. Comment: submitted for publication in the IEEE Transactions on Information Theory, revised version.
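    For reference, the two standard facts behind this discussion can be stated as follows; this is the usual textbook formulation, and the notation $h$, $J$ is not taken from the abstract. For independent random vectors $X$ and $Y$ in $\mathbb{R}^n$ with densities, the EPI reads
    \[ e^{2h(X+Y)/n} \;\geq\; e^{2h(X)/n} + e^{2h(Y)/n}, \]
    with equality when $X$ and $Y$ are Gaussian with proportional covariances. De Bruijn's identity, for $Z$ standard Gaussian and independent of $X$, relates differential entropy to Fisher information:
    \[ \frac{\partial}{\partial t}\, h\bigl(X + \sqrt{t}\,Z\bigr) = \tfrac{1}{2}\, J\bigl(X + \sqrt{t}\,Z\bigr). \]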

    A Simple Proof of the Entropy-Power Inequality via Properties of Mutual Information

    While most useful information theoretic inequalities can be deduced from the basic properties of entropy or mutual information, Shannon's entropy power inequality (EPI) seems to be an exception: available information theoretic proofs of the EPI hinge on integral representations of differential entropy using either Fisher's information (FI) or minimum mean-square error (MMSE). In this paper, we first present a unified view of proofs via FI and MMSE, showing that they are essentially dual versions of the same proof, and then fill the gap by providing a new, simple proof of the EPI, which is solely based on the properties of mutual information and sidesteps both FI and MMSE representations. Comment: 5 pages, accepted for presentation at the IEEE International Symposium on Information Theory 200
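    The MMSE representation alluded to here is the Guo–Shamai–Verdú (I-MMSE) relation; a standard scalar statement is given below for context (the notation $\gamma$ and $\mathrm{mmse}$ is the conventional one, not the abstract's):
    \[ \frac{d}{d\gamma}\, I\bigl(X;\ \sqrt{\gamma}\,X + Z\bigr) = \tfrac{1}{2}\,\mathrm{mmse}(\gamma), \qquad \mathrm{mmse}(\gamma) = \mathbb{E}\Bigl[\bigl(X - \mathbb{E}[X \mid \sqrt{\gamma}\,X + Z]\bigr)^2\Bigr], \]
    where $Z$ is standard Gaussian and independent of $X$; integrating this identity over $\gamma$ yields an integral representation of mutual information, and hence of differential entropy.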

    Higher Order Derivatives in Costa's Entropy Power Inequality

    Let $X$ be an arbitrary continuous random variable and $Z$ be an independent Gaussian random variable with zero mean and unit variance. For $t > 0$, Costa proved that $e^{2h(X+\sqrt{t}Z)}$ is concave in $t$, where the proof hinged on the first and second order derivatives of $h(X+\sqrt{t}Z)$. Specifically, these two derivatives are signed, i.e., $\frac{\partial}{\partial t}h(X+\sqrt{t}Z) \geq 0$ and $\frac{\partial^2}{\partial t^2}h(X+\sqrt{t}Z) \leq 0$. In this paper, we show that the third order derivative of $h(X+\sqrt{t}Z)$ is nonnegative, which implies that the Fisher information $J(X+\sqrt{t}Z)$ is convex in $t$. We further show that the fourth order derivative of $h(X+\sqrt{t}Z)$ is nonpositive. Following the first four derivatives, we make two conjectures on $h(X+\sqrt{t}Z)$: the first is that $\frac{\partial^n}{\partial t^n} h(X+\sqrt{t}Z)$ is nonnegative in $t$ if $n$ is odd, and nonpositive otherwise; the second is that $\log J(X+\sqrt{t}Z)$ is convex in $t$. The first conjecture can be rephrased in the context of completely monotone functions: $J(X+\sqrt{t}Z)$ is completely monotone in $t$. The history of the first conjecture may date back to a problem in mathematical physics studied by McKean in 1966. Apart from these results, we provide a geometrical interpretation of the covariance-preserving transformation and study the concavity of $h(\sqrt{t}X+\sqrt{1-t}Z)$, revealing its connection with Costa's EPI. Comment: Second version submitted. https://sites.google.com/site/chengfancuhk
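    A quick Gaussian sanity check, offered here only as an illustration and not taken from the paper: for $X \sim N(0,\sigma^2)$ one has $h(X+\sqrt{t}Z) = \tfrac{1}{2}\log\bigl(2\pi e(\sigma^2+t)\bigr)$, so
    \[ \frac{\partial^n}{\partial t^n}\, h\bigl(X+\sqrt{t}\,Z\bigr) = \frac{(-1)^{n-1}\,(n-1)!}{2\,(\sigma^2+t)^{n}}, \]
    which is nonnegative for odd $n$ and nonpositive for even $n$, while $J(X+\sqrt{t}Z) = (\sigma^2+t)^{-1}$ is completely monotone and $\log J(X+\sqrt{t}Z) = -\log(\sigma^2+t)$ is convex, consistent with both conjectures.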

    Geometric inequalities from phase space translations

    We establish a quantum version of the classical isoperimetric inequality relating the Fisher information and the entropy power of a quantum state. The key tool is a Fisher information inequality for a state which results from a certain convolution operation: the latter maps a classical probability distribution on phase space and a quantum state to a quantum state. We show that this inequality also gives rise to several related inequalities whose counterparts are well-known in the classical setting: in particular, it implies an entropy power inequality for the mentioned convolution operation as well as the isoperimetric inequality, and establishes concavity of the entropy power along trajectories of the quantum heat diffusion semigroup. As an application, we derive a Log-Sobolev inequality for the quantum Ornstein-Uhlenbeck semigroup, and argue that it implies fast convergence towards the fixed point for a large class of initial states. Comment: 37 pages; updated to match published version.
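    The classical statement being generalized is the isoperimetric inequality for entropies, recalled here for context in the usual notation (which is an assumption, not the paper's): for a random vector $X$ in $\mathbb{R}^n$ with a smooth density,
    \[ N(X)\, J(X) \;\geq\; n, \qquad N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n}, \]
    with equality when $X$ is Gaussian; the quantum version replaces the entropy power and Fisher information with their quantum counterparts.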

    Maximal correlation and the rate of Fisher information convergence in the Central Limit Theorem

    We consider the behaviour of the Fisher information of scaled sums of independent and identically distributed random variables in the Central Limit Theorem regime. We show how this behaviour can be related to the second-largest non-trivial eigenvalue associated with the Hirschfeld–Gebelein–Rényi maximal correlation. We prove that, assuming this eigenvalue satisfies a strict inequality, an $O(1/n)$ rate of convergence and a strengthened form of monotonicity hold.
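    For context, the quantity whose decay is typically tracked in this line of work is the standardized Fisher information of the normalized sum; the notation below is the conventional one and is an assumption, not taken from the abstract. For i.i.d. $X_1,\dots,X_n$ with mean zero and variance $\sigma^2$,
    \[ J_{\mathrm{st}}(S_n) := \sigma^2 J(S_n) - 1 \;\geq\; 0, \qquad S_n = \frac{1}{\sqrt{n}} \sum_{i=1}^{n} X_i, \]
    and the $O(1/n)$ rate in the abstract is naturally read as $J_{\mathrm{st}}(S_n) = O(1/n)$ under the stated spectral-gap condition on the maximal correlation.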