501 research outputs found

    A strengthened entropy power inequality for log-concave densities

    We show that Shannon's entropy power inequality admits a strengthened version in the case in which the densities are log-concave. In that case, one can in fact extend the Blachman–Stam argument to obtain a sharp inequality for the second derivative of Shannon's entropy functional with respect to the heat semigroup. Comment: 21 pages.
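
    For reference, the classical statement being strengthened here is the standard form of the entropy power inequality (recalled from common knowledge, not quoted from this abstract): for independent random vectors X and Y in R^n with densities,

        % Standard EPI and entropy power definition (background, not from the paper itself)
        N(X+Y) \;\ge\; N(X) + N(Y),
        \qquad
        N(X) \;=\; \frac{1}{2\pi e}\, e^{2h(X)/n},

    where h denotes differential entropy; equality holds if and only if X and Y are Gaussian with proportional covariances. The strengthening described above refines this picture along the heat semigroup for log-concave densities.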

    Rényi Entropy Power Inequalities via Normal Transport and Rotation

    Following a recent proof of Shannon's entropy power inequality (EPI), a comprehensive framework for deriving various EPIs for the Rényi entropy is presented, based on transport arguments from normal densities and a change of variable by rotation. Simple arguments are given to recover the previously known Rényi EPIs and to derive new ones, unifying a multiplicative form with constant c and a modification with exponent α from previous works. In particular, for log-concave densities, we obtain a simple transportation proof of a sharp varentropy bound. Comment: 17 pages. Entropy Journal, to appear.
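
    As background (standard definitions, stated here from common knowledge and up to normalization conventions, not quoted from the abstract), the Rényi entropy of order α ≠ 1 and the associated Rényi entropy power are

        % Rényi entropy and Rényi entropy power (background definitions)
        h_\alpha(X) \;=\; \frac{1}{1-\alpha}\,\log \int_{\mathbb{R}^n} f^\alpha(x)\,dx,
        \qquad
        N_\alpha(X) \;=\; e^{2 h_\alpha(X)/n},

    and the two families of Rényi EPIs unified above take the shapes N_α(X+Y) ≥ c (N_α(X) + N_α(Y)) and N_α(X+Y)^λ ≥ N_α(X)^λ + N_α(Y)^λ, with the constant c and the exponent λ depending on α as specified in the works the abstract cites.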

    Information Theoretic Proofs of Entropy Power Inequalities

    While most useful information theoretic inequalities can be deduced from the basic properties of entropy or mutual information, Shannon's entropy power inequality (EPI) has so far been an exception: existing information theoretic proofs of the EPI hinge on representations of differential entropy using either Fisher information or the minimum mean-square error (MMSE), which are derived from de Bruijn's identity. In this paper, we first present a unified view of these proofs, showing that they share two essential ingredients: 1) a data processing argument applied to a covariance-preserving linear transformation; 2) an integration over a path of a continuous Gaussian perturbation. Using these ingredients, we develop a new and brief proof of the EPI through a mutual information inequality, which replaces Stam and Blachman's Fisher information inequality (FII) and an inequality for the MMSE by Guo, Shamai and Verdú used in earlier proofs. The result has the advantage of being very simple in that it relies only on the basic properties of mutual information. These ideas are then generalized to various extended versions of the EPI: Zamir and Feder's generalized EPI for linear transformations of the random variables, Takano and Johnson's EPI for dependent variables, Liu and Viswanath's covariance-constrained EPI, and Costa's concavity inequality for the entropy power. Comment: submitted for publication in the IEEE Transactions on Information Theory, revised version.
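
    For context, the two classical representations mentioned above rest on identities that can be stated briefly (standard facts, recalled here for orientation rather than quoted from the paper): de Bruijn's identity ties differential entropy to Fisher information along a Gaussian perturbation, and the Stam–Blachman Fisher information inequality governs its behavior under convolution,

        % de Bruijn's identity and the Fisher information inequality (background, standard statements)
        \frac{d}{dt}\, h\!\left(X + \sqrt{t}\,Z\right) \;=\; \tfrac{1}{2}\, J\!\left(X + \sqrt{t}\,Z\right),
        \qquad
        \frac{1}{J(X+Y)} \;\ge\; \frac{1}{J(X)} + \frac{1}{J(Y)},

    where Z is a standard Gaussian vector independent of X, X and Y are independent, and J denotes Fisher information; integrating such relations over the Gaussian perturbation path is precisely the second ingredient identified in the abstract.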