
    Information Theoretic Proofs of Entropy Power Inequalities

    While most useful information theoretic inequalities can be deduced from the basic properties of entropy or mutual information, Shannon's entropy power inequality (EPI) has so far been an exception: existing information theoretic proofs of the EPI hinge on representations of differential entropy using either Fisher information or minimum mean-square error (MMSE), which are derived from de Bruijn's identity. In this paper, we first present a unified view of these proofs, showing that they share two essential ingredients: 1) a data processing argument applied to a covariance-preserving linear transformation; 2) an integration over a path of a continuous Gaussian perturbation. Using these ingredients, we develop a new and brief proof of the EPI through a mutual information inequality, which replaces Stam and Blachman's Fisher information inequality (FII) and an inequality for MMSE by Guo, Shamai and Verd\'u used in earlier proofs. The result has the advantage of being very simple in that it relies only on the basic properties of mutual information. These ideas are then generalized to various extended versions of the EPI: Zamir and Feder's generalized EPI for linear transformations of the random variables, Takano and Johnson's EPI for dependent variables, Liu and Viswanath's covariance-constrained EPI, and Costa's concavity inequality for the entropy power.
    Comment: submitted for publication in the IEEE Transactions on Information Theory, revised version
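    For orientation (standard definitions, not quoted from the paper itself): with the entropy power of a random vector X in \mathbb{R}^n defined as N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n}, where h denotes differential entropy, the EPI states that for independent X and Y
        N(X+Y) \ge N(X) + N(Y),
    and de Bruijn's identity, which underlies the Fisher-information route mentioned above, reads
        \frac{d}{dt}\, h(X + \sqrt{t}\, Z) = \tfrac{1}{2}\, J(X + \sqrt{t}\, Z),
    with Z standard Gaussian independent of X and J the Fisher information.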

    A Simple Proof of the Entropy-Power Inequality via Properties of Mutual Information

    While most useful information theoretic inequalities can be deduced from the basic properties of entropy or mutual information, Shannon's entropy power inequality (EPI) seems to be an exception: available information theoretic proofs of the EPI hinge on integral representations of differential entropy using either Fisher's information (FI) or minimum mean-square error (MMSE). In this paper, we first present a unified view of proofs via FI and MMSE, showing that they are essentially dual versions of the same proof, and then fill the gap by providing a new, simple proof of the EPI that is based solely on the properties of mutual information and sidesteps both the FI and MMSE representations.
    Comment: 5 pages, accepted for presentation at the IEEE International Symposium on Information Theory 200
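    The MMSE representation referred to here is, in its standard scalar form (recalled for orientation, not quoted from the paper), the I-MMSE relation of Guo, Shamai and Verd\'u:
        \frac{d}{d\gamma}\, I(X; \sqrt{\gamma}\, X + Z) = \tfrac{1}{2}\, \mathrm{mmse}(X, \gamma), \qquad \mathrm{mmse}(X, \gamma) = E\big[(X - E[X \mid \sqrt{\gamma}\, X + Z])^2\big],
    with Z standard Gaussian independent of X; it is the dual of de Bruijn's identity, and it is exactly this kind of integral representation that the present proof sidesteps.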

    The information-theoretic meaning of Gagliardo--Nirenberg type inequalities

    Gagliardo--Nirenberg inequalities are interpolation inequalities which were proved independently by Gagliardo and Nirenberg in the late fifties. In recent years, their connections with information theory and nonlinear diffusion equations have made it possible to obtain some of them in optimal form, recovering both the sharp constants and the explicit form of the optimizers. In this note, in light of this recent research, we review the main connections between Shannon-type entropies, diffusion equations and a class of these inequalities.
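    A standard form of these inequalities, given here only as a reminder and with the constant left unspecified, is
        \|u\|_{L^p(\mathbb{R}^n)} \le C\, \|\nabla u\|_{L^r(\mathbb{R}^n)}^{\theta}\, \|u\|_{L^q(\mathbb{R}^n)}^{1-\theta}, \qquad \frac{1}{p} = \theta\Big(\frac{1}{r} - \frac{1}{n}\Big) + \frac{1-\theta}{q}, \quad \theta \in [0,1];
    the sharp forms mentioned above are known for particular sub-families of these exponents.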

    The conditional entropy power inequality for quantum additive noise channels

    We prove the quantum conditional Entropy Power Inequality for quantum additive noise channels. This inequality lower bounds the quantum conditional entropy of the output of an additive noise channel in terms of the quantum conditional entropies of the input state and the noise when they are conditionally independent given the memory. We also show that this conditional Entropy Power Inequality is optimal in the sense that equality can be achieved asymptotically by choosing a suitable sequence of Gaussian input states. We apply the conditional Entropy Power Inequality to derive an array of information-theoretic inequalities for conditional entropies that are the analogues of inequalities already established in the unconditioned setting. Furthermore, we give a simple proof of the convergence rate of the quantum Ornstein-Uhlenbeck semigroup based on Entropy Power Inequalities.
    Comment: 26 pages; updated to match published version
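    For orientation, the classical counterpart of this result is the conditional EPI
        e^{2h(X+Y \mid M)/n} \ge e^{2h(X \mid M)/n} + e^{2h(Y \mid M)/n}
    for random vectors X, Y in \mathbb{R}^n that are conditionally independent given a memory M; the quantum statement proved in the paper replaces differential entropies by conditional von Neumann entropies and the sum X+Y by the output of a quantum additive noise channel, so the precise coefficients and the optimality claim should be read from the paper itself.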

    A Unifying Variational Perspective on Some Fundamental Information Theoretic Inequalities

    This paper proposes a unifying variational approach for proving and extending some fundamental information theoretic inequalities. Fundamental information theoretic results such as the maximization of differential entropy, the minimization of Fisher information (Cram\'er-Rao inequality), the worst additive noise lemma, the entropy power inequality (EPI), and the extremal entropy inequality (EEI) are interpreted as functional problems and proved within the framework of the calculus of variations. Several applications and possible extensions of the proposed results are briefly mentioned.
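    Two of the extremal statements in this list can be written compactly (scalar case, variance \sigma^2 fixed; stated here only for context):
        h(X) \le \tfrac{1}{2}\log(2\pi e\, \sigma^2), \qquad J(X) \ge \frac{1}{\sigma^2},
    i.e. the Gaussian maximizes differential entropy and minimizes Fisher information, the latter being the Cram\'er-Rao inequality in its location form; the paper recasts such statements, together with the EPI and EEI, as problems in the calculus of variations.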

    Yet Another Proof of the Entropy Power Inequality

    Yet another simple proof of the entropy power inequality is given, which avoids both the integration over a path of Gaussian perturbation and the use of Young's inequality with sharp constant or R\'enyi entropies. The proof is based on a simple change of variables, is formally identical in one and several dimensions, and easily settles the equality case.
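    For context, the EPI is equivalent to Lieb's form
        h(\sqrt{\lambda}\, X + \sqrt{1-\lambda}\, Y) \ge \lambda\, h(X) + (1-\lambda)\, h(Y), \qquad \lambda \in [0,1],
    for independent X and Y, which is the version that change-of-variables and transport arguments typically establish directly; the equality case of the EPI is attained exactly when X and Y are Gaussian with proportional covariances.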

    R\'enyi Entropy Power Inequalities via Normal Transport and Rotation

    Following a recent proof of Shannon's entropy power inequality (EPI), a comprehensive framework for deriving various EPIs for the R\'enyi entropy is presented that uses transport arguments from normal densities and a change of variable by rotation. Simple arguments are given to recover the previously known R\'enyi EPIs and derive new ones, unifying the multiplicative form with constant c and the modification with exponent {\alpha} from previous works. In particular, for log-concave densities, we obtain a simple transportation proof of a sharp varentropy bound.
    Comment: 17 pages. Entropy Journal, to appear
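    In the notation commonly used for these results (recalled here for convenience), the R\'enyi entropy of order \alpha \ne 1 of a density f on \mathbb{R}^n and the associated entropy power are
        h_\alpha(X) = \frac{1}{1-\alpha}\log\int_{\mathbb{R}^n} f^\alpha, \qquad N_\alpha(X) = e^{2 h_\alpha(X)/n},
    and the R\'enyi EPIs in question are, roughly, of the multiplicative form N_\alpha(X+Y) \ge c\,\big(N_\alpha(X) + N_\alpha(Y)\big) for independent X and Y, or of a modified form in which N_\alpha is raised to a suitable exponent; the precise constants and exponents are those given in the paper.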