
    Geometric inequalities from phase space translations

    We establish a quantum version of the classical isoperimetric inequality relating the Fisher information and the entropy power of a quantum state. The key tool is a Fisher information inequality for a state which results from a certain convolution operation: the latter maps a classical probability distribution on phase space and a quantum state to a quantum state. We show that this inequality also gives rise to several related inequalities whose counterparts are well known in the classical setting: in particular, it implies an entropy power inequality for the mentioned convolution operation as well as the isoperimetric inequality, and establishes concavity of the entropy power along trajectories of the quantum heat diffusion semigroup. As an application, we derive a Log-Sobolev inequality for the quantum Ornstein-Uhlenbeck semigroup, and argue that it implies fast convergence towards the fixed point for a large class of initial states.
    Comment: 37 pages; updated to match published version
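    For orientation, the classical statement being quantized here is Stam's isoperimetric inequality for entropies (standard notation assumed, not taken from the abstract): for a random vector $X$ on $\mathbb{R}^n$ with density $f$, define the entropy power and Fisher information as
    \[ N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n}, \qquad J(X) = \int_{\mathbb{R}^n} \frac{|\nabla f|^2}{f}\, dx; \]
    then
    \[ N(X)\, J(X) \ge n, \]
    with equality if and only if $X$ is Gaussian. The paper establishes the analogous statement with $X$ replaced by a quantum state.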

    Information Theoretic Proofs of Entropy Power Inequalities

    While most useful information theoretic inequalities can be deduced from the basic properties of entropy or mutual information, Shannon's entropy power inequality (EPI) has so far been an exception: existing information theoretic proofs of the EPI hinge on representations of differential entropy using either Fisher information or minimum mean-square error (MMSE), which are derived from de Bruijn's identity. In this paper, we first present a unified view of these proofs, showing that they share two essential ingredients: 1) a data processing argument applied to a covariance-preserving linear transformation; 2) an integration over a path of a continuous Gaussian perturbation. Using these ingredients, we develop a new and brief proof of the EPI through a mutual information inequality, which replaces Stam and Blachman's Fisher information inequality (FII) and an inequality for MMSE by Guo, Shamai and Verdú used in earlier proofs. The result has the advantage of simplicity: it relies only on the basic properties of mutual information. These ideas are then generalized to various extended versions of the EPI: Zamir and Feder's generalized EPI for linear transformations of the random variables, Takano and Johnson's EPI for dependent variables, Liu and Viswanath's covariance-constrained EPI, and Costa's concavity inequality for the entropy power.
    Comment: submitted for publication in the IEEE Transactions on Information Theory, revised version
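    For reference, the classical ingredients named above, in their standard forms (not quoted from the paper): Shannon's EPI states that for independent random vectors $X$ and $Y$ on $\mathbb{R}^n$ with densities,
    \[ N(X+Y) \ge N(X) + N(Y), \qquad N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n}. \]
    The two entropy representations come from de Bruijn's identity and the Guo-Shamai-Verdú I-MMSE relation: for $Z \sim \mathcal{N}(0, I_n)$ independent of $X$,
    \[ \frac{d}{dt}\, h\!\left(X + \sqrt{t}\, Z\right) = \tfrac{1}{2}\, J\!\left(X + \sqrt{t}\, Z\right), \qquad \frac{d}{d\gamma}\, I\!\left(X; \sqrt{\gamma}\, X + Z\right) = \tfrac{1}{2}\, \mathrm{mmse}(X, \gamma), \]
    while the Stam-Blachman FII that the new proof avoids reads $1/J(X+Y) \ge 1/J(X) + 1/J(Y)$.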

    Brascamp-Lieb Inequality and Its Reverse: An Information Theoretic View

    We generalize a result by Carlen and Cordero-Erausquin on the equivalence between the Brascamp-Lieb inequality and the subadditivity of relative entropy by allowing for random transformations (a broadcast channel). This leads to a unified perspective on several functional inequalities that have been gaining popularity in the context of proving impossibility results. We demonstrate that the information theoretic dual of the Brascamp-Lieb inequality is a convenient setting for proving properties such as data processing, tensorization, convexity and Gaussian optimality. Consequences of the latter include an extension of the Brascamp-Lieb inequality allowing for Gaussian random transformations, the determination of the multivariate Wyner common information for Gaussian sources, and a multivariate version of Nelson's hypercontractivity theorem. Finally, we present an information theoretic characterization of a reverse Brascamp-Lieb inequality involving a random transformation (a multiple access channel).
    Comment: 5 pages; to be presented at ISIT 201
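    For context, the Euclidean Brascamp-Lieb inequality in a standard form (notation assumed here, not from the abstract): for linear maps $B_j \colon \mathbb{R}^n \to \mathbb{R}^{n_j}$, exponents $c_j \ge 0$, and nonnegative $f_j \in L^1(\mathbb{R}^{n_j})$,
    \[ \int_{\mathbb{R}^n} \prod_{j=1}^m f_j(B_j x)^{c_j}\, dx \le C \prod_{j=1}^m \left( \int_{\mathbb{R}^{n_j}} f_j \right)^{c_j}. \]
    The Carlen-Cordero-Erausquin equivalence, stated here in one common differential-entropy form (the paper phrases it via relative entropy), identifies the best constant as
    \[ \log C = \sup_X \Big\{ h(X) - \sum_{j=1}^m c_j\, h(B_j X) \Big\}, \]
    with the supremum over random vectors $X$ on $\mathbb{R}^n$ with well-defined entropies; the generalization above replaces the deterministic maps $B_j$ by random transformations.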