
    Hessian and concavity of mutual information, differential entropy, and entropy power in linear vector Gaussian channels

    Within the framework of linear vector Gaussian channels with arbitrary signaling, this paper derives closed-form expressions for the Jacobians of the minimum mean square error (MMSE) and Fisher information matrices with respect to arbitrary parameters of the system. Capitalizing on prior research that linked the MMSE and Fisher information matrices to information-theoretic quantities through differentiation, closed-form expressions for the Hessians of the mutual information and the differential entropy are derived. These expressions are then used to assess the concavity properties of the mutual information and the differential entropy under various channel conditions, and to derive a multivariate version of Costa's entropy power inequality.
    Comment: 33 pages, 2 figures. A shorter version of this paper is to appear in IEEE Transactions on Information Theory.
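    For reference, the central quantities here in standard notation (well-known definitions, not quoted from the paper): the entropy power of a random vector $X \in \mathbb{R}^n$ with differential entropy $h(X)$, and Costa's concavity statement that the paper generalizes.

        % entropy power of X in R^n
        N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n}

        % Costa's EPI: for Z standard Gaussian independent of X,
        % the map t -> N(X + sqrt(t) Z) is concave, i.e.
        \frac{d^2}{dt^2}\, N\!\left(X + \sqrt{t}\, Z\right) \le 0, \qquad t \ge 0.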

    A multivariate generalization of Costa's entropy power inequality

    A simple multivariate version of Costa's entropy power inequality is proved. In particular, it is shown that if independent white Gaussian noise is added to an arbitrary multivariate signal, the entropy power of the resulting random variable is a multidimensional concave function of the individual variances of the components of the signal. As a side result, we also give an expression for the Hessian matrix of the entropy and entropy power functions with respect to the variances of the signal components, which is an interesting result in its own right.
    Comment: Proceedings of the 2008 IEEE International Symposium on Information Theory, Toronto, ON, Canada, July 6-11, 2008.
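    Read literally, and in notation assumed here purely for illustration (the paper's exact setup may differ): with $Z \sim \mathcal{N}(0, I)$ independent of the signal $X = (X_1, \dots, X_n)$, the claimed concavity is in the vector of per-component signal variances,

        % assumed formalization of the statement above
        (t_1, \dots, t_n) \;\longmapsto\; N\big(\mathrm{diag}(\sqrt{t_1}, \dots, \sqrt{t_n})\, X + Z\big)
        \quad \text{concave for } t_1, \dots, t_n \ge 0,

    a multidimensional analogue of the single-parameter concavity stated earlier.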

    Information Theoretic Proofs of Entropy Power Inequalities

    While most useful information-theoretic inequalities can be deduced from the basic properties of entropy or mutual information, Shannon's entropy power inequality (EPI) has so far remained an exception: existing information-theoretic proofs of the EPI hinge on representations of differential entropy using either Fisher information or the minimum mean-square error (MMSE), which are derived from de Bruijn's identity. In this paper, we first present a unified view of these proofs, showing that they share two essential ingredients: 1) a data processing argument applied to a covariance-preserving linear transformation; 2) an integration over a path of a continuous Gaussian perturbation. Using these ingredients, we develop a new and brief proof of the EPI through a mutual information inequality, which replaces Stam and Blachman's Fisher information inequality (FII) and the MMSE inequality of Guo, Shamai and Verdú used in earlier proofs. The result has the advantage of being very simple in that it relies only on the basic properties of mutual information. These ideas are then generalized to various extended versions of the EPI: Zamir and Feder's generalized EPI for linear transformations of the random variables, Takano and Johnson's EPI for dependent variables, Liu and Viswanath's covariance-constrained EPI, and Costa's concavity inequality for the entropy power.
    Comment: submitted for publication in the IEEE Transactions on Information Theory, revised version.
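    For context, the two classical statements referenced above, in standard form (well-known results, not quoted from the paper), with the entropy power $N$ as defined earlier:

        % Shannon's EPI: for independent random vectors X, Y in R^n
        N(X + Y) \ge N(X) + N(Y)

        % de Bruijn's identity: for Z standard Gaussian independent of X,
        % with J the Fisher information,
        \frac{d}{dt}\, h\!\left(X + \sqrt{t}\, Z\right) = \frac{1}{2}\, J\!\left(X + \sqrt{t}\, Z\right)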

    The conditional entropy power inequality for quantum additive noise channels

    We prove the quantum conditional Entropy Power Inequality for quantum additive noise channels. This inequality lower-bounds the quantum conditional entropy of the output of an additive noise channel in terms of the quantum conditional entropies of the input state and the noise when they are conditionally independent given the memory. We also show that this conditional Entropy Power Inequality is optimal, in the sense that equality can be achieved asymptotically by choosing a suitable sequence of Gaussian input states. We apply the conditional Entropy Power Inequality to obtain an array of information-theoretic inequalities for conditional entropies that are the analogues of inequalities already established in the unconditioned setting. Furthermore, we give a simple proof of the convergence rate of the quantum Ornstein-Uhlenbeck semigroup based on Entropy Power Inequalities.
    Comment: 26 pages; updated to match published version.
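    As rough orientation only (the precise hypotheses are in the paper; the form below is an assumption about the shape of the statement): for $n$-mode quantum systems $X$, $Y$ that are conditionally independent given a memory $M$, with $S(\cdot \,|\, M)$ the conditional von Neumann entropy, the conditional EPI for the additive-noise output $X + Y$ would read

        % assumed shape of the statement; see the paper for exact hypotheses
        e^{S(X+Y \,|\, M)/n} \;\ge\; e^{S(X \,|\, M)/n} + e^{S(Y \,|\, M)/n}.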

    A strengthened entropy power inequality for log-concave densities

    We show that Shannon's entropy power inequality admits a strengthened version when the densities are log-concave. In that case, one can extend the Blachman-Stam argument to obtain a sharp inequality for the second derivative of Shannon's entropy functional with respect to the heat semigroup.
    Comment: 21 pages.
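    The result concerns the second derivative of the entropy along the heat semigroup. As a purely illustrative numerical check of the underlying concavity (Costa's theorem, not the sharpened log-concave bound itself), one can discretize a log-concave density and evaluate $N(X + \sqrt{t}\,Z)$ on a grid; the grid sizes and the Laplace example below are assumptions made for this sketch.

        import numpy as np

        # grid for densities on the real line
        x = np.linspace(-30.0, 30.0, 4001)
        dx = x[1] - x[0]

        def diff_entropy(p):
            """Differential entropy (nats) of a density sampled on the grid."""
            p = np.clip(p, 1e-300, None)
            return -np.sum(p * np.log(p)) * dx

        f = 0.5 * np.exp(-np.abs(x))          # Laplace(0,1): log-concave

        ts = np.linspace(0.5, 3.0, 11)
        N = []
        for t in ts:
            g = np.exp(-x**2 / (2*t)) / np.sqrt(2*np.pi*t)   # N(0, t) density
            conv = np.convolve(f, g, mode='same') * dx       # density of X + sqrt(t) Z
            N.append(np.exp(2 * diff_entropy(conv)) / (2*np.pi*np.e))  # entropy power, n = 1

        N = np.array(N)
        print(N[2:] - 2*N[1:-1] + N[:-2])     # second differences: expected <= 0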

    Linear precoding for mutual information maximization in MIMO systems

    We study the design of linear precoders that maximize the mutual information in MIMO systems with arbitrary constellations and perfect channel state information at the transmitter. We derive the structure of the optimum precoder and show that the mutual information is concave in a quadratic function of the precoder coefficients. An iterative algorithm is also proposed to find this optimum value.
    Postprint (published version)
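    Since the objective here is the mutual information with discrete inputs, a small sketch may help. The following Monte Carlo estimator of $I(\mathbf{x};\mathbf{y})$ for $\mathbf{y} = \mathbf{H}\mathbf{P}\mathbf{x} + \mathbf{n}$ with equiprobable finite-alphabet inputs is a standard construction, not the authors' algorithm; the QPSK example and all parameter choices are assumptions.

        import numpy as np

        rng = np.random.default_rng(0)

        def mutual_information(H, P, sigma2, X, n_mc=2000):
            """Monte Carlo estimate (bits) of I(x; y) for y = H P x + n,
            x uniform over the columns of X, n ~ CN(0, sigma2 * I)."""
            M = X.shape[1]
            S = H @ P @ X                                  # noiseless receive points
            nr = S.shape[0]
            acc = 0.0
            for k in range(M):
                n = (rng.standard_normal((nr, n_mc))
                     + 1j * rng.standard_normal((nr, n_mc))) * np.sqrt(sigma2 / 2)
                delta = S[:, [k]] - S                      # s_k - s_m, shape (nr, M)
                v = delta[:, :, None] + n[:, None, :]      # s_k - s_m + n
                expo = -(np.abs(v)**2).sum(axis=0) / sigma2
                expo += (np.abs(n)**2).sum(axis=0)[None, :] / sigma2
                acc += np.mean(np.logaddexp.reduce(expo, axis=0)) / np.log(2)
            return np.log2(M) - acc / M

        # toy usage: 2x2 channel, per-antenna QPSK (16 input vectors), identity precoder
        qpsk = np.array([1+1j, 1-1j, -1+1j, -1-1j]) / np.sqrt(2)
        X = np.array([[a, b] for a in qpsk for b in qpsk]).T
        H = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
        print(mutual_information(H, np.eye(2), sigma2=1.0, X=X))

    An iterative precoder design of the kind described above would update P to increase this objective under a transmit-power constraint; the sketch only evaluates the objective.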

    Linear Precoding for Relay Networks with Finite-Alphabet Constraints

    In this paper, we investigate optimal precoding for relay networks with finite-alphabet constraints. We show that previous work, which uses various design criteria to maximize either the diversity order or the transmission rate under a Gaussian-input assumption, may incur significant losses in practical systems constrained to finite constellation sets. A linear precoding scheme is proposed to maximize the mutual information for relay networks. We exploit the structure of the optimal precoding matrix and develop a unified two-step iterative algorithm based on convex optimization and optimization on the complex Stiefel manifold. Numerical examples show that this iterative algorithm achieves significant gains over its conventional counterpart.
    Comment: Accepted by IEEE Int. Conf. Commun. (ICC), Kyoto, Japan, 2011.
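    The second step of such a scheme, optimization over matrices with orthonormal columns, can be illustrated generically. The sketch below is not the authors' algorithm; the toy objective and all constants are assumptions. It performs Riemannian gradient ascent on the complex Stiefel manifold with a QR retraction.

        import numpy as np

        def qr_retract(A):
            """Retract A onto the complex Stiefel manifold {U : U^H U = I} via QR."""
            Q, R = np.linalg.qr(A)
            ph = np.diag(R) / np.abs(np.diag(R))   # unit-modulus phases of diag(R)
            return Q * ph                          # canonicalize the factorization

        def stiefel_ascent_step(U, egrad, step):
            """One ascent step: project the Euclidean gradient onto the tangent
            space at U, move along it, and retract back onto the manifold."""
            sym = 0.5 * (U.conj().T @ egrad + egrad.conj().T @ U)
            rgrad = egrad - U @ sym
            return qr_retract(U + step * rgrad)

        # toy usage: maximize Re tr(B^H U) over U on St(4, 2); the optimum is the
        # polar factor of B, with value equal to the sum of B's singular values
        rng = np.random.default_rng(1)
        B = rng.standard_normal((4, 2)) + 1j * rng.standard_normal((4, 2))
        U = qr_retract(rng.standard_normal((4, 2)) + 1j * rng.standard_normal((4, 2)))
        for _ in range(200):
            U = stiefel_ascent_step(U, B, step=0.1)   # Euclidean gradient of Re tr(B^H U) is B
        print(np.real(np.trace(B.conj().T @ U)),
              np.linalg.svd(B, compute_uv=False).sum())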