
Hessian and concavity of mutual information, differential entropy, and entropy power in linear vector Gaussian channels

By Miquel Payaró and Daniel P. Palomar

Abstract

Within the framework of linear vector Gaussian channels with arbitrary signaling, this paper calculates the Jacobians of the minimum mean square error (MMSE) and Fisher information matrices with respect to arbitrary parameters of the system. Capitalizing on prior research in which the MMSE and Fisher information matrices were linked to information-theoretic quantities through differentiation, the Hessians of the mutual information and the differential entropy are derived. These expressions are then used to assess the concavity properties of mutual information and differential entropy under different channel conditions, and to derive a multivariate version of an entropy power inequality due to Costa.

Index Terms: Concavity properties, differential entropy, entropy power, Fisher information matrix, Gaussian noise, Hessian matrices, linear vector Gaussian channels, minimum mean-square error (MMSE), mutual information, nonlinear estimation.
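The abstract does not reproduce the underlying channel model or definitions. As a hedged background sketch (the notation below is assumed rather than taken from this page), the standard linear vector Gaussian channel, the MMSE matrix, the known gradient relation that the Hessian results build on, and the entropy power appearing in Costa's inequality can be written as follows.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

% Linear vector Gaussian channel (white Gaussian noise assumed in this sketch).
\[
  \mathbf{y} = \mathbf{H}\mathbf{x} + \mathbf{n},
  \qquad \mathbf{n} \sim \mathcal{N}(\mathbf{0},\mathbf{I})
\]

% MMSE matrix of estimating x from y.
\[
  \mathbf{E} = \mathbb{E}\!\left[\bigl(\mathbf{x}-\mathbb{E}[\mathbf{x}\mid\mathbf{y}]\bigr)
  \bigl(\mathbf{x}-\mathbb{E}[\mathbf{x}\mid\mathbf{y}]\bigr)^{\mathsf{T}}\right]
\]

% First-order (gradient) relation between mutual information, in nats,
% and the MMSE matrix; the paper derives second-order (Jacobian/Hessian)
% counterparts of such relations.
\[
  \nabla_{\mathbf{H}}\, I(\mathbf{x};\mathbf{y}) = \mathbf{H}\,\mathbf{E}
\]

% Entropy power of an n-dimensional random vector x; Costa's inequality
% states that N(x + sqrt(t) z) is concave in t for Gaussian z.
\[
  N(\mathbf{x}) = \frac{1}{2\pi e}\, e^{\frac{2}{n} h(\mathbf{x})}
\]

\end{document}

These first-order facts are standard background; the Hessian expressions and the multivariate version of Costa's inequality mentioned in the abstract appear only in the full text.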
Year: 2009
OAI identifier: oai:CiteSeerX.psu:10.1.1.320.6372
Provided by: CiteSeerX
Download PDF:
Sorry, we are unable to provide the full text but you may find it at the following location(s):
  • http://citeseerx.ist.psu.edu/v... (external link)
  • http://www.ece.ust.hk/~eepayar... (external link)

