
    Mutual Information and Minimum Mean-square Error in Gaussian Channels

    This paper deals with arbitrarily distributed finite-power input signals observed through an additive Gaussian noise channel. It establishes a new formula connecting the input-output mutual information and the minimum mean-square error (MMSE) achievable by optimal estimation of the input given the output: the derivative of the mutual information (in nats) with respect to the signal-to-noise ratio (SNR) is equal to half the MMSE, regardless of the input statistics. This relationship holds for both scalar and vector signals, as well as for discrete-time and continuous-time noncausal MMSE estimation. This fundamental information-theoretic result has an unexpected consequence in continuous-time nonlinear estimation: for any input signal with finite power, the causal filtering MMSE achieved at a given SNR is equal to the average of the noncausal smoothing MMSE achieved over channels whose signal-to-noise ratio is uniformly distributed between 0 and that SNR.
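    Written out in symbols (the notation here is assumed for illustration, not taken from the abstract): with I(snr) the input-output mutual information in nats, mmse(snr) the noncausal MMSE, and cmmse(snr) the causal filtering MMSE, the two statements above read

    \[
      \frac{d}{d\,\mathrm{snr}}\, I(\mathrm{snr}) = \frac{1}{2}\,\mathrm{mmse}(\mathrm{snr}),
      \qquad
      \mathrm{cmmse}(\mathrm{snr}) = \frac{1}{\mathrm{snr}} \int_{0}^{\mathrm{snr}} \mathrm{mmse}(\gamma)\, d\gamma .
    \]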

    Why We Can Not Surpass Capacity: The Matching Condition

    We show that iterative coding systems cannot surpass capacity, using only quantities which naturally appear in density evolution. Although the result in itself is trivial, the method we apply shows that in order to achieve capacity the various components of an iterative coding system have to be perfectly matched. This generalizes the perfect matching condition, previously known for transmission over the binary erasure channel, to the general class of binary-input memoryless output-symmetric channels. Potential applications of this perfect matching condition are the construction of capacity-achieving degree distributions and the determination of the number of required iterations as a function of the multiplicative gap to capacity.
    Comment: 10 pages, 27 ps figures. Forty-third Allerton Conference on Communication, Control and Computing, invited paper.
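    For context, the binary-erasure-channel matching condition that the abstract refers to can be stated as follows (standard LDPC degree-distribution notation assumed here, not taken from the abstract): for an ensemble with edge-perspective degree distributions λ and ρ used over a BEC with erasure probability ε, density evolution converges to zero erasure probability if and only if

    \[
      \varepsilon\, \lambda\!\left(1 - \rho(1 - x)\right) < x
      \quad \text{for all } x \in (0, 1],
    \]

    and capacity-achieving sequences require this inequality to hold with a vanishing gap, i.e. the two component curves must be perfectly matched.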

    A multivariate generalization of Costa's entropy power inequality

    A simple multivariate version of Costa's entropy power inequality is proved. In particular, it is shown that if independent white Gaussian noise is added to an arbitrary multivariate signal, the entropy power of the resulting random variable is a multidimensional concave function of the individual variances of the components of the signal. As a side result, we also give an expression for the Hessian matrix of the entropy and entropy power functions with respect to the variances of the signal components, which is an interesting result in its own right.
    Comment: Proceedings of the 2008 IEEE International Symposium on Information Theory, Toronto, ON, Canada, July 6-11, 2008.
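    For reference, the original single-parameter form of Costa's result that this work generalizes is the following (standard notation, not taken from the abstract): with the entropy power of an n-dimensional random vector Y defined as

    \[
      N(Y) = \frac{1}{2\pi e}\, e^{\frac{2}{n} h(Y)},
      \qquad
      t \mapsto N\!\left(X + \sqrt{t}\, Z\right) \ \text{is concave in } t \ge 0,
    \]

    where h denotes differential entropy and Z is standard Gaussian noise independent of X. The paper replaces the single noise parameter t by the vector of per-component variances.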