
On mutual information, likelihood-ratios and estimation error for the additive Gaussian channel

Abstract

This paper considers the model of an arbitrarily distributed signal x observed in additive, independent white Gaussian noise w, y = x + w. New relations between the minimal mean square error of the non-causal estimator and the likelihood ratio between y and w are derived. This is followed by an extended version of a recently derived relation between the mutual information I(x;y) and the minimal mean square error. These results are applied to derive infinite-dimensional versions of the Fisher information and the de Bruijn identity. The derivation of the results is based on the Malliavin calculus.

Comment: 21 pages; to appear in the IEEE Transactions on Information Theory.
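For orientation, the scalar special cases of the relations referenced above are the I-MMSE identity (due to Guo, Shamai and Verdú) and the classical de Bruijn identity; the displays below are standard textbook statements included here as a sketch, not excerpts from the paper, and the snr-parametrized channel y = sqrt(snr) x + w is an assumed normalization.

\[
\frac{d}{d\,\mathrm{snr}}\, I\!\left(x;\sqrt{\mathrm{snr}}\,x+w\right) \;=\; \frac{1}{2}\,\mathrm{mmse}(\mathrm{snr}),
\qquad
\mathrm{mmse}(\mathrm{snr}) \;=\; \mathbb{E}\!\left[\bigl(x-\mathbb{E}[x\mid y]\bigr)^{2}\right],
\]

\[
\frac{d}{dt}\, h\!\left(x+\sqrt{t}\,w\right) \;=\; \frac{1}{2}\, J\!\left(x+\sqrt{t}\,w\right),
\]

where h denotes differential entropy and J the Fisher information. The paper's contribution is the extension of such identities to the infinite-dimensional (white-noise) setting via the Malliavin calculus.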
