Optimum estimation via gradients of partition functions and information measures: a statistical-mechanical perspective
In continuation of a recent work on the statistical-mechanical analysis of
minimum mean square error (MMSE) estimation in Gaussian noise via its relation
to the mutual information (the I-MMSE relation), here we propose a simple and
more direct relationship between optimum estimation and certain information
measures (e.g., the information density and the Fisher information), which can
be viewed as partition functions and hence are amenable to analysis using
statistical-mechanical techniques. The proposed approach has several
advantages, most notably, its applicability to general sources and channels, as
opposed to the I-MMSE relation and its variants which hold only for certain
classes of channels (e.g., additive white Gaussian noise channels). We then
demonstrate the derivation of the conditional mean estimator and the MMSE in a
few examples. Two of these examples turn out to be generalizable to a fairly
wide class of sources and channels. For this class, the proposed approach is
shown to yield an approximate conditional mean estimator and an MMSE formula
that has the flavor of a single-letter expression. We also show how our
approach can easily be generalized to situations of mismatched estimation.
Comment: 21 pages; submitted to the IEEE Transactions on Information Theory.
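The I-MMSE relation that this abstract builds on can be illustrated numerically in the simplest case. The sketch below uses the standard closed forms for a Gaussian input (not expressions from the paper itself) and checks that the derivative of the mutual information with respect to SNR equals half the MMSE:

```python
import numpy as np

# Numerical check of the I-MMSE relation dI/dsnr = MMSE/2 for a standard
# Gaussian input X ~ N(0,1) over the channel Y = sqrt(snr)*X + N, N ~ N(0,1).
# Standard closed forms (textbook results, not taken from the paper):
#   I(snr)    = 0.5 * log(1 + snr)
#   MMSE(snr) = 1 / (1 + snr)

def mutual_info(snr):
    return 0.5 * np.log(1.0 + snr)

def mmse(snr):
    return 1.0 / (1.0 + snr)

snr = 2.0
h = 1e-6
# Central-difference derivative of I at snr
dI = (mutual_info(snr + h) - mutual_info(snr - h)) / (2 * h)
print(abs(dI - mmse(snr) / 2))  # essentially zero: dI/dsnr = MMSE/2
```

Here the check is exact in closed form; the finite difference merely makes the identity concrete for an arbitrary SNR value.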
Analysis of Mismatched Estimation Errors Using Gradients of Partition Functions
We consider the problem of signal estimation (denoising) from a
statistical-mechanical perspective, in continuation of a recent work on the
analysis of mean-square error (MSE) estimation using a direct relationship
between optimum estimation and certain partition functions. The paper consists
of essentially two parts. In the first part, using the aforementioned
relationship, we derive single-letter expressions of the mismatched MSE of a
codeword (from a randomly selected code), corrupted by a Gaussian vector
channel. In the second part, we provide several examples to demonstrate phase
transitions in the behavior of the MSE. These examples enable us to understand
more deeply and to gather intuition regarding the roles of the real and the
mismatched probability measures in creating these phase transitions.
Comment: 58 pages; submitted to the IEEE Transactions on Information Theory.
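As background for mismatched MSE in Gaussian noise, the classical identity due to Verdú relates the relative entropy between the true and mismatched source laws to the integrated excess MSE of the mismatched estimator. The sketch below verifies it for two zero-mean Gaussian laws with illustrative (hypothetical) variances, using standard closed forms rather than anything specific to this paper:

```python
import numpy as np

# Verdu's identity (a known result, not this paper's contribution):
#   D(P||Q) = (1/2) * integral_0^inf [ mse_{P,Q}(g) - mmse_P(g) ] dg,
# where mse_{P,Q}(g) is the MSE, under the true law P, of the estimator that
# is optimal under the mismatched law Q, for the channel Y = sqrt(g)*X + N.
# Here P = N(0, sp2) and Q = N(0, sq2); the variances are illustrative.

sp2, sq2 = 1.0, 2.25   # true and mismatched source variances (hypothetical)

def mmse_P(g):
    # matched MMSE for X ~ N(0, sp2)
    return sp2 / (1.0 + g * sp2)

def mse_PQ(g):
    # MSE under P of the linear estimator optimal under Q:
    # xhat(y) = sqrt(g)*sq2/(1 + g*sq2) * y
    return (sp2 + g * sq2**2) / (1.0 + g * sq2)**2

# Trapezoid rule on a log-spaced SNR grid (the integrand decays like 1/g^2)
g = np.concatenate(([0.0], np.geomspace(1e-6, 1e5, 200000)))
excess = mse_PQ(g) - mmse_P(g)
integral = np.sum((excess[1:] + excess[:-1]) / 2 * np.diff(g))

# Closed-form Gaussian relative entropy D(N(0,sp2) || N(0,sq2))
D = 0.5 * (sp2 / sq2 - 1.0 + np.log(sq2 / sp2))
print(integral / 2, D)  # the two values should agree closely
```

The log-spaced grid matters: the excess MSE has a heavy 1/g^2 tail, so a uniform grid over the same range would either truncate the tail or waste points near the origin.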
Achievable Outage Rates with Improved Decoding of BICM Multiband OFDM Under Channel Estimation Errors
We consider the decoding of bit interleaved coded modulation (BICM) applied
to multiband OFDM for practical scenarios where only a noisy (possibly very
bad) estimate of the channel is available at the receiver. First, a decoding
metric based on the a posteriori probability density of the channel, conditioned
on the channel estimate, is derived and used for decoding BICM multiband OFDM.
Then, we characterize the limits of reliable information rates in terms of the
maximal achievable outage rates associated with the proposed metric. We also
compare our results with the outage rates of a system using a theoretical
decoder. Our results are useful for designing a communication system where a
prescribed quality of service (QoS), in terms of achievable target rates with
small error probability, must be satisfied even in the presence of imperfect
channel estimation. Numerical results over both realistic UWB and theoretical
Rayleigh fading channels show that the proposed method provides significant
gain in terms of BER and outage rates compared to the classical mismatched
detector, without introducing any additional complexity.
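The improved metric described above averages the channel likelihood over the posterior density of the channel given its noisy estimate, whereas the mismatched detector plugs the estimate in as if it were exact. For a real-valued Gaussian toy model (all parameter values hypothetical, chosen only for illustration), this average has a Gaussian closed form, which the sketch below checks against brute-force integration:

```python
import numpy as np

# Toy model (hypothetical parameters): h ~ N(0, sigma_h2), estimate
# h_hat = h + e with e ~ N(0, sigma_e2), observation y = h*x + n,
# n ~ N(0, sigma_n2), x a BPSK symbol.
sigma_h2 = 1.0
sigma_e2 = 0.3
sigma_n2 = 0.5

h_hat = 0.8
x = 1.0            # transmitted BPSK symbol (+1)
y = 0.6

# Posterior of h given h_hat (h and h_hat are jointly Gaussian):
mu_p = sigma_h2 / (sigma_h2 + sigma_e2) * h_hat
v_p = sigma_h2 * sigma_e2 / (sigma_h2 + sigma_e2)

def gauss(z, m, v):
    return np.exp(-(z - m) ** 2 / (2 * v)) / np.sqrt(2 * np.pi * v)

# Averaged metric: p(y | x, h_hat) = integral of p(y|x,h) p(h|h_hat) dh,
# a Gaussian with mean mu_p*x and variance sigma_n2 + v_p*x**2.
improved = gauss(y, mu_p * x, sigma_n2 + v_p * x**2)

# Brute-force numerical integration over h as a check:
hs = np.linspace(-10.0, 10.0, 200001)
integrand = gauss(y, hs * x, sigma_n2) * gauss(hs, mu_p, v_p)
numeric = np.sum((integrand[1:] + integrand[:-1]) / 2 * np.diff(hs))

# Mismatched metric treats h_hat as the true channel:
mismatched = gauss(y, h_hat * x, sigma_n2)
print(improved, numeric, mismatched)
```

The averaged metric both recenters the likelihood on the posterior mean and inflates the noise variance by the residual channel uncertainty, which is what distinguishes it from the plug-in mismatched metric.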
Pointwise Relations between Information and Estimation in Gaussian Noise
Many of the classical and recent relations between information and estimation
in the presence of Gaussian noise can be viewed as identities between
expectations of random quantities. These include the I-MMSE relationship of Guo
et al.; the relative entropy and mismatched estimation relationship of
Verdú; the relationship between causal estimation and mutual information of
Duncan, and its extension to the presence of feedback by Kadota et al.; the
relationship between causal and non-causal estimation of Guo et al., and its
mismatched version of Weissman. We dispense with the expectations and explore
the nature of the pointwise relations between the respective random quantities.
The pointwise relations that we find are as succinctly stated as - and give
considerable insight into - the original expectation identities.
As an illustration of our results, consider Duncan's 1970 discovery that the
mutual information is equal to the causal MMSE in the AWGN channel, which can
equivalently be expressed by saying that the difference between the input-output
information density and half the causal estimation error is a zero mean random
variable (regardless of the distribution of the channel input). We characterize
this random variable explicitly, rather than merely its expectation. Classical
estimation and information theoretic quantities emerge with new and surprising
roles. For example, the variance of this random variable turns out to be given
by the causal MMSE (which, in turn, is equal to the mutual information by
Duncan's result).
Comment: 31 pages, 2 figures, submitted to the IEEE Transactions on Information Theory.
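Duncan's identity discussed in this last abstract has an SNR-domain form (via the causal/non-causal relation of Guo et al.): I(snr) = (snr/2) * cmmse(snr), where cmmse is the SNR-averaged MMSE. The sketch below verifies it numerically for a Gaussian input, using the standard closed forms rather than anything from the paper:

```python
import numpy as np

# SNR form of Duncan's identity:  I(snr) = (snr/2) * cmmse(snr),
# with cmmse(snr) = (1/snr) * integral_0^snr mmse(g) dg
# (the causal/non-causal relation of Guo, Shamai and Verdu).
# For X ~ N(0,1): mmse(g) = 1/(1+g), hence cmmse(snr) = ln(1+snr)/snr
# and I(snr) = 0.5*ln(1+snr); both are standard closed forms.

snr = 3.0
gammas = np.linspace(0.0, snr, 100001)
mmse = 1.0 / (1.0 + gammas)
integral = np.sum((mmse[1:] + mmse[:-1]) / 2 * np.diff(gammas))
cmmse = integral / snr

I_closed = 0.5 * np.log(1.0 + snr)
print(abs(snr / 2 * cmmse - I_closed))  # essentially zero
```

This is only the expectation identity; the pointwise refinement that the abstract describes characterizes the full random difference, of which this is the mean statement.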