On the Minimum Mean $p$-th Error in Gaussian Noise Channels and its Applications
The problem of estimating an arbitrary random vector from its observation
corrupted by additive white Gaussian noise, where the cost function is taken to
be the Minimum Mean $p$-th Error (MMPE), is considered. The classical Minimum
Mean Square Error (MMSE) is a special case of the MMPE. Several bounds,
properties and applications of the MMPE are derived and discussed. The optimal
MMPE estimator is found for Gaussian and binary input distributions. Properties
of the MMPE as a function of the input distribution, SNR and order $p$ are
derived. In particular, it is shown that the MMPE is a continuous function of
$p$ and SNR. These results are possible in view of interpolation and change of
measure bounds on the MMPE. The `Single-Crossing-Point Property' (SCPP) that
bounds the MMSE for all SNR values {\it above} a certain value, at which the
MMSE is known, together with the I-MMSE relationship is a powerful tool in
deriving converse proofs in information theory. By studying the notion of
conditional MMPE, a unifying proof (i.e., for any $p$) of the SCPP is shown. A
complementary bound to the SCPP is then shown, which bounds the MMPE for all
SNR values {\it below} a certain value, at which the MMPE is known. As a first
application of the MMPE, a bound on the conditional differential entropy in
terms of the MMPE is provided, which then yields a generalization of the
Ozarow-Wyner lower bound on the mutual information achieved by a discrete input
on a Gaussian noise channel. As a second application, the MMPE is shown to
improve on previous characterizations of the phase transition phenomenon that
manifests, in the limit as the length of the capacity achieving code goes to
infinity, as a discontinuity of the MMSE as a function of SNR. As a final
application, the MMPE is used to show bounds on the second derivative of mutual
information that tighten previously known bounds.
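For reference, the central quantity can be sketched as follows (the exact normalization in the paper may differ):

```latex
% Observation model and the MMPE of order p:
%   Z is standard Gaussian noise, independent of X.
Y = \sqrt{\mathsf{snr}}\, X + Z,
\qquad
\mathrm{mmpe}(X;\mathsf{snr},p) \;=\; \inf_{f}\; \mathbb{E}\!\left[\, \| X - f(Y) \|^{p} \,\right].
% Taking p = 2 recovers the classical MMSE as a special case:
\mathrm{mmpe}(X;\mathsf{snr},2) \;=\; \mathrm{mmse}(X;\mathsf{snr}).
% The I-MMSE relationship (Guo-Shamai-Verdu), used with the SCPP
% in converse proofs (mutual information in nats):
\frac{\mathrm{d}}{\mathrm{d}\,\mathsf{snr}}\, I\!\left(X;\, \sqrt{\mathsf{snr}}\, X + Z\right)
\;=\; \frac{1}{2}\, \mathrm{mmse}(X;\mathsf{snr}).
```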
Modulation and Estimation with a Helper
The problem of transmitting a parameter value over an additive white Gaussian
noise (AWGN) channel is considered, where, in addition to the transmitter and
the receiver, there is a helper that observes the noise non-causally and
provides a limited-rate description of it to the transmitter and/or
the receiver. We derive upper and lower bounds on the optimal achievable
moments of the estimation error and show that they coincide for
small moment orders and for low SNR values. The upper bound relies on a
recently proposed channel-coding scheme that effectively conveys part of the
bits essentially error-free, and the rest of the rate over the same AWGN
channel without help, with the error-free bits allocated to the most
significant bits of the quantized parameter. We then concentrate on the setting
with a total transmit energy constraint, for which we derive achievability
results for both channel coding and parameter modulation for several scenarios:
when the helper assists only the transmitter or only the receiver and knows the
noise, and when the helper assists the transmitter and/or the receiver and
knows both the noise and the message. In particular, for the message-informed
helper that assists both the receiver and the transmitter, it is shown that the
error probability in the channel-coding task decays doubly exponentially.
Finally, we translate these results to those for continuous-time power-limited
AWGN channels with unconstrained bandwidth. As a byproduct, we show that the
capacity with a message-informed helper that is available only at the
transmitter can exceed the capacity of the same scenario when the helper knows
only the noise but not the message.
Comment: This work has been submitted to the IEEE for possible publication.
Copyright may be transferred without notice, after which this version may no
longer be accessible.
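The allocation idea behind the upper bound, spending the error-free bits on the most significant bits of the quantized parameter, can be illustrated with a toy Monte Carlo sketch (the function name and all parameter values below are illustrative, not the paper's actual scheme):

```python
import random


def pth_error_moment(u, total_bits, protected_msbs, flip_prob, p, trials, rng):
    """Monte Carlo estimate of E|u - u_hat|^p when u in [0, 1) is quantized
    to total_bits bits, the protected_msbs most significant bits arrive
    error-free, and each remaining LSB flips independently with flip_prob."""
    q = int(u * 2 ** total_bits)  # quantization index of u
    acc = 0.0
    for _ in range(trials):
        received = q
        for i in range(total_bits - protected_msbs):  # unprotected LSBs
            if rng.random() < flip_prob:
                received ^= 1 << i
        u_hat = (received + 0.5) / 2 ** total_bits  # mid-point reconstruction
        acc += abs(u - u_hat) ** p
    return acc / trials


if __name__ == "__main__":
    rng = random.Random(0)
    # Same bit budget and flip probability; only the allocation differs.
    msb_protected = pth_error_moment(0.37, 8, 4, 0.1, 2, 5000, rng)
    unprotected = pth_error_moment(0.37, 8, 0, 0.1, 2, 5000, rng)
    print(f"MSBs protected: {msb_protected:.6f}  no protection: {unprotected:.6f}")
```

Protecting the MSBs caps the error magnitude at the span of the unprotected LSBs, so the resulting p-th moment is orders of magnitude smaller than when MSB flips are possible.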