Mean Square Capacity of Power Constrained Fading Channels with Causal Encoders and Decoders
This paper is concerned with the mean square stabilization problem of
discrete-time LTI systems over a power constrained fading channel. Unlike
existing works, the channel considered here is subject to both fading and
additive noise. In contrast to the linear encoders/decoders commonly studied
in the literature, we allow arbitrary causal channel encoders/decoders.
Sufficient conditions and necessary conditions for the mean square
stabilizability are given in terms of channel parameters such as transmission
power and fading and additive noise statistics in relation to the unstable
eigenvalues of the open-loop system matrix. The corresponding mean square
capacity of the power constrained fading channel under causal encoders/decoders
is given. It is proved that this mean square capacity is smaller than the
corresponding Shannon channel capacity. Finally, numerical examples are
presented which demonstrate that causal encoders/decoders yield less
restrictive stabilizability conditions than the linear encoders/decoders
studied in existing works.

Comment: Accepted by the 54th IEEE Conference on Decision and Control
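The abstract relates channel capacity to the unstable eigenvalues of the open-loop system matrix. A classical necessary condition of this kind (the data-rate theorem, stated here as a hedged illustration rather than this paper's exact bound) is that the channel must supply at least the sum of log2|λ_i| bits per sample over the unstable eigenvalues. A minimal sketch, with a hypothetical system matrix:

```python
import numpy as np

def min_capacity_bits(A):
    """Sum of log2|lambda_i| over the unstable eigenvalues of A.

    Classical necessary condition (data-rate theorem): the channel
    capacity must exceed this value for mean square stabilization.
    """
    eigs = np.linalg.eigvals(np.asarray(A, dtype=float))
    return sum(np.log2(abs(lam)) for lam in eigs if abs(lam) > 1)

# Hypothetical open-loop system with unstable eigenvalues 2 and 1.5
A = [[2.0, 0.0],
     [0.0, 1.5]]
print(min_capacity_bits(A))  # log2(2) + log2(1.5) ≈ 1.585 bits/sample
```

Stable eigenvalues contribute nothing, since modes inside the unit circle need no information to remain mean square bounded.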
Kalman meets Shannon
We consider the problem of communicating the state of a dynamical system via
a Shannon Gaussian channel. The receiver, which acts as both a decoder and
estimator, observes the noisy measurement of the channel output and makes an
optimal estimate of the state of the dynamical system in the minimum mean
square sense. The transmitter observes a possibly noisy measurement of the
state of the dynamical system. These measurements are then used to encode the
message to be transmitted over a noisy Gaussian channel, where a per sample
power constraint is imposed on the transmitted message. The result is a
problem that combines Shannon's joint source-channel coding with a form of
Kalman filtering. We first consider the problem of communication with full
state measurements at the transmitter and show that the optimal linear
encoders need not have memory, while the optimal linear decoders have order
at most the state dimension. We also give explicitly the structure of the
optimal linear filters. For the case where the transmitter has access to noisy
measurements of the state, we derive a separation principle for the optimal
communication scheme, where the transmitter needs a filter with an order of at
most the dimension of the state of the dynamical system. The results are
derived for first order linear dynamical systems, but may be extended to MIMO
systems of arbitrary order.
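The memoryless-encoder result can be illustrated for a scalar Gauss-Markov source: the transmitter simply scales the current state to meet the power constraint, and the receiver runs a Kalman filter as the joint decoder/estimator. The sketch below uses hypothetical parameter values (a, q, P, noise_var); it is an illustration of the setup, not the paper's derivation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical scalar setup: state x_{k+1} = a x_k + w_k, w_k ~ N(0, q);
# memoryless linear encoder s_k = g x_k with average power P;
# Gaussian channel y_k = s_k + v_k, v_k ~ N(0, noise_var).
a, q, P, noise_var, n = 0.9, 1.0, 5.0, 1.0, 20000

state_var = q / (1 - a**2)   # stationary state variance (|a| < 1 assumed)
g = np.sqrt(P / state_var)   # scaling so E[(g x_k)^2] = P in steady state

x, xhat, var = 0.0, 0.0, q   # true state; receiver's estimate and variance
err2 = []
for _ in range(n):
    x = a * x + rng.normal(scale=np.sqrt(q))
    y = g * x + rng.normal(scale=np.sqrt(noise_var))
    # Kalman filter at the receiver acting as decoder + MMSE estimator
    xpred, vpred = a * xhat, a**2 * var + q
    k = vpred * g / (g**2 * vpred + noise_var)
    xhat = xpred + k * (y - g * xpred)
    var = (1 - k * g) * vpred
    err2.append((x - xhat)**2)

print(np.mean(err2), var)  # empirical MSE tracks the filter's variance
```

The encoder here carries no memory; all temporal smoothing happens in the receiver's filter, whose order matches the (scalar) state dimension.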
Mutual Information and Minimum Mean-square Error in Gaussian Channels
This paper deals with arbitrarily distributed finite-power input signals
observed through an additive Gaussian noise channel. It shows a new formula
that connects the input-output mutual information and the minimum mean-square
error (MMSE) achievable by optimal estimation of the input given the output.
That is, the derivative of the mutual information (nats) with respect to the
signal-to-noise ratio (SNR) is equal to half the MMSE, regardless of the input
statistics. This relationship holds for both scalar and vector signals, as well
as for discrete-time and continuous-time noncausal MMSE estimation. This
fundamental information-theoretic result has an unexpected consequence in
continuous-time nonlinear estimation: For any input signal with finite power,
the causal filtering MMSE achieved at SNR is equal to the average value of the
noncausal smoothing MMSE achieved over a channel whose signal-to-noise ratio
is uniformly distributed between 0 and SNR.
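Both relations can be checked numerically in the one case with simple closed forms, a standard Gaussian input over an AWGN channel, where I(snr) = (1/2) ln(1 + snr) nats and mmse(snr) = 1/(1 + snr):

```python
import numpy as np

# Standard Gaussian input over an AWGN channel (closed-form special case):
I = lambda snr: 0.5 * np.log(1.0 + snr)   # mutual information in nats
mmse = lambda snr: 1.0 / (1.0 + snr)      # noncausal MMSE

# I-MMSE relation: dI/dsnr = mmse(snr) / 2, checked by finite differences
snr, h = 3.0, 1e-6
dI = (I(snr + h) - I(snr - h)) / (2 * h)
print(dI, mmse(snr) / 2)  # both ≈ 0.125

# Causal/noncausal relation: the causal filtering MMSE at snr equals the
# average of mmse(g) for g uniform on [0, snr]; here that average is
# (1/snr) * ln(1 + snr) in closed form.
grid = np.linspace(0.0, snr, 100001)
avg_noncausal = np.mean(mmse(grid))
print(avg_noncausal, np.log(1.0 + snr) / snr)  # both ≈ 0.4621
```

The same finite-difference check fails for no input distribution: the derivative identity is distribution-free, which is precisely the paper's point.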