Improved Upper Bounds to the Causal Quadratic Rate-Distortion Function for Gaussian Stationary Sources
We improve the existing achievable rate regions for causal and for zero-delay
source coding of stationary Gaussian sources under an average mean squared
error (MSE) distortion measure. To begin with, we find a closed-form expression
for the information-theoretic causal rate-distortion function (RDF) under such
distortion measure, denoted by Rc^{it}(D), for first-order Gauss-Markov
processes. Rc^{it}(D) is a lower bound to the optimal performance theoretically
attainable (OPTA) by any causal source code, namely Rc^{op}(D). We show that,
for Gaussian sources, the latter can also be upper bounded as Rc^{op}(D)\leq
Rc^{it}(D) + 0.5 log_{2}(2\pi e) bits/sample. In order to analyze Rc^{it}(D)
for arbitrary zero-mean Gaussian stationary sources, we
introduce \bar{Rc^{it}}(D), the information-theoretic causal RDF when the
reconstruction error is jointly stationary with the source. Based upon
\bar{Rc^{it}}(D), we derive three closed-form upper bounds to the additive rate
loss defined as \bar{Rc^{it}}(D) - R(D), where R(D) denotes Shannon's RDF. Two
of these bounds are strictly smaller than 0.5 bits/sample at all rates. These
bounds differ from one another in their tightness and ease of evaluation; the
tighter the bound, the more involved its evaluation. We then show that, for any
source spectral density and any positive distortion D\leq \sigma_{x}^{2},
\bar{Rc^{it}}(D) can be realized by an AWGN channel surrounded by a unique set
of causal pre-, post-, and feedback filters. We show that finding such filters
constitutes a convex optimization problem. In order to solve the latter, we
propose an iterative optimization procedure that yields the optimal filters and
is guaranteed to converge to \bar{Rc^{it}}(D). Finally, by establishing a
connection to feedback quantization we design a causal and a zero-delay coding
scheme which, for Gaussian sources, achieves...
Comment: 47 pages, revised version submitted to IEEE Trans. Information Theory
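As a point of reference for the rate-loss bounds above, the following minimal Python sketch evaluates Shannon's RDF R(D) for a first-order Gauss-Markov source by standard reverse water-filling over its power spectral density; this is the baseline from which the paper's bounds on \bar{Rc^{it}}(D) - R(D) are stated. The coefficient a, innovation variance, and target distortion below are illustrative choices, not values from the paper.

```python
import numpy as np

def shannon_rdf_ar1(a, sigma_u2, D, n=2**14):
    """Shannon's RDF R(D), in bits/sample, for the Gauss-Markov source
    x[k] = a*x[k-1] + u[k], via reverse water-filling over its PSD."""
    w = np.linspace(-np.pi, np.pi, n, endpoint=False)
    S = sigma_u2 / (1.0 + a**2 - 2.0*a*np.cos(w))    # source PSD
    lo, hi = 0.0, S.max()
    for _ in range(100):                             # bisect on the water level theta
        theta = 0.5 * (lo + hi)
        dist = np.mean(np.minimum(theta, S))         # (1/2pi) * integral of min(theta, S)
        lo, hi = (theta, hi) if dist < D else (lo, theta)
    return np.mean(0.5 * np.log2(np.maximum(S / theta, 1.0)))

a, sigma_u2, D = 0.9, 1.0, 0.1                       # illustrative values, D <= sigma_x^2
print(f"R(D) ~= {shannon_rdf_ar1(a, sigma_u2, D):.3f} bits/sample")
```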
Interplay Between Transmission Delay, Average Data Rate, and Performance in Output Feedback Control over Digital Communication Channels
The performance of a noisy linear time-invariant (LTI) plant, controlled over
a noiseless digital channel with transmission delay, is investigated in this
paper. The rate-limited channel connects the single measurement output of the
plant to its single control input through a causal, but otherwise arbitrary,
coder-controller pair. An information-theoretic approach is utilized to analyze
the minimal average data rate required to attain a given quadratic performance
level when the channel imposes a known constant delay on the transmitted data. This
infimum average data rate is shown to be lower bounded by minimizing the
directed information rate across a set of LTI filters and an additive white
Gaussian noise (AWGN) channel. It is demonstrated that the presence of time
delay in the channel increases the data rate needed to achieve a certain level
of performance. The applicability of the results is verified through a
numerical example. In particular, we show by simulations that when the optimal
filters are used but the AWGN channel (used in the lower bound) is replaced by
a simple scalar uniform quantizer, the resulting operational data rates are at
most around 0.3 bits above the lower bounds.
Comment: A less-detailed version of this paper has been accepted for
publication in the proceedings of ACC 201
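To give a feel for the quantizer-versus-AWGN gap reported above, here is a toy Python sketch (not the paper's coder-controller): it compares the empirical entropy of a scalar uniform quantizer applied to a Gaussian signal against an AWGN mutual-information benchmark whose noise variance matches the quantization error. The step size Delta and signal variance are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 1.0                      # std of the (Gaussian) signal entering the quantizer
Delta = 0.5                      # quantizer step size (illustrative)
x = rng.normal(0.0, sigma, 10**6)

# Operational rate: empirical entropy of the uniform-quantizer indices.
idx = np.round(x / Delta).astype(int)
_, counts = np.unique(idx, return_counts=True)
p = counts / counts.sum()
H = -(p * np.log2(p)).sum()      # bits/sample

# Information-theoretic benchmark: an AWGN "test channel" whose noise has
# the same variance as the quantization error, Delta^2/12.
snr = sigma**2 / (Delta**2 / 12)
I = 0.5 * np.log2(1 + snr)

print(f"quantizer rate ~= {H:.3f} bits, AWGN benchmark ~= {I:.3f} bits, gap ~= {H - I:.3f}")
```

For fine quantization this gap approaches the scalar space-filling loss 0.5*log2(2*pi*e/12) ~= 0.254 bits, consistent in spirit with the roughly 0.3 bits observed in the paper's simulations.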
An Upper Bound to Zero-Delay Rate Distortion via Kalman Filtering for Vector Gaussian Sources
We deal with zero-delay source coding of a vector Gaussian autoregressive
(AR) source subject to an average mean squared error (MSE) fidelity criterion.
Toward this end, we consider the nonanticipative rate distortion function
(NRDF) which is a lower bound to the causal and zero-delay rate distortion
function (RDF). We use the realization scheme with feedback proposed in [1] to
model the corresponding optimal "test-channel" of the NRDF, when considering
vector Gaussian AR(1) sources subject to an average MSE distortion. We give
conditions on the vector Gaussian AR(1) source to ensure asymptotic
stationarity of the realization scheme (bounded performance). Then, we encode
the vector innovations due to Kalman filtering via lattice quantization with
subtractive dither and memoryless entropy coding. This coding scheme provides a
tight upper bound to the zero-delay Gaussian RDF. We extend this result to
vector Gaussian AR sources of any finite order. Further, we show that for
infinite dimensional vector Gaussian AR sources of any finite order, the NRDF
coincides with the zero-delay RDF. Our theoretical framework is corroborated
with a simulation example.
Comment: 7 pages, 6 figures, accepted for publication in IEEE Information Theory Workshop (ITW)
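The following is a minimal scalar sketch of quantization with subtractive dither, the simplest instance of the lattice scheme used above. With dither uniform over a quantization cell, the reconstruction error is uniform and statistically independent of the input, which is what makes the additive test-channel analysis tight. For simplicity the quantizer is applied directly to an AR(1) signal rather than to Kalman-filter innovations, and the coefficient and step size are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def sdq(x, Delta, rng):
    """Scalar subtractive dithered quantizer (1-D lattice case)."""
    d = rng.uniform(-Delta/2, Delta/2, size=x.shape)
    q = Delta * np.round((x + d) / Delta)   # quantize the dithered input
    return q - d                            # decoder subtracts the shared dither

# Illustrative AR(1) source, standing in for the vector Gaussian AR source.
a, n = 0.9, 10**5
u = rng.normal(size=n)
x = np.zeros(n)
for k in range(1, n):
    x[k] = a * x[k-1] + u[k]

Delta = 0.4
err = sdq(x, Delta, rng) - x
print(f"error variance {err.var():.4f} (Delta^2/12 = {Delta**2/12:.4f})")
print(f"corr(error, source) = {np.corrcoef(err, x)[0, 1]:.4f}")  # ~0: independence
```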
Directed Data-Processing Inequalities for Systems with Feedback
We present novel data-processing inequalities relating the mutual information
and the directed information in systems with feedback. The internal blocks
within such systems are restricted only to be causal mappings, but are allowed
to be non-linear, stochastic and time varying. These blocks can for example
represent source encoders, decoders or even communication channels. Moreover,
the involved signals can be arbitrarily distributed. Our first main result
relates mutual and directed informations and can be interpreted as a law of
conservation of information flow. Our second main result is a pair of
data-processing inequalities (one the conditional version of the other) between
nested pairs of random sequences entirely within the closed loop. Our third
main result introduces and characterizes the notion of in-the-loop (ITL)
transmission rate for channel coding scenarios in which the messages are
internal to the loop. Interestingly, in this case the conventional notions of
transmission rate associated with the entropy of the messages and of channel
capacity based on maximizing the mutual information between the messages and
the output turn out to be inadequate. Instead, as we show, the ITL transmission
rate is the unique notion of rate for which a channel code attains zero error
probability if and only if such ITL rate does not exceed the corresponding
directed information rate from messages to decoded messages. We apply our
data-processing inequalities to show that the supremum of achievable (in the
usual channel coding sense) ITL transmission rates is upper bounded by the
supremum of the directed information rate across the communication channel.
Moreover, we present an example in which this upper bound is attained. Finally,
...
Comment: Submitted to Entropy. arXiv admin note: substantial text overlap with arXiv:1301.642
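As a concrete anchor for these notions, the sketch below exactly computes both the mutual information I(X^n; Y^n) and the directed information I(X^n -> Y^n) = sum_i I(X^i; Y_i | Y^{i-1}) for a binary symmetric channel used without feedback, where Massey's classical identity says the two coincide; the paper's conservation law is the closed-loop generalization. The crossover probability and block length are illustrative.

```python
import itertools
import numpy as np

eps, n = 0.1, 2   # BSC crossover and block length; i.i.d. uniform inputs, no feedback

def p_joint(xs, ys):
    """Exact joint probability p(x^n, y^n) = prod_i p(x_i) p(y_i | x_i)."""
    p = 1.0
    for x, y in zip(xs, ys):
        p *= 0.5 * ((1 - eps) if x == y else eps)
    return p

seqs = list(itertools.product([0, 1], repeat=n))
P = {(xs, ys): p_joint(xs, ys) for xs in seqs for ys in seqs}

def marg(fn):
    """Distribution of fn(xs, ys) under the exact joint P."""
    out = {}
    for (xs, ys), p in P.items():
        out[fn(xs, ys)] = out.get(fn(xs, ys), 0.0) + p
    return out

def mi(f, g):
    """I(f; g), computed exactly from P."""
    pf, pg = marg(f), marg(g)
    pfg = marg(lambda xs, ys: (f(xs, ys), g(xs, ys)))
    return sum(p * np.log2(p / (pf[a] * pg[b])) for (a, b), p in pfg.items() if p > 0)

I_mutual = mi(lambda xs, ys: xs, lambda xs, ys: ys)

# I(X^i; Y_i | Y^{i-1}) = I(X^i, Y^{i-1}; Y_i) - I(Y^{i-1}; Y_i)
I_directed = 0.0
for i in range(1, n + 1):
    I_directed += mi(lambda xs, ys, i=i: (xs[:i], ys[:i-1]), lambda xs, ys, i=i: ys[i-1])
    I_directed -= mi(lambda xs, ys, i=i: ys[:i-1], lambda xs, ys, i=i: ys[i-1])

print(f"I(X^n;Y^n) = {I_mutual:.4f} bits, I(X^n->Y^n) = {I_directed:.4f} bits")
```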
Comments on "A Framework for Control System Design Subject to Average Data-Rate Constraints"
Theorem 4.1 in the 2011 paper "A Framework for Control System Design Subject
to Average Data-Rate Constraints" allows one to lower bound average operational
data rates in feedback loops (including the situation in which encoder and
decoder have side information). Unfortunately, its proof is invalid.
In this note we first state the theorem and explain why its proof is flawed,
and then provide a correct proof under weaker assumptions.
Comment: Submitted to IEEE Transactions on Automatic Control