An Upper Bound to Zero-Delay Rate Distortion via Kalman Filtering for Vector Gaussian Sources
We deal with zero-delay source coding of a vector Gaussian autoregressive
(AR) source subject to an average mean squared error (MSE) fidelity criterion.
Toward this end, we consider the nonanticipative rate distortion function
(NRDF) which is a lower bound to the causal and zero-delay rate distortion
function (RDF). We use the realization scheme with feedback proposed in [1] to
model the corresponding optimal "test-channel" of the NRDF, when considering
vector Gaussian AR(1) sources subject to an average MSE distortion. We give
conditions on the vector Gaussian AR(1) source to ensure asymptotic
stationarity of the realization scheme (bounded performance). Then, we encode
the vector innovations due to Kalman filtering via lattice quantization with
subtractive dither and memoryless entropy coding. This coding scheme provides a
tight upper bound to the zero-delay Gaussian RDF. We extend this result to
vector Gaussian AR sources of any finite order. Further, we show that for
infinite dimensional vector Gaussian AR sources of any finite order, the NRDF
coincides with the zero-delay RDF. Our theoretical framework is corroborated
with a simulation example.
Comment: 7 pages, 6 figures, accepted for publication in IEEE Information Theory Workshop (ITW)
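The coding scheme in the abstract above quantizes the Kalman-filter innovations with a lattice quantizer and subtractive dither. A minimal one-dimensional sketch (a plain uniform quantizer standing in for the lattice; the step size and sample count are made-up illustrative values, not from the paper) shows the key property subtractive dithering provides: the reconstruction error is uniform on one quantization cell, independent of the input, with MSE step²/12.

```python
import numpy as np

rng = np.random.default_rng(0)

def dithered_quantize(x, step, rng):
    """Uniform quantizer with subtractive dither: the encoder adds
    dither u ~ Unif(-step/2, step/2) shared with the decoder,
    quantizes x + u, and the decoder subtracts the same dither."""
    u = rng.uniform(-step / 2, step / 2, size=x.shape)
    q = step * np.round((x + u) / step)  # value that would be entropy-coded
    return q - u                         # decoder reconstruction

# With subtractive dither the error is uniform and input-independent,
# so the empirical MSE should be close to step**2 / 12.
x = rng.normal(size=100_000)   # stand-in for the innovations sequence
step = 0.5
xhat = dithered_quantize(x, step, rng)
mse = np.mean((x - xhat) ** 2)
print(mse)
```

This input-independence of the error is what lets the innovations be treated as passing through an additive-noise "test channel" followed by memoryless entropy coding.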
Information Nonanticipative Rate Distortion Function and Its Applications
This paper investigates applications of nonanticipative Rate Distortion
Function (RDF) in a) zero-delay Joint Source-Channel Coding (JSCC) design based
on average and excess distortion probability, b) in bounding the Optimal
Performance Theoretically Attainable (OPTA) by noncausal and causal codes, and
computing the Rate Loss (RL) of zero-delay and causal codes with respect to
noncausal codes. These applications are described using two running examples,
the Binary Symmetric Markov Source with parameter p, (BSMS(p)) and the
multidimensional partially observed Gaussian-Markov source. For the
multidimensional Gaussian-Markov source with square error distortion, the
solution of the nonanticipative RDF is derived, its operational meaning using
JSCC design via a noisy coding theorem is shown by providing the optimal
encoding-decoding scheme over a vector Gaussian channel, and the RL of causal
and zero-delay codes with respect to noncausal codes is computed.
For the BSMS(p) with Hamming distortion, the solution of the nonanticipative
RDF is derived, the RL of causal codes with respect to noncausal codes is
computed, and an uncoded noisy coding theorem based on excess distortion
probability is shown. The information nonanticipative RDF is shown to be
equivalent to the nonanticipatory epsilon-entropy, which corresponds to the
classical RDF with an additional causality or nonanticipative condition imposed
on the optimal reproduction conditional distribution.
Comment: 34 pages, 12 figures; part of this paper was accepted for publication in the IEEE International Symposium on Information Theory (ISIT), 2014, and in the book Coordination Control of Distributed Systems in the series Lecture Notes in Control and Information Sciences, 201
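The abstract above reports a derived nonanticipative RDF for the multidimensional Gaussian-Markov source. In the scalar Gauss-Markov special case the nonanticipative RDF has a well-known closed form, sketched below; the function name and the parameter values in the example are illustrative assumptions, not taken from the paper.

```python
import math

def nrdf_gauss_markov(a, var_w, D):
    """Nonanticipative RDF (bits/sample) of the scalar Gauss-Markov
    source x_{t+1} = a*x_t + w_t, w_t ~ N(0, var_w), under MSE
    distortion level D: R_na(D) = max(0, 0.5*log2((a^2*D + var_w)/D)).
    This is the scalar special case of the vector solution the
    abstract discusses."""
    return max(0.0, 0.5 * math.log2((a * a * D + var_w) / D))

# Illustrative sweep: a = 0.9, unit driving-noise variance.
for D in (0.25, 0.5, 1.0):
    print(D, nrdf_gauss_markov(0.9, 1.0, D))
```

As expected, the required rate decreases as the allowed distortion D grows, and the rate is zero once D is large enough that the reproduction can ignore the source.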
A zero-delay sequential scheme for lossy coding of individual sequences
We consider adaptive sequential lossy coding of bounded individual sequences when the performance is measured by the sequentially accumulated mean squared distortion. The encoder and the decoder are connected via a noiseless channel of fixed capacity, and both are assumed to have zero delay. No probabilistic assumptions are made on how the sequence to be encoded is generated. For any bounded sequence of finite length, the distortion redundancy is defined as the normalized cumulative distortion of the sequential scheme minus the normalized cumulative distortion of the best scalar quantizer of the same rate that is matched to this particular sequence. We demonstrate the existence of a zero-delay sequential scheme which uses common randomization in the encoder and the decoder such that the normalized maximum distortion redundancy converges to zero as the length of the encoded sequence increases without bound.
Keywords: lossy source coding, scalar quantization, sequential prediction, individual sequences
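The reference class in this abstract is the best fixed scalar quantizer matched to the sequence. A minimal sketch of the quantity being compared against, the normalized cumulative squared distortion of one fixed-rate scalar quantizer, is below; the quantizer design (uniform cells on [0, 1]) and the test sequence are made up for illustration.

```python
import random

def scalar_quantizer_levels(rate_bits, lo=0.0, hi=1.0):
    """Midpoints of 2**rate_bits uniform cells on [lo, hi]: one fixed
    scalar quantizer of rate `rate_bits` bits per sample."""
    n_levels = 2 ** rate_bits
    step = (hi - lo) / n_levels
    return [lo + (i + 0.5) * step for i in range(n_levels)]

def cumulative_distortion(seq, levels):
    """Normalized cumulative squared distortion of nearest-level
    quantization of seq -- the benchmark in the redundancy definition."""
    total = 0.0
    for x in seq:
        xhat = min(levels, key=lambda c: (c - x) ** 2)
        total += (x - xhat) ** 2
    return total / len(seq)

random.seed(0)
seq = [random.random() for _ in range(10_000)]   # an arbitrary bounded sequence
levels = scalar_quantizer_levels(rate_bits=3)    # rate R = 3 bits/sample
d = cumulative_distortion(seq, levels)
# For cells of width 1/8 the per-sample MSE is near (1/8)**2 / 12.
print(d)
```

The redundancy of a sequential scheme is then its own normalized cumulative distortion minus the minimum of this quantity over all scalar quantizers of the same rate.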