
    Improved Upper Bounds to the Causal Quadratic Rate-Distortion Function for Gaussian Stationary Sources

    We improve the existing achievable rate regions for causal and for zero-delay source coding of stationary Gaussian sources under an average mean squared error (MSE) distortion measure. To begin with, we find a closed-form expression for the information-theoretic causal rate-distortion function (RDF) under this distortion measure, denoted by $R_c^{it}(D)$, for first-order Gauss-Markov processes. $R_c^{it}(D)$ is a lower bound to the optimal performance theoretically attainable (OPTA) by any causal source code, namely $R_c^{op}(D)$. We show that, for Gaussian sources, the latter can also be upper bounded as $R_c^{op}(D) \leq R_c^{it}(D) + 0.5\log_2(2\pi e)$ bits/sample. In order to analyze $R_c^{it}(D)$ for arbitrary zero-mean Gaussian stationary sources, we introduce $\bar{R}_c^{it}(D)$, the information-theoretic causal RDF when the reconstruction error is jointly stationary with the source. Based upon $\bar{R}_c^{it}(D)$, we derive three closed-form upper bounds to the additive rate loss defined as $\bar{R}_c^{it}(D) - R(D)$, where $R(D)$ denotes Shannon's RDF. Two of these bounds are strictly smaller than 0.5 bits/sample at all rates. These bounds differ from one another in their tightness and ease of evaluation; the tighter the bound, the more involved its evaluation. We then show that, for any source spectral density and any positive distortion $D \leq \sigma_x^2$, $\bar{R}_c^{it}(D)$ can be realized by an AWGN channel surrounded by a unique set of causal pre-, post-, and feedback filters. We show that finding such filters constitutes a convex optimization problem. In order to solve the latter, we propose an iterative optimization procedure that yields the optimal filters and is guaranteed to converge to $\bar{R}_c^{it}(D)$. Finally, by establishing a connection to feedback quantization we design a causal and a zero-delay coding scheme which, for Gaussian sources, achieves...
    Comment: 47 pages, revised version submitted to IEEE Trans. Information Theory
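    For quick reference, the bounds stated in this abstract can be collected in a single display (a restatement of the claims above, not an additional result; notation as in the abstract):

```latex
% R(D): Shannon's RDF;  R_c^{it}(D): information-theoretic causal RDF;
% \bar{R}_c^{it}(D): causal RDF with jointly stationary reconstruction error;
% R_c^{op}(D): OPTA over causal source codes.
\[
  R(D) \;\le\; \bar{R}_c^{it}(D) \;<\; R(D) + \tfrac{1}{2}\ \text{bits/sample},
  \qquad
  R_c^{it}(D) \;\le\; R_c^{op}(D) \;\le\; R_c^{it}(D) + \tfrac{1}{2}\log_2(2\pi e)\ \text{bits/sample}.
\]
```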

    Applications of Information Nonanticipative Rate Distortion Function

    The objective of this paper is to further investigate various applications of the information Nonanticipative Rate Distortion Function (NRDF) by discussing two working examples, the Binary Symmetric Markov Source with parameter p (BSMS(p)) with Hamming distance distortion, and the multidimensional partially observed Gaussian-Markov source. For the BSMS(p), we give the solution to the NRDF, and we use it to compute the Rate Loss (RL) of causal codes with respect to noncausal codes. For the multidimensional Gaussian-Markov source, we give the solution to the NRDF, we show its operational meaning via joint source-channel matching over a vector of parallel Gaussian channels, and we compute the RL of causal and zero-delay codes with respect to noncausal codes.
    Comment: 5 pages, 3 figures, accepted for publication in the IEEE International Symposium on Information Theory (ISIT) proceedings, 2014
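    For concreteness, the two quantities referred to above can be written at the definition level as follows (a sketch only; R^{na}(D) denotes the NRDF and R(D) Shannon's RDF, which captures the performance of noncausal codes):

```latex
% Single-letter Hamming distortion and the rate loss (RL) of causal codes
% relative to noncausal codes, written at the level of the rate functions.
\[
  d_H(x,\hat{x}) =
  \begin{cases}
    0, & x = \hat{x},\\
    1, & x \ne \hat{x},
  \end{cases}
  \qquad
  \mathrm{RL}(D) = R^{na}(D) - R(D).
\]
```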

    An Upper Bound to Zero-Delay Rate Distortion via Kalman Filtering for Vector Gaussian Sources

    We deal with zero-delay source coding of a vector Gaussian autoregressive (AR) source subject to an average mean squared error (MSE) fidelity criterion. Toward this end, we consider the nonanticipative rate distortion function (NRDF) which is a lower bound to the causal and zero-delay rate distortion function (RDF). We use the realization scheme with feedback proposed in [1] to model the corresponding optimal "test-channel" of the NRDF, when considering vector Gaussian AR(1) sources subject to an average MSE distortion. We give conditions on the vector Gaussian AR(1) source to ensure asymptotic stationarity of the realization scheme (bounded performance). Then, we encode the vector innovations due to Kalman filtering via lattice quantization with subtractive dither and memoryless entropy coding. This coding scheme provides a tight upper bound to the zero-delay Gaussian RDF. We extend this result to vector Gaussian AR sources of any finite order. Further, we show that for infinite dimensional vector Gaussian AR sources of any finite order, the NRDF coincides with the zero-delay RDF. Our theoretical framework is corroborated with a simulation example.
    Comment: 7 pages, 6 figures, accepted for publication in IEEE Information Theory Workshop (ITW)
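    The dithered quantization step mentioned above can be illustrated in isolation. The sketch below is a minimal scalar subtractive-dither uniform quantizer, not the paper's full scheme (which quantizes the Kalman-filter innovations and adds memoryless entropy coding); the function name and step size are illustrative choices.

```python
import numpy as np

def dithered_quantize(x, step, rng):
    """Scalar subtractive-dither uniform quantization (illustrative sketch)."""
    # Dither shared by encoder and decoder, uniform over one quantization cell.
    u = rng.uniform(-step / 2, step / 2, size=x.shape)
    # Encoder: uniform quantizer applied to the dithered input.
    q = step * np.round((x + u) / step)
    # Decoder: subtract the dither; the error is uniform and independent of x.
    return q - u

rng = np.random.default_rng(0)
x = rng.normal(size=10_000)
x_hat = dithered_quantize(x, step=0.5, rng=rng)
print(np.var(x_hat - x), 0.5 ** 2 / 12)  # error variance is close to step^2 / 12
```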

    Zero-Delay Rate Distortion via Filtering for Vector-Valued Gaussian Sources

    We deal with zero-delay source coding of a vector-valued Gauss-Markov source subject to a mean-squared error (MSE) fidelity criterion, characterized by the operational zero-delay vector-valued Gaussian rate distortion function (RDF). We address this problem by considering the nonanticipative RDF (NRDF), which is a lower bound to the causal optimal performance theoretically attainable (OPTA) function and to the operational zero-delay RDF. We recall the realization that corresponds to the optimal "test-channel" of the Gaussian NRDF when considering a vector Gauss-Markov source subject to an MSE distortion over a finite time horizon. Then, we introduce sufficient conditions for the existence of a solution to this problem over the infinite time horizon. For the asymptotic regime, we use the asymptotic characterization of the Gaussian NRDF to provide a new equivalent realization scheme with feedback, which is characterized by a resource allocation (reverse-waterfilling) problem across the dimensions of the vector source. We leverage the new realization to derive a predictive coding scheme via lattice quantization with subtractive dither and joint memoryless entropy coding. This coding scheme offers an upper bound to the operational zero-delay vector-valued Gaussian RDF. When scalar quantization is used, then for "r" active dimensions of the vector Gauss-Markov source the gap between the obtained lower bound and the theoretical upper bound is at most 0.254r + 1 bits/vector. We further show that, when vector quantization is used and infinite dimensional Gauss-Markov sources are assumed, this gap becomes negligible, i.e., the Gaussian NRDF approximates the operational zero-delay Gaussian RDF. We also extend our results to vector-valued Gaussian sources of any finite memory under mild conditions. Our theoretical framework is demonstrated with illustrative numerical experiments.
    Comment: 32 pages, 9 figures, published in the IEEE Journal of Selected Topics in Signal Processing
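    The reverse-waterfilling allocation mentioned above can be sketched generically. The snippet below is the textbook allocation over parallel Gaussian components, where each component receives distortion d_i = min(theta, sigma_i^2) and the water level theta is set by the total budget; the paper's per-dimension allocation for the Gauss-Markov source may differ in its details, so treat this only as an illustration.

```python
import numpy as np

def reverse_waterfill(variances, total_distortion, iters=100):
    """Textbook reverse-waterfilling over parallel Gaussian components (sketch)."""
    variances = np.asarray(variances, dtype=float)
    lo, hi = 0.0, float(variances.max())
    for _ in range(iters):  # bisection on the water level theta
        theta = 0.5 * (lo + hi)
        if np.minimum(theta, variances).sum() > total_distortion:
            hi = theta
        else:
            lo = theta
    d = np.minimum(theta, variances)             # per-component distortions
    rate = 0.5 * np.log2(variances / d).sum()    # bits per vector sample
    return d, rate

d, rate = reverse_waterfill([4.0, 2.0, 0.5, 0.1], total_distortion=1.0)
print(d, rate)
```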

    Information Nonanticipative Rate Distortion Function and Its Applications

    This paper investigates applications of the nonanticipative Rate Distortion Function (RDF) in a) zero-delay Joint Source-Channel Coding (JSCC) design based on average and excess distortion probability, and b) bounding the Optimal Performance Theoretically Attainable (OPTA) by noncausal and causal codes and computing the Rate Loss (RL) of zero-delay and causal codes with respect to noncausal codes. These applications are described using two running examples, the Binary Symmetric Markov Source with parameter p (BSMS(p)) and the multidimensional partially observed Gaussian-Markov source. For the multidimensional Gaussian-Markov source with squared-error distortion, the solution of the nonanticipative RDF is derived, its operational meaning using JSCC design via a noisy coding theorem is shown by providing the optimal encoding-decoding scheme over a vector Gaussian channel, and the RL of causal and zero-delay codes with respect to noncausal codes is computed. For the BSMS(p) with Hamming distortion, the solution of the nonanticipative RDF is derived, the RL of causal codes with respect to noncausal codes is computed, and an uncoded noisy coding theorem based on excess distortion probability is shown. The information nonanticipative RDF is shown to be equivalent to the nonanticipatory epsilon-entropy, which corresponds to the classical RDF with an additional causality or nonanticipative condition imposed on the optimal reproduction conditional distribution.
    Comment: 34 pages, 12 figures; part of this paper was accepted for publication in the IEEE International Symposium on Information Theory (ISIT), 2014, and in the book Coordination Control of Distributed Systems, Lecture Notes in Control and Information Sciences, 201
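    A minimal way to write the extra causality (nonanticipative) condition mentioned in the last sentence, assuming the usual formulation in terms of per-time-step reproduction kernels over a horizon 0, ..., n, is the following:

```latex
% Classical RDF: minimize mutual information over all reproduction kernels
% P_{\hat{X}^n | X^n} meeting the distortion constraint. The nonanticipative
% (causal) version additionally requires, for every time t, that
\[
  P_{\hat{X}_t \mid \hat{X}^{t-1},\, X^{n}}
  \;=\;
  P_{\hat{X}_t \mid \hat{X}^{t-1},\, X^{t}},
  \qquad t = 0, 1, \dots, n,
\]
% i.e., the reproduction at time t may depend on past reproductions and on
% source symbols only up to (and including) time t.
```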