36 research outputs found

    Improved Upper Bounds to the Causal Quadratic Rate-Distortion Function for Gaussian Stationary Sources

    We improve the existing achievable rate regions for causal and for zero-delay source coding of stationary Gaussian sources under an average mean squared error (MSE) distortion measure. To begin with, we find a closed-form expression for the information-theoretic causal rate-distortion function (RDF) under this distortion measure, denoted by R_c^{it}(D), for first-order Gauss-Markov processes. R_c^{it}(D) is a lower bound to the optimal performance theoretically attainable (OPTA) by any causal source code, namely R_c^{op}(D). We show that, for Gaussian sources, the latter can also be upper bounded as R_c^{op}(D) \leq R_c^{it}(D) + 0.5 \log_{2}(2\pi e) bits/sample. In order to analyze R_c^{it}(D) for arbitrary zero-mean Gaussian stationary sources, we introduce \bar{R}_c^{it}(D), the information-theoretic causal RDF when the reconstruction error is jointly stationary with the source. Based upon \bar{R}_c^{it}(D), we derive three closed-form upper bounds to the additive rate loss defined as \bar{R}_c^{it}(D) - R(D), where R(D) denotes Shannon's RDF. Two of these bounds are strictly smaller than 0.5 bits/sample at all rates. These bounds differ from one another in their tightness and ease of evaluation; the tighter the bound, the more involved its evaluation. We then show that, for any source spectral density and any positive distortion D \leq \sigma_{x}^{2}, \bar{R}_c^{it}(D) can be realized by an AWGN channel surrounded by a unique set of causal pre-, post-, and feedback filters. We show that finding such filters constitutes a convex optimization problem. In order to solve the latter, we propose an iterative optimization procedure that yields the optimal filters and is guaranteed to converge to \bar{R}_c^{it}(D). Finally, by establishing a connection to feedback quantization, we design a causal and a zero-delay coding scheme which, for Gaussian sources, achieves...
    Comment: 47 pages, revised version submitted to IEEE Trans. Information Theory
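    Read together, the bounds stated in this abstract chain as follows (a summary sketch in the abstract's own notation, not a restatement of the paper's theorems; the first inequality simply reflects that the causal RDF is a constrained version of Shannon's RDF):

        % R(D): Shannon's RDF; R_c^{it}(D): information-theoretic causal RDF;
        % R_c^{op}(D): causal OPTA; \bar{R}_c^{it}(D): causal RDF with jointly
        % stationary reconstruction error.
        \[
          R(D) \;\le\; R_c^{it}(D) \;\le\; R_c^{op}(D)
               \;\le\; R_c^{it}(D) + \tfrac{1}{2}\log_2(2\pi e)
          \quad \text{bits/sample},
        \]
        \[
          \bar{R}_c^{it}(D) - R(D) \;<\; 0.5 \ \text{bits/sample}
          \quad \text{(guaranteed by two of the three closed-form upper bounds).}
        \]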

    Applications of Information Nonanticipative Rate Distortion Function

    The objective of this paper is to further investigate various applications of the information Nonanticipative Rate Distortion Function (NRDF) by discussing two working examples: the Binary Symmetric Markov Source with parameter p (BSMS(p)) with Hamming distance distortion, and the multidimensional partially observed Gauss-Markov source. For the BSMS(p), we give the solution to the NRDF and use it to compute the Rate Loss (RL) of causal codes with respect to noncausal codes. For the multidimensional Gauss-Markov source, we give the solution to the NRDF, show its operational meaning via joint source-channel matching over a vector of parallel Gaussian channels, and compute the RL of causal and zero-delay codes with respect to noncausal codes.
    Comment: 5 pages, 3 figures, accepted for publication in IEEE International Symposium on Information Theory (ISIT) proceedings, 201
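    For reference, the rate-loss quantity computed in this paper can be written as follows (assumed notation, not taken verbatim from the paper; R^{na}(D) denotes the NRDF and R(D) the classical noncausal RDF):

        % Rate loss of causal/zero-delay coding relative to noncausal coding,
        % evaluated at the same distortion level D (assumed notation):
        \[
          \mathrm{RL}(D) \;\triangleq\; R^{na}(D) - R(D).
        \]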

    Causal Rate Distortion Function on Abstract Alphabets: Optimal Reconstruction and Properties

    A causal rate distortion function with a general fidelity criterion is formulated on abstract alphabets, and a coding theorem is derived. Existence of the minimizing kernel is shown using the topology of weak convergence of probability measures. The optimal reconstruction kernel, which is causal, is derived, and certain properties of the causal rate distortion function are presented.
    Comment: 5 pages, Submitted to International Symposium on Information Theory (ISIT) 201
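    For orientation, one common way to pose this object on a finite horizon is sketched below (assumed notation; the paper itself works on abstract alphabets with a general fidelity criterion, and related formulations are stated via directed information). Causality means the reconstruction kernel at time i may depend on the source symbols x^i = (x_0, ..., x_i) and past reconstructions y^{i-1}, but not on future source symbols:

        % Sketch of a finite-horizon causal RDF (assumed notation):
        \[
          R^{c}_{0,n}(D) \;=\;
          \inf_{\substack{\{P(dy_i \mid y^{i-1},\, x^{i})\}_{i=0}^{n}:\;
                \mathbb{E}\big[\tfrac{1}{n+1}\sum_{i=0}^{n} d(x_i, y_i)\big] \le D}}
          I(X^{n}; Y^{n}).
        \]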

    An Upper Bound to Zero-Delay Rate Distortion via Kalman Filtering for Vector Gaussian Sources

    We deal with zero-delay source coding of a vector Gaussian autoregressive (AR) source subject to an average mean squared error (MSE) fidelity criterion. Toward this end, we consider the nonanticipative rate distortion function (NRDF), which is a lower bound to the causal and zero-delay rate distortion function (RDF). We use the realization scheme with feedback proposed in [1] to model the corresponding optimal "test channel" of the NRDF when considering vector Gaussian AR(1) sources subject to an average MSE distortion. We give conditions on the vector Gaussian AR(1) source that ensure asymptotic stationarity of the realization scheme (bounded performance). Then, we encode the vector innovations due to Kalman filtering via lattice quantization with subtractive dither and memoryless entropy coding. This coding scheme provides a tight upper bound to the zero-delay Gaussian RDF. We extend this result to vector Gaussian AR sources of any finite order. Further, we show that, for infinite-dimensional vector Gaussian AR sources of any finite order, the NRDF coincides with the zero-delay RDF. Our theoretical framework is corroborated with a simulation example.
    Comment: 7 pages, 6 figures, accepted for publication in IEEE Information Theory Workshop (ITW)
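    A minimal, self-contained sketch of the general idea follows. It is not the paper's scheme: a scalar AR(1) source stands in for the vector Gauss-Markov source, a one-step predictor driven by past reconstructions stands in for the Kalman filter, and a uniform scalar quantizer with subtractive dither stands in for the lattice quantizer; all parameter values are assumptions chosen for illustration.

        # Illustrative sketch only (see the caveats above); Python with NumPy.
        import numpy as np

        rng = np.random.default_rng(0)

        a = 0.9          # AR(1) coefficient (assumed)
        sigma_w = 1.0    # standard deviation of the source innovation (assumed)
        delta = 0.5      # quantizer step size (assumed); trades rate for distortion
        n = 10_000       # number of samples

        # Gauss-Markov source: x_k = a * x_{k-1} + w_k.
        x = np.zeros(n)
        for k in range(1, n):
            x[k] = a * x[k - 1] + sigma_w * rng.standard_normal()

        # Zero-delay encode/decode loop with subtractive dither.  The predictor
        # uses past reconstructions, which the decoder also has, so encoder and
        # decoder stay synchronized; the dither sequence is assumed shared.
        x_hat = np.zeros(n)
        indices = []
        for k in range(n):
            pred = a * x_hat[k - 1] if k > 0 else 0.0      # prediction of x_k
            innov = x[k] - pred                            # prediction error ("innovation")
            dither = rng.uniform(-delta / 2, delta / 2)    # shared dither sample
            idx = int(np.round((innov + dither) / delta))  # integer sent to the decoder
            innov_hat = delta * idx - dither               # subtractive-dither reconstruction
            x_hat[k] = pred + innov_hat
            indices.append(idx)

        # Empirical distortion and the entropy of the transmitted indices, a rough
        # proxy for the operational rate of memoryless entropy coding.
        mse = np.mean((x - x_hat) ** 2)
        vals, counts = np.unique(indices, return_counts=True)
        p = counts / counts.sum()
        rate = -np.sum(p * np.log2(p))
        print(f"empirical MSE: {mse:.4f}, index entropy: {rate:.3f} bits/sample")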

    Fixed-Rate Zero-Delay Source Coding for Stationary Vector-Valued Gauss-Markov Sources
