
    Second-Order Coding Rates for Conditional Rate-Distortion

    This paper characterizes the second-order coding rates for lossy source coding with side information available at both the encoder and the decoder. We first provide non-asymptotic bounds for this problem and then specialize them to three scenarios: discrete memoryless sources, Gaussian sources, and Markov sources. We obtain the second-order coding rates for these settings. Interestingly, the second-order coding rate for Gaussian source coding with Gaussian side information available at both the encoder and the decoder is the same as that for Gaussian source coding without side information. Furthermore, regardless of the variance of the side information, the dispersion is 1/2 nats squared per source symbol. Comment: 20 pages, 2 figures, second-order coding rates, finite blocklength, network information theory
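
    To place the dispersion value quoted above in context, second-order (finite-blocklength) results of this kind are usually stated as an expansion of the minimum achievable rate. The following display is a sketch in assumed notation (R(D) for the conditional rate-distortion function, V for the dispersion, Q^{-1} for the inverse Gaussian tail function), not the paper's own statement:

        R(n, D, \epsilon) \approx R(D) + \sqrt{\frac{V}{n}}\, Q^{-1}(\epsilon)

    Read this way, the abstract's claim is that for Gaussian sources with Gaussian side information, V = 1/2 nat^2 per source symbol, independent of the side-information variance.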

    The Likelihood Encoder for Lossy Source Compression

    In this work, a likelihood encoder is studied in the context of lossy source compression. The analysis of the likelihood encoder is based on a soft-covering lemma. It is demonstrated that the use of a likelihood encoder together with the soft-covering lemma gives alternative achievability proofs for classical source coding problems. The case of the rate-distortion function with side information at the decoder (i.e., the Wyner-Ziv problem) is carefully examined, and an application of the likelihood encoder to the multi-terminal source coding inner bound (i.e., the Berger-Tung region) is outlined. Comment: 5 pages, 2 figures, ISIT 201
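
    To make the encoder concrete, a likelihood encoder of the kind analyzed here can be sketched as follows (the notation is assumed for illustration: a codebook of codewords u^n(m), m = 1, ..., 2^{nR}, drawn i.i.d. from P_U, and a memoryless test channel P_{X|U}). The encoder chooses the transmitted index stochastically, in proportion to the likelihood of the observed source sequence under each codeword:

        P_{enc}(m \mid x^n) \propto P_{X^n \mid U^n}\bigl(x^n \mid u^n(m)\bigr) = \prod_{i=1}^{n} P_{X|U}\bigl(x_i \mid u_i(m)\bigr)

    The soft-covering lemma is then used to argue that the distribution induced by this stochastic choice approximates the intended joint distribution, which is the route to the achievability proofs mentioned above.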

    Joint Wyner-Ziv/Dirty Paper coding by modulo-lattice modulation

    The combination of source coding with decoder side information (the Wyner-Ziv problem) and channel coding with encoder side information (the Gel'fand-Pinsker problem) can be optimally solved using the separation principle. In this work we show an alternative scheme for the quadratic-Gaussian case, which merges source and channel coding. This scheme achieves the optimal performance by applying modulo-lattice modulation to the analog source. It thus avoids the complexity of quantization and channel decoding, leaving only the task of "shaping". Furthermore, for high signal-to-noise ratio (SNR), the scheme approaches the optimal performance using an SNR-independent encoder, making it robust to unknown SNR at the encoder. Comment: Submitted to IEEE Transactions on Information Theory. Presented in part at ISIT-2006, Seattle. New version after review
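
    A rough sketch of the modulo-lattice modulation structure (the notation is assumed here, not taken from the paper: source s with decoder side information j, channel interference i known at the encoder, channel noise z, dither d, lattice \Lambda, and scaling factors \alpha and \beta): the encoder transmits

        x = [\beta s + d - \alpha i] \bmod \Lambda

    and, upon receiving y = x + i + z, the decoder computes

        r = [\alpha y - d - \beta j] \bmod \Lambda = [\beta(s - j) + \alpha z - (1-\alpha)x] \bmod \Lambda

    so that, when the bracketed term falls inside the fundamental cell of \Lambda, the estimate \hat{s} = j + r/\beta recovers the source up to an additive noise term, with no explicit quantization or channel decoding. This is only a schematic of the general technique; the paper's exact coefficients and analysis may differ.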

    The Likelihood Encoder for Lossy Compression

    A likelihood encoder is studied in the context of lossy source compression. The analysis of the likelihood encoder is based on the soft-covering lemma. It is demonstrated that the use of a likelihood encoder together with the soft-covering lemma yields simple achievability proofs for classical source coding problems. The cases of the point-to-point rate-distortion function, the rate-distortion function with side information at the decoder (i.e., the Wyner-Ziv problem), and the multi-terminal source coding inner bound (i.e., the Berger-Tung problem) are examined in this paper. Furthermore, a non-asymptotic analysis is used for the point-to-point case to examine the upper bound on the excess distortion provided by this method. The likelihood encoder is also related to a recent alternative technique that uses properties of random binning.
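
    As a small illustration of how such an encoder operates in the point-to-point case, the following is a minimal simulation sketch; everything here, from the Bernoulli(1/2) source and BSC(D) test channel to the parameter values, is an assumption chosen for illustration rather than the paper's setup:

        import numpy as np

        rng = np.random.default_rng(0)

        n = 20           # blocklength (kept tiny so the codebook fits in memory)
        R = 0.6          # rate in bits per source symbol
        D = 0.11         # assumed test-channel crossover probability (target distortion level)
        M = 2 ** int(np.ceil(R * n))   # codebook size, roughly 2^{nR}

        # Random codebook: each codeword u^n(m) drawn i.i.d. Bernoulli(1/2).
        codebook = rng.integers(0, 2, size=(M, n))

        def likelihood_encode(x):
            # Choose index m with probability proportional to P_{X^n|U^n}(x^n | u^n(m)),
            # where the assumed test channel is a BSC(D).
            dist = np.sum(codebook != x, axis=1)                 # Hamming distances
            log_lik = dist * np.log(D) + (n - dist) * np.log(1 - D)
            log_lik -= log_lik.max()                             # numerical stability
            p = np.exp(log_lik)
            return rng.choice(M, p=p / p.sum())

        x = rng.integers(0, 2, size=n)     # Bernoulli(1/2) source sequence
        m = likelihood_encode(x)           # stochastic likelihood encoder
        x_hat = codebook[m]                # decoder outputs the selected codeword
        print("empirical Hamming distortion:", np.mean(x != x_hat))

    At such toy parameters the empirical distortion only loosely tracks the test-channel crossover probability D, which is the kind of finite-blocklength behavior that the non-asymptotic excess-distortion analysis mentioned above is concerned with.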