
    Successive Refinement with Decoder Cooperation and its Channel Coding Duals

    We study cooperation in multi-terminal source coding models involving successive refinement. Specifically, we study the case of a single encoder and two decoders, where the encoder provides a common description to both decoders and a private description to only one of them. The decoders cooperate via cribbing, i.e., the decoder with access only to the common description is allowed to observe, in addition, a deterministic function of the reconstruction symbols produced by the other. We characterize the fundamental performance limits in the respective settings of non-causal, strictly causal and causal cribbing. We use a new coding scheme, referred to as Forward Encoding and Block Markov Decoding, which is a variant of one recently used by Cuff and Zhao for coordination via implicit communication. Finally, we use the insight gained to introduce and solve some dual channel coding scenarios involving Multiple Access Channels with cribbing. Comment: 55 pages, 15 figures, 8 tables, submitted to IEEE Transactions on Information Theory. A shorter version submitted to ISIT 201
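
    As background for the setting above (this is the classical baseline without decoder cooperation, not a result stated in the abstract), the single-encoder, two-decoder successive refinement region for a memoryless source X with distortion targets (D_1, D_2) consists of the rate pairs (R_1, R_2) for which there exist reconstructions \hat{X}_1, \hat{X}_2 satisfying
    \[
        R_1 \ge I(X;\hat{X}_1), \qquad
        R_1 + R_2 \ge I(X;\hat{X}_1,\hat{X}_2), \qquad
        \mathbb{E}\, d_k(X,\hat{X}_k) \le D_k, \quad k = 1,2.
    \]
    Cribbing, as studied in the abstract, adds a cooperation link on top of this baseline and can only enlarge the set of achievable rate-distortion tuples.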

    Lecture Notes on Network Information Theory

    These lecture notes have been converted to a book titled Network Information Theory published recently by Cambridge University Press. This book provides a significantly expanded exposition of the material in the lecture notes as well as problems and bibliographic notes at the end of each chapter. The authors are currently preparing a set of slides based on the book that will be posted in the second half of 2012. More information about the book can be found at http://www.cambridge.org/9781107008731/. The previous (and obsolete) version of the lecture notes can be found at http://arxiv.org/abs/1001.3404v4/

    Side-information Scalable Source Coding

    The problem of side-information scalable (SI-scalable) source coding is considered in this work, where the encoder constructs a progressive description such that the receiver with high-quality side information will be able to truncate the bitstream and reconstruct the source in the rate-distortion sense, while the receiver with low-quality side information will have to receive further data in order to decode. We provide inner and outer bounds for general discrete memoryless sources. The achievable region is shown to be tight for the case in which either of the decoders requires a lossless reconstruction, as well as for the case with degraded deterministic distortion measures. Furthermore, we show that the gap between the achievable region and the outer bounds can be bounded by a constant when the squared-error distortion measure is used. The notion of perfectly scalable coding is introduced for the case in which both stages operate on the Wyner-Ziv bound, and necessary and sufficient conditions are given for sources satisfying a mild support condition. Using SI-scalable coding and successive refinement Wyner-Ziv coding as basic building blocks, a complete characterization is provided for the important quadratic Gaussian source with multiple jointly Gaussian side informations, where the side information quality does not have to be monotonic along the scalable coding order. A partial result is provided for the doubly symmetric binary source with Hamming distortion when the worse side information is a constant, for which one of the outer bounds is strictly tighter than the other. Comment: 35 pages, submitted to IEEE Transactions on Information Theory
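
    For reference (standard background with notation of my own, not taken from the abstract), the single-stage benchmark that each stage of a perfectly scalable code operates on is the Wyner-Ziv rate-distortion function, which for jointly Gaussian (X, Y) under squared-error distortion is
    \[
        R_{\mathrm{WZ}}(D) = \max\!\left\{ \frac{1}{2}\log_2 \frac{\sigma^2_{X|Y}}{D},\; 0 \right\},
    \]
    where \sigma^2_{X|Y} is the conditional variance of X given the side information Y. This coincides with the conditional rate-distortion function R_{X|Y}(D), i.e., there is no rate loss in the quadratic Gaussian case.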

    Secure Multiterminal Source Coding with Side Information at the Eavesdropper

    The problem of secure multiterminal source coding with side information at the eavesdropper is investigated. This scenario consists of a main encoder (referred to as Alice) that wishes to compress a single source while simultaneously satisfying the desired requirements on the distortion level at a legitimate receiver (referred to as Bob) and on the equivocation rate (average uncertainty) at an eavesdropper (referred to as Eve). The presence of a (public) rate-limited link between Alice and Bob is further assumed. In this setting, Eve perfectly observes the information bits sent by Alice to Bob and also has access to a correlated source which can be used as side information. A second encoder (referred to as Charlie) helps Bob in estimating Alice's source by sending a compressed version of its own correlated observation via a (private) rate-limited link, which is only observed by Bob. In fact, the problem at hand can be seen as a unification of the Berger-Tung and secure source coding setups. Inner and outer bounds on the so-called rates-distortion-equivocation region are derived. The inner region turns out to be tight for two cases: (i) uncoded side information at Bob and (ii) lossless reconstruction of both sources at Bob (secure distributed lossless compression). Application examples to secure lossy source coding of Gaussian and binary sources in the presence of Gaussian and binary/ternary (respectively) side information are also considered. Optimal coding schemes are characterized for some cases of interest where the statistical differences between the side information at the decoders and the presence of a non-zero distortion at Bob can be fully exploited to guarantee secrecy. Comment: 26 pages, 16 figures, 2 tables
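
    For reference (the notation here is assumed for illustration, not taken from the abstract), writing A^n for Alice's source, J for the public message observed by both Bob and Eve, and E^n for Eve's side information, an equivocation-rate requirement of level \Delta takes the form
    \[
        \frac{1}{n} H\!\left(A^n \,\middle|\, J, E^n\right) \ge \Delta - \epsilon .
    \]
    Since the equivocation can never exceed \frac{1}{n} H(A^n \mid E^n) = H(A \mid E) for a memoryless source, \Delta = H(A \mid E) corresponds to Eve learning nothing beyond what her own side information already reveals.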

    Lossy Source Transmission over the Relay Channel

    Lossy transmission over a relay channel in which the relay has access to correlated side information is considered. First, a joint source-channel decode-and-forward scheme is proposed for general discrete memoryless sources and channels. Then the Gaussian relay channel, in which the source and the side information are jointly Gaussian, is analyzed. For this Gaussian model, several new source-channel cooperation schemes are introduced and analyzed in terms of the squared-error distortion at the destination. A comparison of the proposed upper bounds with the cut-set lower bound is given, and it is seen that joint source-channel cooperation improves the reconstruction quality significantly. Moreover, the performance of the joint code is close to the lower bound on distortion for a wide range of source and channel parameters. Comment: Proceedings of the 2008 IEEE International Symposium on Information Theory, Toronto, ON, Canada, July 6-11, 2008
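
    As a rough benchmark for the distortion comparison mentioned above (a generic sketch, not the paper's exact cut-set expression), the point-to-point joint source-channel converse for a Gaussian source of variance \sigma_X^2 under squared-error distortion, with one channel use per source sample over a link of capacity C, reads
    \[
        R(D) = \frac{1}{2}\log_2\frac{\sigma_X^2}{D} \le C
        \quad\Longrightarrow\quad
        D \ge \sigma_X^2\, 2^{-2C}.
    \]
    Roughly speaking, the cut-set bound of the relay channel plays the role of C here, which yields a lower bound on distortion against which the proposed cooperation schemes are compared.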