
    The Likelihood Encoder for Lossy Compression

    A likelihood encoder is studied in the context of lossy source compression. The analysis of the likelihood encoder is based on the soft-covering lemma. It is demonstrated that the use of a likelihood encoder together with the soft-covering lemma yields simple achievability proofs for classical source coding problems. The cases of the point-to-point rate-distortion function, the rate-distortion function with side information at the decoder (i.e., the Wyner-Ziv problem), and the multi-terminal source coding inner bound (i.e., the Berger-Tung problem) are examined in this paper. Furthermore, a non-asymptotic analysis is used for the point-to-point case to examine the upper bound on the excess distortion provided by this method. The likelihood encoder is also related to a recent alternative technique using properties of random binning.
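
    For reference, the first two quantities named above have well-known single-letter forms (standard textbook definitions, not this paper's notation or proof). For a source $X \sim p(x)$, reconstruction $\hat{X}$, and distortion measure $d$, the point-to-point rate-distortion function is

        R(D) = \min_{p(\hat{x}|x):\ \mathbb{E}[d(X,\hat{X})] \le D} I(X;\hat{X}),

    and with side information $Y$ available only at the decoder (Wyner-Ziv),

        R_{WZ}(D) = \min I(X;U \mid Y),

    where the minimum is over auxiliary variables $U$ with $U - X - Y$ a Markov chain and reconstruction functions $\hat{x}(U,Y)$ meeting the distortion constraint. These are the expressions whose achievability the likelihood-encoder argument recovers.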

    An upper bound on relaying over capacity based on channel simulation

    The upper bound on the capacity of a 3-node discrete memoryless relay channel is considered, where a source X wants to send information to a destination Y with the help of a relay Z. Y and Z are conditionally independent given X, and the link from Z to Y is lossless with rate $R_0$. A new inequality is introduced to upper-bound the capacity when the encoding rate is beyond the capacities of both individual links X-Y and X-Z. It is based on a generalization of the blowing-up lemma, which links conditional entropy to decoding error, and of channel simulation to the case with side information. The resulting upper bound is strictly better than the well-known cut-set bound in several cases where the latter is $C_{XY}+R_0$, with $C_{XY}$ being the channel capacity between X and Y. One particular case is when the channel is statistically degraded, i.e., either Y is a statistically degraded version of Z with respect to X, or Z is a statistically degraded version of Y with respect to X. Moreover, in this case the bound is shown to be explicitly computable. The binary erasure channel is analyzed in detail and evaluated numerically. Comment: Submitted to IEEE Transactions on Information Theory, 21 pages, 6 figures.
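
    For context, the cut-set bound referred to above has the following standard form for this primitive relay setup (this is the classical benchmark, not the paper's new bound):

        C \le \max_{p(x)} \min\{\, I(X;Y,Z),\ I(X;Y) + R_0 \,\},

    and the regime discussed is the one in which this evaluates to $C_{XY}+R_0$. As a rough illustration, assuming the links to Y and Z are independent binary erasure channels with erasure probabilities $e_1$ and $e_2$ (an assumption for this sketch, not necessarily the paper's exact model), a uniform input maximizes both terms and the cut-set bound becomes $\min\{\,1 - e_1 e_2,\ (1 - e_1) + R_0\,\}$.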

    Secure Cascade Channel Synthesis

    We consider the problem of generating correlated random variables in a distributed fashion, where communication is constrained to a cascade network. The first node in the cascade observes an i.i.d. sequence $X^n$ locally before initiating communication along the cascade. All nodes share bits of common randomness that are independent of $X^n$. We consider secure synthesis: random variables produced by the system appear to be appropriately correlated and i.i.d. even to an eavesdropper who is cognizant of the communication transmissions. We characterize the optimal tradeoff between the amount of common randomness used and the required rates of communication. We find that not only does common randomness help, but its usage exceeds the communication rate requirements. The most efficient scheme is based on a superposition codebook, with the first node selecting messages for all downstream nodes. We also provide a fleeting view of related problems, demonstrating how the optimal rate region may shrink or expand. Comment: Submitted to IEEE Transactions on Information Theory.
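
    For orientation only: in the single-hop, non-secure version of this problem (distributed channel synthesis), the two extremes of the common-randomness tradeoff are classical and illustrate why common randomness matters. With unlimited common randomness the required communication rate is the mutual information $I(X;Y)$, while with no common randomness it is the Wyner common information

        C(X;Y) = \min_{U:\ X - U - Y} I(X,Y;U) \ \ge\ I(X;Y).

    The cascade structure and the secrecy requirement studied here lead to a different region; the expressions above are only the single-hop benchmarks.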