
    Lossy Compression with Near-uniform Encoder Outputs

    It is well known that lossless compression of a discrete memoryless source with near-uniform encoder output is possible at a rate above its entropy if and only if the encoder is randomized. This work derives conditions for near-uniform encoder outputs in the Wyner-Ziv and the distributed lossy compression problems. We show that in the Wyner-Ziv problem, near-uniform encoder output and operation close to the WZ rate limit are simultaneously possible, whereas in the distributed lossy compression problem, jointly near-uniform outputs are achievable in the interior of the distributed lossy compression rate region if the sources share non-trivial Gács-Körner common information.
    Comment: Submitted to the 2016 IEEE International Symposium on Information Theory (11 pages, 3 figures)
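    The "if and only if the encoder is randomized" fact quoted above can be illustrated numerically. The sketch below is my own toy example, not a construction from the paper: a deterministic rate-1 encoder of a Bernoulli(0.1) source simply inherits the source's bias, while XOR-ing its output with an independent uniform key makes the output exactly uniform.

    ```python
    import random

    # Toy illustration (not the paper's construction): compare the output
    # bit statistics of a deterministic encoder and a randomized one.
    rng = random.Random(0)
    n, trials = 64, 2000

    def biased_block(p=0.1):
        """One block of an i.i.d. Bernoulli(p) source."""
        return [1 if rng.random() < p else 0 for _ in range(n)]

    det_ones = rand_ones = 0
    for _ in range(trials):
        x = biased_block()
        det = x                                      # deterministic encoder: identity map, rate 1 > H(X)
        key = [rng.randrange(2) for _ in range(n)]   # uniform key, independent of x
        ran = [a ^ b for a, b in zip(det, key)]      # randomized encoder output
        det_ones += sum(det)
        rand_ones += sum(ran)

    print(det_ones / (n * trials))   # ~0.1: deterministic output is far from uniform
    print(rand_ones / (n * trials))  # ~0.5: randomized output is near-uniform
    ```

    The randomized encoder stays losslessly invertible for anyone who holds the key, which is why randomization, rather than a cleverer deterministic map, is what the quoted result requires.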

    Distributed Channel Synthesis

    Two familiar notions of correlation are rediscovered as the extreme operating points for distributed synthesis of a discrete memoryless channel, in which a stochastic channel output is generated based on a compressed description of the channel input. Wyner's common information is the minimum description rate needed. However, when common randomness independent of the input is available, the necessary description rate reduces to Shannon's mutual information. This work characterizes the optimal trade-off between the amount of common randomness used and the required rate of description. We also include a number of related derivations, including the effect of limited local randomness, rate requirements for secrecy, applications to game theory, and new insights into common information duality. Our proof makes use of a soft covering lemma, known in the literature for its role in quantifying the resolvability of a channel. The direct proof (achievability) constructs a feasible joint distribution over all parts of the system using a soft covering, from which the behavior of the encoder and decoder is inferred, with no explicit reference to joint typicality or binning. Of auxiliary interest, this work also generalizes and strengthens this soft covering tool.
    Comment: To appear in IEEE Trans. on Information Theory (submitted Aug. 2012, accepted July 2013), 26 pages, using IEEEtran.cls
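    The two extreme operating points can be made concrete for a doubly symmetric binary source (X uniform, Y equal to X flipped with probability a0). With unlimited common randomness the needed description rate is the mutual information I(X;Y) = 1 - h(a0); with no common randomness it is Wyner's common information, which for this source has the known closed form C(X;Y) = 1 + h(a0) - 2 h(a1) with a1 = (1 - sqrt(1 - 2 a0)) / 2, from Wyner's 1975 paper. A minimal numerical sketch, with the example value a0 = 0.1 chosen by me:

    ```python
    from math import log2, sqrt

    def h(p):
        """Binary entropy in bits."""
        return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

    a0 = 0.1                              # P(X != Y) for the doubly symmetric binary source
    i_xy = 1 - h(a0)                      # rate with unlimited common randomness (mutual information)
    a1 = (1 - sqrt(1 - 2 * a0)) / 2
    c_xy = 1 + h(a0) - 2 * h(a1)          # Wyner common information (closed form for this source)

    print(round(i_xy, 3), round(c_xy, 3))
    ```

    The gap c_xy - i_xy is the extra description rate paid when no common randomness is available; the paper's trade-off curve interpolates between these two extremes.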

    Coding Schemes for Achieving Strong Secrecy at Negligible Cost

    We study the problem of achieving strong secrecy over wiretap channels at negligible cost, in the sense of maintaining the overall communication rate of the same channel without secrecy constraints. Specifically, we propose and analyze two source-channel coding architectures, in which secrecy is achieved by multiplexing public and confidential messages. In both cases, our main contribution is to show that secrecy can be achieved without compromising communication rate and by requiring only randomness of asymptotically vanishing rate. Our first source-channel coding architecture relies on a modified wiretap channel code, in which randomization is performed using the output of a source code. In contrast, our second architecture relies on a standard wiretap code combined with a modified source code termed a uniform compression code, in which a small shared secret seed is used to enhance the uniformity of the source code output. We carry out a detailed analysis of uniform compression codes and characterize the optimal size of the shared seed.
    Comment: 15 pages, two-column, 5 figures, accepted to IEEE Transactions on Information Theory
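    The seeded interface described above can be sketched in code. This is my own toy analogue, not the paper's construction: here a short shared seed is simply expanded by a PRNG (a computational stand-in for the paper's information-theoretic guarantees, and the function names are mine) into a keystream that is XOR-ed onto the compressed description, so the transmitted bits are whitened while the decoder, holding the same seed, still recovers the source exactly.

    ```python
    import random
    import zlib

    # Toy sketch of a seeded "uniform compression" interface: compress,
    # then XOR with a seed-derived keystream; the same seed undoes it.
    # PRNG whitening is an illustration only, not the paper's scheme.

    def keystream(seed, n):
        rng = random.Random(seed)
        return bytes(rng.randrange(256) for _ in range(n))

    def encode(data, seed):
        comp = zlib.compress(data)
        ks = keystream(seed, len(comp))
        return bytes(a ^ b for a, b in zip(comp, ks))

    def decode(desc, seed):
        ks = keystream(seed, len(desc))
        return zlib.decompress(bytes(a ^ b for a, b in zip(desc, ks)))

    msg = b"a highly redundant message " * 20
    desc = encode(msg, seed=42)
    assert decode(desc, seed=42) == msg   # lossless recovery with the shared seed
    print(len(desc), len(msg))            # the description is shorter than the source
    ```

    The point of the paper's analysis is how small the shared seed can be while still making the description near-uniform in a strong, information-theoretic sense; the sketch only shows the encoder/decoder interface such a code exposes.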