
    Near-capacity dirty-paper code design: a source-channel coding approach

    This paper examines near-capacity dirty-paper code designs based on source-channel coding. We first point out that the performance loss in signal-to-noise ratio (SNR) in our code designs can be broken into the sum of the packing loss from channel coding and a modulo loss, which is a function of the granular loss from source coding and the target dirty-paper coding rate (or SNR). We then examine practical designs by combining trellis-coded quantization (TCQ) with both systematic and nonsystematic irregular repeat-accumulate (IRA) codes. Like previous approaches, we exploit the extrinsic information transfer (EXIT) chart technique for capacity-approaching IRA code design; but unlike previous approaches, we emphasize the role of strong source coding to achieve as much granular gain as possible using TCQ. Instead of systematic doping, we employ two relatively shifted TCQ codebooks, where the shift is optimized (via tuning the EXIT charts) to facilitate the IRA code design. Our designs synergistically combine TCQ with IRA codes so that they work together as well as they do individually. By bringing together TCQ (the best quantizer from the source coding community) and EXIT chart-based IRA code designs (the best from the channel coding community), we are able to approach the theoretical limit of dirty-paper coding. For example, at 0.25 bit per symbol (b/s), our best code design (with 2048-state TCQ) performs only 0.630 dB away from the Shannon capacity.
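
    As a quick numeric companion to the quoted 0.630 dB gap, the sketch below (not part of the paper) converts the target rate into the minimum SNR of an ideal dirty-paper code, using Costa's result that dirty-paper capacity equals the interference-free AWGN capacity C = 0.5*log2(1 + SNR); the function name and printed values are illustrative.

        import math

        def min_snr_db(rate_bits_per_symbol):
            # Smallest SNR (in dB) at which an ideal dirty-paper code of the
            # given rate can operate, from C = 0.5 * log2(1 + SNR).
            snr_linear = 2.0 ** (2.0 * rate_bits_per_symbol) - 1.0
            return 10.0 * math.log10(snr_linear)

        shannon_limit_db = min_snr_db(0.25)   # about -3.83 dB
        reported_gap_db = 0.630               # gap quoted in the abstract
        print(shannon_limit_db, shannon_limit_db + reported_gap_db)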

    Secret Key Agreement from Correlated Gaussian Sources by Rate Limited Public Communication

    We investigate secret key agreement from correlated Gaussian sources in which the legitimate parties can use public communication with limited rate. For the class of protocols with one-way public communication, we show a closed-form expression for the optimal trade-off between the rate of key generation and the rate of the public communication. Our results clarify an essential difference between key agreement from discrete sources and that from continuous sources. Comment: 9 pages, no figure. Version 2 is the published version; the results are unchanged from version 1 and the explanations are polished.
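
    For orientation, here is a minimal sketch (not the paper's rate-limited closed form) of the standard unconstrained reference point: with unlimited one-way public communication and an eavesdropper who sees only the public messages, the achievable key rate for jointly Gaussian sources with correlation coefficient rho is I(X;Y) = -0.5*log2(1 - rho^2). The rate-limited trade-off studied in the paper lies below this value; the sample rho values are illustrative.

        import math

        def gaussian_key_rate_upper(rho):
            # I(X;Y) in bits for jointly Gaussian X, Y with correlation rho;
            # the key rate when the public-communication rate is unconstrained.
            return -0.5 * math.log2(1.0 - rho * rho)

        for rho in (0.5, 0.9, 0.99):
            print(rho, gaussian_key_rate_upper(rho))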

    Lossy Source Transmission over the Relay Channel

    Lossy transmission over a relay channel in which the relay has access to correlated side information is considered. First, a joint source-channel decode-and-forward scheme is proposed for general discrete memoryless sources and channels. Then the Gaussian relay channel where the source and the side information are jointly Gaussian is analyzed. For this Gaussian model, several new source-channel cooperation schemes are introduced and analyzed in terms of the squared-error distortion at the destination. A comparison of the proposed upper bounds with the cut-set lower bound is given, and it is seen that joint source-channel cooperation improves the reconstruction quality significantly. Moreover, the performance of the joint code is close to the lower bound on distortion for a wide range of source and channel parameters. Comment: Proceedings of the 2008 IEEE International Symposium on Information Theory, Toronto, ON, Canada, July 6-11, 2008.
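
    As context for how such distortion bounds are compared, the sketch below (illustrative, not the paper's bounds) shows the standard Gaussian rate-distortion mapping used to turn a capacity bound, e.g. one obtained from a cut-set argument, into a squared-error distortion bound; the capacity value and bandwidth ratio are assumed numbers.

        def gaussian_distortion(capacity_bits, source_var=1.0, bw_ratio=1.0):
            # Squared-error distortion for a Gaussian source of variance
            # source_var sent over a channel of the given capacity (bits per
            # channel use), with bw_ratio channel uses per source sample,
            # via R(D) = 0.5 * log2(source_var / D).
            return source_var * 2.0 ** (-2.0 * bw_ratio * capacity_bits)

        # Example: if a cut-set argument limits the end-to-end rate to
        # 1.5 bits per source sample, no scheme can do better than this:
        print(gaussian_distortion(1.5))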

    Capacity of a Class of Deterministic Relay Channels

    The capacity of a class of deterministic relay channels with the transmitter input X, the receiver output Y, the relay output Y_1 = f(X, Y), and a separate communication link from the relay to the receiver with capacity R_0, is shown to be C(R_0) = \max_{p(x)} \min \{ I(X;Y) + R_0, I(X;Y, Y_1) \}. Thus every bit from the relay is worth exactly one bit to the receiver. Two alternative coding schemes are presented that achieve this capacity. The first scheme, "hash-and-forward", is based on a simple yet novel use of random binning on the space of relay outputs, while the second scheme uses the usual "compress-and-forward". In fact, these two schemes can be combined to give a class of optimal coding schemes. As a corollary, this relay capacity result confirms a conjecture by Ahlswede and Han on the capacity of a channel with rate-limited state information at the decoder in the special case when the channel state is recoverable from the channel input and the output. Comment: 17 pages, submitted to IEEE Transactions on Information Theory.
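
    The capacity expression can be checked numerically on a small example. The sketch below (not from the paper) evaluates C(R_0) = max_{p(x)} min{I(X;Y) + R_0, I(X;Y,Y_1)} for a modulo-additive case consistent with the corollary: Y = X xor N over a BSC(p) and relay output Y_1 = X xor Y = N, brute-forcing over Bernoulli(q) inputs; the crossover probability and grid size are assumptions for illustration.

        import math

        def h2(p):
            # Binary entropy in bits.
            if p <= 0.0 or p >= 1.0:
                return 0.0
            return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

        def capacity(R0, p=0.1, grid=2001):
            # C(R0) = max over q of min{ I(X;Y) + R0, I(X;Y,Y_1) } with
            # X ~ Bern(q), Y = X xor N, N ~ Bern(p), and Y_1 = X xor Y = N.
            # Here I(X;Y) = h2(q*(1-p) + (1-q)*p) - h2(p), and
            # I(X;Y,Y_1) = h2(q) because (Y, Y_1) determine X.
            best = 0.0
            for i in range(grid):
                q = i / (grid - 1)
                i_xy = h2(q * (1.0 - p) + (1.0 - q) * p) - h2(p)
                best = max(best, min(i_xy + R0, h2(q)))
            return best

        # R0 = 0 gives 1 - h2(0.1); each relay bit is worth one bit until C = 1.
        print(capacity(0.0), capacity(0.2), capacity(1.0))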