
    Joint Source-Channel Coding with Time-Varying Channel and Side-Information

    Transmission of a Gaussian source over a time-varying Gaussian channel is studied in the presence of time-varying correlated side information at the receiver. A block fading model is considered for both the channel and the side information, whose states are assumed to be known only at the receiver. The optimality of separate source and channel coding in terms of average end-to-end distortion is shown when the channel is static while the side information state follows a discrete or a continuous and quasiconcave distribution. When both the channel and side information states are time-varying, separate source and channel coding is suboptimal in general. A partially informed encoder lower bound is studied by providing the channel state information to the encoder. Several achievable transmission schemes are proposed based on uncoded transmission, separate source and channel coding, joint decoding, as well as hybrid digital-analog transmission. Uncoded transmission is shown to be optimal for a class of continuous and quasiconcave side information state distributions, while the channel gain may have an arbitrary distribution. To the best of our knowledge, this is the first example in which uncoded transmission achieves the optimal performance thanks to the time-varying nature of the states, while it is suboptimal in the static version of the same problem. Then, the optimal \emph{distortion exponent}, which quantifies the exponential decay rate of the expected distortion in the high SNR regime, is characterized for Nakagami-distributed channel and side information states, and it is shown to be achieved by hybrid digital-analog and joint decoding schemes in certain cases, illustrating the suboptimality of pure digital or analog transmission in general. Comment: Submitted to IEEE Transactions on Information Theory.
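    For reference, the distortion exponent mentioned above is conventionally defined as the high-SNR decay rate of the expected end-to-end distortion; a standard formulation (the notation here is assumed, not taken from the paper) is

    \[
        \Delta \;\triangleq\; -\lim_{\mathrm{SNR}\to\infty}\frac{\log \mathbb{E}\,[D(\mathrm{SNR})]}{\log \mathrm{SNR}},
    \]

    so that the expected distortion behaves roughly as $\mathrm{SNR}^{-\Delta}$ at high SNR, and a larger exponent means a faster decay.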

    Integer-Forcing Source Coding

    Integer-Forcing (IF) is a new framework, based on compute-and-forward, for decoding multiple integer linear combinations from the output of a Gaussian multiple-input multiple-output channel. This work applies the IF approach to arrive at a new low-complexity scheme, IF source coding, for distributed lossy compression of correlated Gaussian sources under a minimum mean squared error distortion measure. All encoders use the same nested lattice codebook. Each encoder quantizes its observation using the fine lattice as a quantizer and reduces the result modulo the coarse lattice, which plays the role of binning. Rather than directly recovering the individual quantized signals, the decoder first recovers a full-rank set of judiciously chosen integer linear combinations of the quantized signals, and then inverts it. In general, the linear combinations have smaller average powers than the original signals. This makes it possible to increase the density of the coarse lattice, which in turn translates to smaller compression rates. We also propose and analyze a one-shot version of IF source coding, which is simple enough to potentially lead to a new design principle for analog-to-digital converters that can exploit spatial correlations between the sampled signals. Comment: Submitted to IEEE Transactions on Information Theory.
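    As a rough illustration of the encoding and decoding steps described above, the following minimal one-dimensional sketch uses scalar nested lattices, no dithering, and hand-picked parameters; all names and values are assumptions for illustration, not the paper's construction. It shows the quantize-then-modulo binning at the encoders and the recover-integer-combinations-then-invert step at the decoder.

    import numpy as np

    # One-dimensional sketch of the integer-forcing source-coding idea:
    # fine lattice delta*Z for quantization, coarse lattice Delta*Z for binning.
    delta, Delta = 0.05, 8.0              # Delta is an integer multiple of delta

    def quantize(x):                      # nearest point of the fine lattice delta*Z
        return delta * np.round(x / delta)

    def mod_coarse(x):                    # reduce to the cell [-Delta/2, Delta/2)
        return x - Delta * np.round(x / Delta)

    rng = np.random.default_rng(0)
    s = rng.normal(size=1000)             # common component -> highly correlated sources
    x = np.stack([s + 0.05 * rng.normal(size=1000),
                  s + 0.05 * rng.normal(size=1000)])

    q = quantize(x)                       # each encoder quantizes its own observation
    v = mod_coarse(q)                     # ...and bins it by reducing modulo the coarse lattice

    # Decoder: pick a full-rank integer matrix whose combinations have small power
    # (here the difference of the two correlated signals, plus one signal itself).
    A = np.array([[1, -1],
                  [0,  1]])
    # Modulo reduction commutes with integer combinations; with these parameters each
    # true combination A @ q lies inside the coarse cell, so the wrap is removed exactly.
    combos = mod_coarse(A @ v)
    q_hat = np.linalg.inv(A) @ combos     # invert the integer matrix to recover q

    print("max reconstruction error:", np.max(np.abs(q_hat - q)))   # numerically zero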

    Joint Wyner-Ziv/Dirty Paper coding by modulo-lattice modulation

    The combination of source coding with decoder side-information (Wyner-Ziv problem) and channel coding with encoder side-information (Gel'fand-Pinsker problem) can be optimally solved using the separation principle. In this work we show an alternative scheme for the quadratic-Gaussian case, which merges source and channel coding. This scheme achieves the optimal performance by applying modulo-lattice modulation to the analog source. Thus it avoids the complexity of quantization and channel decoding, leaving only the task of "shaping". Furthermore, for high signal-to-noise ratio (SNR), the scheme approaches the optimal performance using an SNR-independent encoder, and is thus robust to unknown SNR at the encoder. Comment: Submitted to IEEE Transactions on Information Theory. Presented in part in ISIT-2006, Seattle. New version after review.
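    To make the modulo-lattice idea concrete, here is a stripped-down scalar sketch of analog modulo-lattice modulation for the Wyner-Ziv side of the problem only: no known interference, no MMSE scaling at the decoder, and a dithered scalar lattice. All parameters and variable names are illustrative assumptions rather than the paper's construction.

    import numpy as np

    # Scalar sketch of analog modulo-lattice modulation (Wyner-Ziv side only).
    # The encoder scales the analog source, adds a dither, and folds the result
    # into one lattice cell; the decoder subtracts a prediction built from its
    # side information and unfolds, so only the small prediction error (plus
    # channel noise) has to fit inside the cell.
    rng = np.random.default_rng(1)
    n = 10_000
    Delta = 2.0                            # scalar "lattice" Delta*Z
    beta = 4.0                             # source scaling (larger beta -> less effective noise)

    def mod_cell(x):                       # fold into the cell [-Delta/2, Delta/2)
        return x - Delta * np.round(x / Delta)

    s = rng.normal(size=n)                 # analog source
    j = s + 0.05 * rng.normal(size=n)      # decoder side information (good estimate of s)
    u = rng.uniform(-Delta / 2, Delta / 2, size=n)   # shared dither

    x = mod_cell(beta * s + u)             # transmitted signal, power Delta**2 / 12 (dither makes it uniform)
    y = x + 0.02 * rng.normal(size=n)      # AWGN channel

    # Decoder: remove the dither and the side-information prediction, then unfold.
    r = mod_cell(y - u - beta * j)         # = beta*(s - j) + channel noise, if it fits in the cell
    s_hat = j + r / beta                   # combine with the side information

    print("side-info-only MSE:", np.mean((j - s) ** 2))
    print("scheme output MSE: ", np.mean((s_hat - s) ** 2))

    The point of the sketch is that only the prediction error beta*(s - j) has to stay inside one lattice cell, so the source can be scaled up aggressively, which shrinks the effective channel noise after rescaling at the decoder.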

    Piggybacking Codes for Network Coding: The High/Low SNR Regime

    We propose a piggybacking scheme for network coding in which strong source inputs piggyback the weaker ones. The scheme is necessary and sufficient to achieve the cut-set upper bound in the high/low-SNR regime, a new asymptotically optimal operational regime for multihop Amplify-and-Forward (AF) networks.