
    Nested turbo codes for the Costa problem

    Driven by applications in data hiding, MIMO broadcast channel coding, precoding for interference cancellation, and transmitter cooperation in wireless networks, Costa coding has lately become a very active research area. In this paper, we first offer code design guidelines in terms of source-channel coding for algebraic binning. We then address practical code design based on nested lattice codes and propose nested turbo codes using turbo-like trellis-coded quantization (TCQ) for source coding and turbo trellis-coded modulation (TTCM) for channel coding. Compared to TCQ, turbo-like TCQ offers structural similarity between the source and channel coding components, leading to more efficient nesting with TTCM and better source coding performance. Due to the difference in effective dimensionality between turbo-like TCQ and TTCM, there is a performance tradeoff between these two components when they are nested together: the performance of turbo-like TCQ worsens as the TTCM code becomes stronger, and vice versa. Optimizing this performance tradeoff leads to our code design, which outperforms existing TCQ/TCM and TCQ/TTCM constructions and exhibits gaps of 0.94, 1.42, and 2.65 dB to the Costa capacity at 2.0, 1.0, and 0.5 bits/sample, respectively.
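    The abstract builds on the standard nested-lattice (algebraic binning) view of Costa's dirty-paper problem. The toy Python sketch below illustrates that view with one-dimensional scalar lattices instead of the paper's TCQ/TTCM components: a fine lattice carries the message cosets, a coarse lattice does the binning, the encoder pre-subtracts the known interference and reduces modulo the coarse lattice, and the decoder adds back the dither and reduces modulo the coarse lattice again. All parameter values (q, m, the noise and interference levels, and the simplification of using scaling alpha = 1) are illustrative choices, not values from the paper.

        import numpy as np

        rng = np.random.default_rng(0)

        q = 8.0           # coarse lattice spacing, Lambda_c = q * Z (binning)
        m = 4             # cosets of Lambda_c in the fine lattice -> log2(m) bits/sample
        qf = q / m        # fine lattice spacing, Lambda_f = qf * Z

        def mod_coarse(x):
            # reduce into the fundamental cell of the coarse lattice
            return x - q * np.round(x / q)

        n = 10000
        msg = rng.integers(0, m, size=n)            # information symbols
        v = msg * qf                                # coset representatives in Lambda_f

        s = rng.normal(0.0, 5.0, size=n)            # interference known only to the encoder
        d = rng.uniform(-q / 2, q / 2, size=n)      # dither shared with the decoder

        x = mod_coarse(v - s - d)                   # transmitted signal (alpha = 1 for simplicity)
        y = x + s + rng.normal(0.0, 0.3, size=n)    # channel adds the interference plus noise

        r = mod_coarse(y + d)                       # decoder: add dither, reduce mod Lambda_c
        msg_hat = np.mod(np.round(r / qf), m).astype(int)

        print("symbol error rate:", np.mean(msg_hat != msg))

    With scalar component lattices this scheme is far from the Costa capacity; the paper's contribution lies in replacing these components with nested turbo-like TCQ and TTCM codes to shrink that gap.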

    Nested Lattice Codes for Gaussian Relay Networks with Interference

    In this paper, a class of relay networks is considered. We assume that, at each node, the outgoing channels to its neighbors are orthogonal, while incoming signals from neighbors can interfere with each other. We are interested in the multicast capacity of these networks. As a subclass, we first focus on Gaussian relay networks with interference and find an achievable rate using a lattice coding scheme. It is shown that there is a constant gap between our achievable rate and the information-theoretic cut-set bound. This is similar to the recent result by Avestimehr, Diggavi, and Tse, who showed such an approximate characterization of the capacity of general Gaussian relay networks. However, our achievability uses a structured code instead of a random one. Using the same idea as in the Gaussian case, we also consider linear finite-field symmetric networks with interference and characterize the capacity using a linear coding scheme. Comment: 23 pages, 5 figures, submitted to the IEEE Transactions on Information Theory.
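    A key property behind using structured (lattice) codes at relays with interfering inputs is that the sum of lattice codewords is itself a lattice point, so a relay can decode a sum directly from the superimposed received signal without decoding the individual messages. The sketch below isolates that decode-the-sum step with a one-dimensional integer lattice and arbitrary toy parameters; it is not the paper's construction, only an illustration of the structural property it relies on.

        import numpy as np

        rng = np.random.default_rng(1)

        step = 1.0                                   # 1-D lattice Lambda = step * Z
        n = 100000

        c1 = step * rng.integers(-8, 9, size=n)      # lattice "codeword" of transmitter 1
        c2 = step * rng.integers(-8, 9, size=n)      # lattice "codeword" of transmitter 2

        y = c1 + c2 + rng.normal(0.0, 0.2, size=n)   # incoming signals interfere at the relay

        sum_hat = step * np.round(y / step)          # decode the sum to the nearest lattice point
        print("error rate for decoding the sum:", np.mean(sum_hat != c1 + c2))

    The finite-field networks treated in the paper replace real superposition and Gaussian noise with linear superposition over a finite field, where the analogous decode-a-linear-combination step is exact.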

    Secure Compute-and-Forward in a Bidirectional Relay

    We consider the basic bidirectional relaying problem, in which two users in a wireless network wish to exchange messages through an intermediate relay node. In the compute-and-forward strategy, the relay computes a function of the two messages using the naturally occurring sum of symbols simultaneously transmitted by the user nodes in a Gaussian multiple-access (MAC) channel, and the computed function value is forwarded to the user nodes in an ensuing broadcast phase. In this paper, we study the problem under an additional security constraint, which requires that each user's message be kept secure from the relay. We consider two types of security constraints: perfect secrecy, in which the MAC channel output seen by the relay is independent of each user's message; and strong secrecy, which is a form of asymptotic independence. We propose a coding scheme based on nested lattices, the main feature of which is that, given a pair of nested lattices satisfying certain "goodness" properties, we can explicitly specify probability distributions for randomization at the encoders that achieve the desired security criteria. In particular, our coding scheme guarantees perfect or strong secrecy even in the absence of channel noise. The noise in the channel only affects the reliability of computation at the relay, and for Gaussian noise we derive achievable rates for reliable and secure computation. We also present an application of our methods to the multi-hop line network, in which a source needs to transmit messages to a destination through a series of intermediate relays. Comment: v1 is a much expanded and updated version of arXiv:1204.6350; v2 is a minor revision to fix some notational issues; v3 is a much expanded and updated version of v2 and contains results on both perfect secrecy and strong secrecy; v3 is a revised manuscript submitted to the IEEE Transactions on Information Theory in April 201
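    The perfect-secrecy mechanism described above can be previewed with a much simpler modular-arithmetic analogue: if each user adds its own uniform random key (playing the role of the randomized coset selection over the nested lattices) before transmitting, then the sum seen by the relay is uniformly distributed no matter what the messages are, so the noiseless observation is statistically independent of each message. The sketch below uses integers mod q and fixed messages purely for illustration; the paper's construction uses dithered nested lattice codes over a Gaussian MAC, and the values q, m1, and m2 are invented here.

        import numpy as np

        rng = np.random.default_rng(2)

        q = 16                                   # group size, standing in for the coset structure
        trials = 200000

        m1, m2 = 5, 11                           # fix both messages to constants
        k1 = rng.integers(0, q, size=trials)     # user 1's private uniform randomization
        k2 = rng.integers(0, q, size=trials)     # user 2's private uniform randomization

        x1 = (m1 + k1) % q                       # randomized encoder of user 1
        x2 = (m2 + k2) % q                       # randomized encoder of user 2

        y_relay = (x1 + x2) % q                  # noiseless MAC abstraction: relay sees only the sum

        # empirically uniform regardless of (m1, m2): the relay learns nothing about either message
        freqs = np.bincount(y_relay, minlength=q) / trials
        print(np.round(freqs, 3))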

    Integer-Forcing Source Coding

    Integer-Forcing (IF) is a new framework, based on compute-and-forward, for decoding multiple integer linear combinations from the output of a Gaussian multiple-input multiple-output channel. This work applies the IF approach to arrive at a new low-complexity scheme, IF source coding, for distributed lossy compression of correlated Gaussian sources under a minimum mean squared error distortion measure. All encoders use the same nested lattice codebook. Each encoder quantizes its observation using the fine lattice as a quantizer and reduces the result modulo the coarse lattice, which plays the role of binning. Rather than directly recovering the individual quantized signals, the decoder first recovers a full-rank set of judiciously chosen integer linear combinations of the quantized signals and then inverts it. In general, the linear combinations have smaller average powers than the original signals, which makes it possible to increase the density of the coarse lattice and, in turn, translates to smaller compression rates. We also propose and analyze a one-shot version of IF source coding that is simple enough to potentially lead to a new design principle for analog-to-digital converters that can exploit spatial correlations between the sampled signals. Comment: Submitted to the IEEE Transactions on Information Theory.
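    The encoder and decoder mechanics can be illustrated with a one-dimensional scalar analogue of the nested lattice codebook: each encoder quantizes with a fine step and reduces modulo a coarse step (the binning), and the decoder recovers an integer linear combination of the quantized signals (here simply their difference) exactly, because that combination has much smaller power than the signals themselves and therefore fits inside one coarse cell. The two-source setup and every parameter below are invented for illustration; the actual scheme works in higher dimensions and recovers a full-rank set of combinations before inverting it.

        import numpy as np

        rng = np.random.default_rng(3)

        qf = 0.25        # fine quantizer step (fine lattice)
        qc = 4.0         # coarse lattice spacing (binning)

        def quantize(x):
            return qf * np.round(x / qf)

        def mod_coarse(x):
            return x - qc * np.round(x / qc)

        # two highly correlated Gaussian sources
        n = 10000
        x1 = rng.normal(0.0, 10.0, size=n)
        x2 = x1 + rng.normal(0.0, 0.3, size=n)       # x2 is x1 plus a small innovation

        # each encoder quantizes with the fine lattice and bins with the coarse one
        u1 = mod_coarse(quantize(x1))
        u2 = mod_coarse(quantize(x2))

        # decoder step: recover the integer combination Q(x1) - Q(x2) from the binned values;
        # this works because the combination's magnitude stays within one coarse cell
        diff_hat = mod_coarse(u1 - u2)
        diff_true = quantize(x1) - quantize(x2)
        print("combination recovered exactly:", np.allclose(diff_hat, diff_true))

    Because the retained combinations have small power, the coarse lattice can be made denser than the individual source powers would allow, which is the source of the rate savings described in the abstract.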