
    Successive encoding of correlated sources

    Distributed Successive Approximation Coding using Broadcast Advantage: The Two-Encoder Case

    Traditional distributed source coding rarely considers possible links between the separate encoders. However, the broadcast nature of wireless communication in sensor networks provides a free gossip mechanism that can be used to simplify encoding/decoding and reduce transmission power. Using this broadcast advantage, we present a new two-encoder scheme which imitates a ping-pong game and has a successive approximation structure. For the quadratic Gaussian case, we prove that this scheme is successively refinable on the (sum-rate, distortion) surface, which is characterized by the rate-distortion region of distributed two-encoder source coding. A potential energy saving over conventional distributed coding is also illustrated. This ping-pong distributed coding idea can be extended to the multiple-encoder case and provides the theoretical foundation for a new class of distributed image coding methods in wireless scenarios. Comment: In Proceedings of the 48th Annual Allerton Conference on Communication, Control and Computing, University of Illinois, Monticello, IL, September 29 - October 1, 2010.
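
    A minimal numerical sketch of the ping-pong idea (an illustrative toy with uniform scalar quantizers and a heuristic correlation-based cross-update, not the paper's actual codebook construction): the two encoders take turns broadcasting quantized innovations with successively finer steps, and because each transmission is overheard by the other encoder as well as the decoder, all three track the same pair of estimates.

        import numpy as np

        rng = np.random.default_rng(0)
        rho = 0.9  # known correlation between the two Gaussian sources


        def quantize(v, step):
            # Uniform scalar quantizer; stands in for the paper's codebooks.
            return step * np.round(v / step)


        def ping_pong(x1, x2, rounds=8, step0=1.0, shrink=0.5):
            # est1/est2 are tracked identically by both encoders and the decoder,
            # since every transmission is overheard (the broadcast advantage).
            est1, est2, step = 0.0, 0.0, step0
            for r in range(rounds):
                if r % 2 == 0:                       # encoder 1's turn ("ping")
                    msg = quantize(x1 - est1, step)  # describe only the innovation
                    est1 += msg
                    est2 += rho * msg                # heuristic cross-refinement
                else:                                # encoder 2's turn ("pong")
                    msg = quantize(x2 - est2, step)
                    est2 += msg
                    est1 += rho * msg
                step *= shrink                       # successive approximation: finer steps each round
            return est1, est2


        cov = [[1.0, rho], [rho, 1.0]]
        x1, x2 = rng.multivariate_normal([0.0, 0.0], cov)
        e1, e2 = ping_pong(x1, x2)
        print(f"x1={x1:+.3f} est1={e1:+.3f}   x2={x2:+.3f} est2={e2:+.3f}")

    The shrinking step size is what gives the successive-approximation flavour; the rho-weighted cross-update is only meant to suggest how an overheard message also helps refine the estimate of the other, correlated source.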

    Source-Channel Coding for the Multiple-Access Relay Channel

    This work considers reliable transmission of general correlated sources over the multiple-access relay channel (MARC) and the multiple-access broadcast relay channel (MABRC). In MARCs only the destination is interested in a reconstruction of the sources, while in MABRCs both the relay and the destination want to reconstruct the sources. We assume that both the relay and the destination have correlated side information. We find sufficient conditions for reliable communication based on operational separation, as well as necessary conditions on the achievable source-channel rate. For correlated sources transmitted over fading Gaussian MARCs and MABRCs, we find conditions under which informational separation is optimal. Comment: Presented at ISWCS 2011, Aachen, Germany.
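
    For background, separation-based schemes of this kind pair a channel code with Slepian-Wolf source coding. The classical Slepian-Wolf conditions for losslessly encoding correlated sources (S_1, S_2) at rates (R_1, R_2) with side information W available at the decoder are the standard starting point (the paper's MARC-specific conditions additionally involve the channel and are not reproduced here):

        R_1 \geq H(S_1 \mid S_2, W), \qquad R_2 \geq H(S_2 \mid S_1, W), \qquad R_1 + R_2 \geq H(S_1, S_2 \mid W).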

    On Joint Source-Channel Coding for Correlated Sources Over Multiple-Access Relay Channels

    We study the transmission of correlated sources over discrete memoryless (DM) multiple-access relay channels (MARCs), in which both the relay and the destination have access to side information arbitrarily correlated with the sources. As the optimal transmission scheme is an open problem, in this work we propose a new joint source-channel coding scheme based on a novel combination of the correlation preserving mapping (CPM) technique with Slepian-Wolf (SW) source coding, and we obtain the corresponding sufficient conditions. The proposed coding scheme is based on the decode-and-forward strategy and uses CPM for encoding information simultaneously to the relay and the destination, whereas the cooperation information from the relay is encoded via SW source coding. It is shown that there are cases in which the new scheme strictly outperforms the schemes available in the literature. This is the first instance of a source-channel code that uses CPM for encoding information to two different nodes (relay and destination). In addition to the sufficient conditions, we present three different sets of single-letter necessary conditions for reliable transmission of correlated sources over DM MARCs. The newly derived conditions are shown to be at least as tight as the previously known necessary conditions. Comment: Accepted to TIT.

    Low-Complexity Approaches to Slepian–Wolf Near-Lossless Distributed Data Compression

    This paper discusses the Slepian–Wolf problem of distributed near-lossless compression of correlated sources. We introduce practical new tools for communicating at all rates in the achievable region. The technique employs a simple “source-splitting” strategy that does not require common sources of randomness at the encoders and decoders. This approach allows for pipelined encoding and decoding so that the system operates with the complexity of a single-user encoder and decoder. Moreover, when this splitting approach is used in conjunction with iterative decoding methods, it produces a significant simplification of the decoding process. We demonstrate this approach on synthetically generated data. Finally, we consider the Slepian–Wolf problem when linear codes are used as syndrome-formers and consider a linear programming relaxation of maximum-likelihood (ML) sequence decoding. We note that the fractional vertices of the relaxed polytope compete with the optimal solution in a manner analogous to that observed when the “min-sum” iterative decoding algorithm is applied. This relaxation exhibits the ML-certificate property: if an integral solution is found, it is the ML solution. For symmetric binary joint distributions, we show that selecting easily constructible “expander”-style low-density parity-check (LDPC) codes as syndrome-formers admits a positive error exponent and therefore provably good performance.
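
    As a concrete illustration of the syndrome-former idea (the asymmetric corner point of the Slepian–Wolf region; the paper's source-splitting and LP-relaxation machinery is not shown, and the small Hamming parity-check matrix below is a hand-picked stand-in for the expander-style LDPC codes), the encoder transmits only the syndrome of its word, and the decoder combines it with its side information by minimum-weight (ML) search:

        import itertools
        import numpy as np

        # Parity-check matrix of the (7,4) Hamming code over GF(2).
        H = np.array([[1, 0, 1, 0, 1, 0, 1],
                      [0, 1, 1, 0, 0, 1, 1],
                      [0, 0, 0, 1, 1, 1, 1]], dtype=np.uint8)
        n = H.shape[1]

        rng = np.random.default_rng(1)
        x = rng.integers(0, 2, n, dtype=np.uint8)   # source word at the encoder
        e = np.zeros(n, dtype=np.uint8)
        e[rng.integers(n)] = 1                      # correlation model: the side information y
        y = x ^ e                                   # differs from x in exactly one position

        s = H @ x % 2                               # encoder sends 3 syndrome bits instead of 7 source bits

        # Decoder: the unknown pattern e = x XOR y satisfies H e = s XOR H y,
        # so pick the minimum-weight pattern with that syndrome (brute-force ML).
        target = s ^ (H @ y % 2)
        candidates = (np.array(p, dtype=np.uint8) for p in itertools.product((0, 1), repeat=n))
        e_hat = min(candidates, key=lambda p: (int((H @ p % 2 != target).any()), int(p.sum())))
        x_hat = y ^ e_hat

        print("recovered source correctly:", bool((x_hat == x).all()))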

    Source-Channel Coding Theorems for the Multiple-Access Relay Channel

    We study reliable transmission of arbitrarily correlated sources over multiple-access relay channels (MARCs) and multiple-access broadcast relay channels (MABRCs). In MARCs only the destination is interested in reconstructing the sources, while in MABRCs both the relay and the destination want to reconstruct them. In addition to arbitrary correlation among the source signals at the users, both the relay and the destination have side information correlated with the source signals. Our objective is to determine whether a given pair of sources can be losslessly transmitted to the destination for a given number of channel symbols per source sample, defined as the source-channel rate. Sufficient conditions for reliable communication based on operational separation, as well as necessary conditions on the achievable source-channel rates, are characterized. Since operational separation is generally not optimal for MARCs and MABRCs, sufficient conditions for reliable communication using joint source-channel coding schemes based on a combination of the correlation preserving mapping technique with Slepian-Wolf source coding are also derived. For correlated sources transmitted over fading Gaussian MARCs and MABRCs, we present conditions under which separation (i.e., separate and stand-alone source and channel codes) is optimal. This is the first time the optimality of separation is proved for MARCs and MABRCs. Comment: Accepted to IEEE Transactions on Information Theory.

    Integer-Forcing Source Coding

    Integer-Forcing (IF) is a new framework, based on compute-and-forward, for decoding multiple integer linear combinations from the output of a Gaussian multiple-input multiple-output channel. This work applies the IF approach to arrive at a new low-complexity scheme, IF source coding, for distributed lossy compression of correlated Gaussian sources under a minimum mean squared error distortion measure. All encoders use the same nested lattice codebook. Each encoder quantizes its observation using the fine lattice as a quantizer and reduces the result modulo the coarse lattice, which plays the role of binning. Rather than directly recovering the individual quantized signals, the decoder first recovers a full-rank set of judiciously chosen integer linear combinations of the quantized signals, and then inverts it. In general, the linear combinations have smaller average powers than the original signals. This makes it possible to increase the density of the coarse lattice, which in turn translates into smaller compression rates. We also propose and analyze a one-shot version of IF source coding that is simple enough to potentially lead to a new design principle for analog-to-digital converters that can exploit spatial correlations between the sampled signals. Comment: Submitted to IEEE Transactions on Information Theory.
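
    A scalar caricature of the IF source coding mechanics (one-dimensional "lattices", no dithering, and an illustrative source model and integer matrix A chosen by hand rather than taken from the paper): each encoder quantizes with a fine step and reduces modulo a coarse interval, and the decoder recovers a full-rank set of integer combinations modulo that interval before inverting them.

        import numpy as np

        rng = np.random.default_rng(2)
        d = 0.1     # fine quantization step (the "fine lattice")
        M = 10.0    # coarse modulus (the "coarse lattice"); smaller than direct recovery of x1 would need


        def cmod(v, m):
            # Centered reduction modulo m, i.e., reduction modulo the coarse lattice.
            return v - m * np.round(v / m)


        # Correlated sources: x1 is close to 2*x2, so the combination x1 - 2*x2 has tiny power.
        x2 = rng.normal(0.0, 1.0, size=1000)
        x1 = 2.0 * x2 + rng.normal(0.0, 0.1, size=1000)
        x = np.stack([x1, x2])                  # shape (2, number of samples)

        # Encoders: quantize to the fine lattice, then reduce modulo the coarse one.
        v = d * np.round(x / d)
        w = cmod(v, M)                          # transmitted values, each within [-M/2, M/2)

        # Decoder: recover a full-rank set of integer combinations, then invert it.
        A = np.array([[1, -2],                  # v1 - 2*v2: small power
                      [0,  1]])                 # v2 itself: moderate power
        t = cmod(A @ w, M)                      # equals A @ v when no combination overloads
        v_hat = np.linalg.solve(A, t)           # invert the integer matrix

        print("max reconstruction error:", float(np.abs(v_hat - x).max()))  # about d/2

    Because both chosen combinations have smaller power than x1 itself, the coarse modulus M (and hence the per-encoder rate of log2(M/d) bits per sample in this toy) can be kept smaller than what direct recovery of the individual quantized signals would require, which is the rate saving the abstract describes.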