
    Multiple access channels with arbitrarily correlated sources


    Joint Source-Channel Coding for Broadcast Channel with Cooperating Receivers

    It is known that, as opposed to the point-to-point channel, separate source and channel coding is generally not optimal for sending correlated sources over multiuser channels. Joint source-channel coding has been investigated for certain multiuser channels, e.g., the multiple access channel (MAC) and the broadcast channel (BC). In this paper, we obtain a sufficient condition for transmitting arbitrarily correlated sources over a discrete memoryless BC with cooperating receivers, where the receivers are allowed to exchange messages via a pair of noisy cooperative links. Our result is a general form of previous ones and includes them as special cases.
    Comment: to appear in Proceedings of the IEEE Information Theory Workshop - Fall (ITW 2015)
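    For context, the earlier results this condition generalizes include the classical sufficient condition of Cover, El Gamal, and Salehi for sending correlated sources (U_1, U_2) over a two-user discrete memoryless MAC; in the special case of no common part it reads as follows (stated here as standard background, not as the paper's own condition):

        H(U_1 | U_2)  < I(X_1; Y | X_2, U_2),
        H(U_2 | U_1)  < I(X_2; Y | X_1, U_1),
        H(U_1, U_2)   < I(X_1, X_2; Y),

    for some input distribution of the form p(x_1 | u_1) p(x_2 | u_2).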

    Source-Channel Coding Theorems for the Multiple-Access Relay Channel

    We study reliable transmission of arbitrarily correlated sources over multiple-access relay channels (MARCs) and multiple-access broadcast relay channels (MABRCs). In MARCs only the destination is interested in reconstructing the sources, while in MABRCs both the relay and the destination want to reconstruct them. In addition to arbitrary correlation among the source signals at the users, both the relay and the destination have side information correlated with the source signals. Our objective is to determine whether a given pair of sources can be losslessly transmitted to the destination for a given number of channel symbols per source sample, defined as the source-channel rate. Sufficient conditions for reliable communication based on operational separation, as well as necessary conditions on the achievable source-channel rates, are characterized. Since operational separation is generally not optimal for MARCs and MABRCs, sufficient conditions for reliable communication using joint source-channel coding schemes based on a combination of the correlation preserving mapping technique with Slepian-Wolf source coding are also derived. For correlated sources transmitted over fading Gaussian MARCs and MABRCs, we present conditions under which separation (i.e., separate and stand-alone source and channel codes) is optimal. This is the first time optimality of separation is proved for MARCs and MABRCs.
    Comment: Accepted to the IEEE Transactions on Information Theory
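    As standard background on the Slepian-Wolf component of the separation-based schemes (textbook material, not a restatement of the paper's conditions): for two correlated sources U_1 and U_2 encoded separately and decoded jointly, lossless reconstruction is possible if and only if the coding rates satisfy

        R_1 >= H(U_1 | U_2),
        R_2 >= H(U_2 | U_1),
        R_1 + R_2 >= H(U_1, U_2).

    Roughly speaking, with a source-channel rate of kappa channel symbols per source sample, a separation-based scheme pairs such a source code with channel codes whose rates per source sample fit within kappa times the channel's achievable rate region.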

    Computation Over Gaussian Networks With Orthogonal Components

    Function computation of arbitrarily correlated discrete sources over Gaussian networks with orthogonal components is studied. Two classes of functions are considered: the arithmetic sum function and the type function. The arithmetic sum function in this paper is defined as a set of multiple weighted arithmetic sums, which includes averaging of the sources and estimating each of the sources as special cases. The type or frequency histogram function counts the number of occurrences of each argument, which yields many important statistics such as the mean, variance, maximum, minimum, and median. The proposed computation coding first abstracts Gaussian networks into the corresponding modulo-sum multiple-access channels via nested lattice codes and linear network coding, and then computes the desired function by using linear Slepian-Wolf source coding. For orthogonal Gaussian networks (with no broadcast and multiple-access components), the computation capacity is characterized for a class of networks. For Gaussian networks with multiple-access components (but no broadcast), an approximate computation capacity is characterized for a class of networks.
    Comment: 30 pages, 12 figures, submitted to the IEEE Transactions on Information Theory
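    To make the two function classes concrete, here is a minimal sketch (illustration only, not the paper's coding scheme; the array shapes and function names are assumptions) of the weighted arithmetic sum and type functions computed directly on the source samples:

        import numpy as np

        def weighted_arithmetic_sums(sources, weights):
            # sources: K x n array of K discrete sources with n samples each
            # weights: L x K matrix; row l holds the weights of the l-th sum
            # returns an L x n array of weighted arithmetic sums per sample
            return weights @ sources

        def type_function(sources, alphabet_size):
            # per sample, count how many of the K sources take each value
            K, n = sources.shape
            counts = np.zeros((alphabet_size, n), dtype=int)
            for a in range(alphabet_size):
                counts[a] = (sources == a).sum(axis=0)
            return counts

    Averaging corresponds to a single weight row with entries 1/K, recovering every source corresponds to the K x K identity weight matrix, and statistics such as the maximum or median of the K source values follow directly from the per-sample type.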

    On Joint Source-Channel Coding for Correlated Sources Over Multiple-Access Relay Channels

    We study the transmission of correlated sources over discrete memoryless (DM) multiple-access relay channels (MARCs), in which both the relay and the destination have access to side information arbitrarily correlated with the sources. As the optimal transmission scheme is an open problem, in this work we propose a new joint source-channel coding scheme based on a novel combination of the correlation preserving mapping (CPM) technique with Slepian-Wolf (SW) source coding, and obtain the corresponding sufficient conditions. The proposed coding scheme is based on the decode-and-forward strategy, and utilizes CPM for encoding information simultaneously to the relay and the destination, whereas the cooperation information from the relay is encoded via SW source coding. It is shown that there are cases in which the new scheme strictly outperforms the schemes available in the literature. This is the first instance of a source-channel code that uses CPM for encoding information to two different nodes (relay and destination). In addition to sufficient conditions, we present three different sets of single-letter necessary conditions for reliable transmission of correlated sources over DM MARCs. The newly derived conditions are shown to be at least as tight as the previously known necessary conditions.
    Comment: Accepted to TIT
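    The single-letter sufficient and necessary conditions in this line of work are stated in terms of (conditional) entropies and mutual informations of the sources; below is a minimal sketch (an illustration under assumed names, not code from the paper) of how such entropy terms would be evaluated numerically from a given joint pmf:

        import numpy as np

        def entropy(p):
            # Shannon entropy in bits of a pmf given as an array
            p = p[p > 0]
            return -np.sum(p * np.log2(p))

        def conditional_entropy(p_joint):
            # p_joint[u1, u2]: joint pmf of (U1, U2); returns H(U1 | U2)
            return entropy(p_joint.ravel()) - entropy(p_joint.sum(axis=0))

        # example: doubly symmetric binary sources with crossover probability 0.2
        p = np.array([[0.4, 0.1],
                      [0.1, 0.4]])
        print(conditional_entropy(p))  # approximately 0.72 bits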

    Source-Channel Coding for the Multiple-Access Relay Channel

    This work considers reliable transmission of general correlated sources over the multiple-access relay channel (MARC) and the multiple-access broadcast relay channel (MABRC). In MARCs only the destination is interested in a reconstruction of the sources, while in MABRCs both the relay and the destination want to reconstruct the sources. We assume that both the relay and the destination have correlated side information. We find sufficient conditions for reliable communication based on operational separation, as well as necessary conditions on the achievable source-channel rate. For correlated sources transmitted over fading Gaussian MARCs and MABRCs, we find conditions under which informational separation is optimal.
    Comment: Presented at ISWCS 2011, Aachen, Germany

    Network Information Flow with Correlated Sources

    In this paper, we consider a network communications problem in which multiple correlated sources must be delivered to a single data collector node, over a network of noisy independent point-to-point channels. We prove that perfect reconstruction of all the sources at the sink is possible if and only if, for all partitions of the network nodes into two subsets S and S^c such that the sink is always in S^c, we have that H(U_S|U_{S^c}) < \sum_{i\in S, j\in S^c} C_{ij}. Our main finding is that in this setup a general source/channel separation theorem holds, and that Shannon information behaves as a classical network flow, identical in nature to the flow of water in pipes. At first glance, it might seem surprising that separation holds in a fairly general network situation like the one we study. A closer look, however, reveals that the reason for this is that our model allows only for independent point-to-point channels between pairs of nodes, and not multiple-access and/or broadcast channels, for which separation is well known not to hold. This "information as flow" view provides an algorithmic interpretation for our results, among which perhaps the most important one is the optimality of implementing codes using a layered protocol stack.
    Comment: Final version, to appear in the IEEE Transactions on Information Theory; contains (very) minor changes based on the last round of reviews
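    The cut-set condition above can be checked directly for a small network; here is a minimal sketch (the names and the entropy callback are assumptions for illustration, not code from the paper) that enumerates every partition with the sink on the S^c side and tests H(U_S|U_{S^c}) < \sum_{i\in S, j\in S^c} C_{ij}:

        from itertools import combinations

        def cut_capacity(S, Sc, C):
            # C[(i, j)]: capacity of the independent point-to-point link i -> j
            return sum(C.get((i, j), 0.0) for i in S for j in Sc)

        def perfect_reconstruction_feasible(nodes, sink, C, cond_entropy):
            # cond_entropy(S, Sc) must return H(U_S | U_{S^c}) for the partition
            others = [v for v in nodes if v != sink]
            for r in range(1, len(others) + 1):
                for S in combinations(others, r):
                    Sc = tuple(v for v in nodes if v not in S)  # sink stays in S^c
                    if cond_entropy(S, Sc) >= cut_capacity(S, Sc, C):
                        return False  # this cut violates the strict inequality
            return True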