
    Correlated Sources In Distributed Networks - Data Transmission, Common Information Characterization and Inferencing

    Correlation is often present among observations in a distributed system. This thesis addresses several design issues that arise when correlated data are observed at distributed terminals: communicating correlated sources over interference channels, characterizing the common information among dependent random variables, and testing for the presence of dependence among observations. It is well known that separate source and channel coding is optimal for point-to-point communication; however, this is not the case for multi-terminal communication. We first study the problem of communicating correlated sources over interference channels (ICs), in both the lossless and the lossy case. For the lossless case, a sufficient condition is found using random source partition and correlation-preserving codeword generation. The sufficient condition reduces to the Han-Kobayashi achievable rate region for the IC with independent observations. Moreover, the proposed coding scheme is optimal for transmitting a special class of correlated sources over a class of deterministic interference channels. We then study the general case of lossy transmission of two correlated sources over a two-user discrete memoryless interference channel (DMIC); an achievable distortion region is obtained and Gaussian examples are studied. The second topic is the generalization of Wyner's definition of the common information of a pair of random variables to that of N random variables. Coding theorems are obtained to show that the same operational meanings for the common information of two random variables carry over to N random variables. We establish a monotone property of Wyner's common information that contrasts with other notions of common information, specifically Shannon's mutual information and Gács and Körner's common randomness.
    We further extend Wyner's common information to continuous random variables and provide an operational meaning using the Gray-Wyner network with lossy source coding: Wyner's common information equals the smallest common message rate when the total rate is arbitrarily close to the rate-distortion function with joint decoding. Finally, we consider the problem of distributed testing of statistical independence under communication constraints. Focusing on the Gaussian case because of its tractability, we characterize optimal scalar quantizers for the distributed test of independence, where optimality holds both in the finite-sample regime and in the asymptotic regime.
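    Wyner's common information C(X;Y) is not computed in closed form here, but it is always sandwiched between two quantities the abstract contrasts it with: I(X;Y) <= C(X;Y) <= min(H(X), H(Y)). A minimal sketch, computing those two bounds for a hypothetical doubly symmetric binary source (the function names and values are illustrative, not from the thesis):

```python
# Hedged sketch: bounds on Wyner's common information for a discrete pair (X, Y),
# using I(X;Y) <= C(X;Y) <= min(H(X), H(Y)).
from math import log2

def entropy(dist):
    """Shannon entropy (bits) of a probability vector."""
    return -sum(p * log2(p) for p in dist if p > 0)

def bounds_on_wyner(joint):
    """joint[x][y] = P(X=x, Y=y); returns (I(X;Y), min(H(X), H(Y)))."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    hx, hy = entropy(px), entropy(py)
    hxy = entropy([p for row in joint for p in row])
    return hx + hy - hxy, min(hx, hy)

# Doubly symmetric binary source with crossover probability 0.1 (toy values)
p = 0.1
joint = [[(1 - p) / 2, p / 2], [p / 2, (1 - p) / 2]]
lo, hi = bounds_on_wyner(joint)   # I(X;Y) and min(H(X), H(Y))
```

The gap between the two bounds is exactly what makes Wyner's common information a distinct quantity from mutual information.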

    Network Information Flow with Correlated Sources

    In this paper, we consider a network communications problem in which multiple correlated sources must be delivered to a single data collector node over a network of noisy, independent point-to-point channels. We prove that perfect reconstruction of all the sources at the sink is possible if and only if, for every partition of the network nodes into two subsets S and S^c such that the sink is in S^c, we have H(U_S|U_{S^c}) < \sum_{i\in S, j\in S^c} C_{ij}. Our main finding is that in this setup a general source/channel separation theorem holds, and that Shannon information behaves as a classical network flow, identical in nature to the flow of water in pipes. At first glance, it might seem surprising that separation holds in a fairly general network situation like the one we study. A closer look, however, reveals that the reason is that our model allows only independent point-to-point channels between pairs of nodes, and not multiple-access and/or broadcast channels, for which separation is well known not to hold. This "information as flow" view provides an algorithmic interpretation of our results, among which perhaps the most important is the optimality of implementing codes using a layered protocol stack. (Comment: final version, to appear in the IEEE Transactions on Information Theory; contains very minor changes based on the last round of review.)
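    The if-and-only-if condition above is a cut-set criterion that can be checked by enumerating every cut separating sources from the sink. A minimal sketch with hypothetical capacities, specialized for simplicity to independent sources (so that H(U_S|U_{S^c}) reduces to the sum of individual source entropies in S; the general correlated case would need the true conditional entropy):

```python
# Hedged sketch of the cut-set criterion: feasible iff, for every cut (S, S^c)
# with the sink in S^c, H(U_S | U_{S^c}) < sum of capacities C_ij crossing the cut.
from itertools import combinations

def cut_condition_holds(nodes, sink, H, C):
    """H[i] = entropy of the source at node i (absent for sink/relays);
    C[(i, j)] = capacity of the i->j channel (absent key = no link)."""
    others = [v for v in nodes if v != sink]
    for r in range(1, len(others) + 1):
        for S in combinations(others, r):
            Sc = [v for v in nodes if v not in S]
            h_cut = sum(H.get(i, 0.0) for i in S)        # H(U_S|U_{S^c}), indep. case
            cap_cut = sum(C.get((i, j), 0.0) for i in S for j in Sc)
            if h_cut >= cap_cut:
                return False
    return True

# Two unit-entropy sources (nodes 1, 2) reaching sink 0 via a relay 3 (toy values)
nodes = [0, 1, 2, 3]
H = {1: 1.0, 2: 1.0}
C = {(1, 3): 1.5, (2, 3): 1.5, (3, 0): 2.5, (1, 0): 0.2, (2, 0): 0.2}
feasible = cut_condition_holds(nodes, sink=0, H=H, C=C)
```

The exponential number of cuts is no obstacle at this toy scale; the paper's point is that once every cut has enough capacity, a layered (separation-based) protocol stack achieves it.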

    Bandwidth efficient multi-station wireless streaming based on complete complementary sequences

    Data streaming from multiple base stations to a client is recognized as a robust technique for multimedia streaming. However, the resulting parallel transmission over wireless channels poses serious challenges, notably multiple-access interference, multipath fading, noise, and synchronization. Spread spectrum techniques are the obvious choice to mitigate these effects, but at the cost of increased bandwidth requirements. This paper proposes a solution that jointly exploits complete complementary spectrum spreading and data compression to resolve the communication challenges while ensuring efficient use of spectrum and an acceptable bit error rate. Our proposed spreading scheme reduces the required transmission bandwidth by exploiting correlation among the information present at multiple base stations. Results show a 1.75 Mchip/s (25%) reduction in transmission rate, with at most 6 dB loss in a frequency-selective channel compared to a straightforward solution based solely on complete complementary spectrum spreading.
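    The property complete complementary spreading builds on can be seen already with a single Golay complementary pair: the aperiodic autocorrelations of the pair sum to an impulse, which is what suppresses multipath and multiple-access interference after despreading. A small self-contained check (the length-4 pair is a standard textbook example, not taken from this paper):

```python
# Hedged illustration: for a Golay complementary pair (a, b) of length N,
# the summed aperiodic autocorrelations equal 2N at shift 0 and 0 elsewhere.

def acorr(seq, shift):
    """Aperiodic autocorrelation of seq at a nonnegative shift."""
    return sum(seq[i] * seq[i + shift] for i in range(len(seq) - shift))

def summed_acorr(pair):
    """Shift-by-shift sum of the pair's aperiodic autocorrelations."""
    n = len(pair[0])
    return [sum(acorr(s, k) for s in pair) for k in range(n)]

# A standard length-4 Golay complementary pair
a = [1, 1, 1, -1]
b = [1, 1, -1, 1]
sidelobes = summed_acorr((a, b))   # ideally [2N, 0, ..., 0]
```

Complete complementary sets extend this pairwise property to cross-correlations across a whole family of sequence sets, which is what permits interference-free parallel streams from multiple base stations.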

    Communicating Correlated Sources Over an Interference Channel

    A new coding technique, based on \textit{fixed block-length} codes, is proposed for the problem of communicating a pair of correlated sources over a two-user interference channel. Its performance is analyzed to derive a new set of sufficient conditions, which is proven to be strictly less binding than the best currently known, due to Liu and Chen [Dec. 2011]. Our findings are inspired by Dueck's example [March 1981].

    The Finite Field Multi-Way Relay Channel with Correlated Sources: The Three-User Case

    The three-user finite field multi-way relay channel with correlated sources is considered. The three users generate possibly correlated messages, and each user is to transmit its message to the two other users reliably in the Shannon sense. As there is no direct link among the users, communication is carried out via a relay, and the links from the users to the relay and from the relay to the users are finite field adder channels with additive noise of arbitrary distribution. The problem is to determine the set of all achievable rates, defined as channel uses per source symbol for reliable communication. For two classes of source/channel combinations, the solution is obtained using Slepian-Wolf source coding combined with functional-decode-forward channel coding. (Comment: to be presented at ISIT 201)
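    The functional-decode-forward idea above can be sketched in a few lines: the relay decodes linear combinations of the users' symbols over the finite field and broadcasts them, and each user cancels its own symbol to recover the other two. The field size, message values, and function names below are hypothetical toy choices, and a real scheme would combine this with Slepian-Wolf coding of the correlated sources:

```python
# Hedged sketch of functional-decode-forward for the three-user finite field MWRC.
q = 5                       # finite field GF(q), q prime (toy choice)
w = {1: 3, 2: 1, 3: 4}      # the users' message symbols (toy values)

# Relay decodes two sums from the uplink adder channel and broadcasts them
f12 = (w[1] + w[2]) % q
f23 = (w[2] + w[3]) % q

def recover(own_id, own_msg, f12, f23, q):
    """Each user peels off its own symbol to recover the other two messages."""
    if own_id == 1:
        w2 = (f12 - own_msg) % q
        return {2: w2, 3: (f23 - w2) % q}
    if own_id == 2:
        return {1: (f12 - own_msg) % q, 3: (f23 - own_msg) % q}
    w2 = (f23 - own_msg) % q
    return {1: (f12 - w2) % q, 2: w2}

decoded_at_user1 = recover(1, w[1], f12, f23, q)
```

Broadcasting two function values instead of three raw messages is where the rate saving over plain decode-forward comes from.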