
    Computation Over Gaussian Networks With Orthogonal Components

    Function computation of arbitrarily correlated discrete sources over Gaussian networks with orthogonal components is studied. Two classes of functions are considered: the arithmetic sum function and the type function. The arithmetic sum function in this paper is defined as a set of multiple weighted arithmetic sums, which includes averaging of the sources and estimating each of the sources as special cases. The type or frequency histogram function counts the number of occurrences of each argument and thereby yields many important statistics, such as the mean, variance, maximum, minimum, and median. The proposed computation coding first abstracts Gaussian networks into corresponding modulo-sum multiple-access channels via nested lattice codes and linear network coding, and then computes the desired function using linear Slepian-Wolf source coding. For orthogonal Gaussian networks (with no broadcast and no multiple-access components), the computation capacity is characterized for a class of networks. For Gaussian networks with multiple-access components (but no broadcast), an approximate computation capacity is characterized for a class of networks. Comment: 30 pages, 12 figures, submitted to IEEE Transactions on Information Theory.
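    As a concrete illustration of the two function classes described above (not the paper's lattice-based coding scheme), the sketch below evaluates a weighted arithmetic sum and the type (frequency histogram) of a few example source symbols, and shows how the mean, maximum, minimum, and median follow from the type alone; the symbol values and weights are assumptions made only for the example.

```python
# Minimal sketch of the two function classes, evaluated directly on example symbols.
import numpy as np

def arithmetic_sum(sources, weights):
    """Weighted arithmetic sums of the sources; identity rows recover each source,
    uniform weights recover the average."""
    return weights @ sources

def type_function(sources, alphabet_size):
    """Type (frequency histogram): counts occurrences of each symbol value."""
    return np.bincount(sources, minlength=alphabet_size)

# Example: three sources over the alphabet {0, 1, 2, 3} (assumed values)
sources = np.array([2, 0, 2])
print(arithmetic_sum(sources, np.ones((1, 3)) / 3))   # averaging -> [1.333...]
hist = type_function(sources, alphabet_size=4)        # [1, 0, 2, 0]

# Statistics recoverable from the type alone, without the raw source symbols
symbols = np.arange(4)
n = hist.sum()
mean = (symbols * hist).sum() / n
maximum = symbols[hist.nonzero()[0].max()]
minimum = symbols[hist.nonzero()[0].min()]
median = np.repeat(symbols, hist)[n // 2]
print(mean, maximum, minimum, median)
```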

    Cores of Cooperative Games in Information Theory

    Cores of cooperative games are ubiquitous in information theory, and arise most frequently in the characterization of fundamental limits in various scenarios involving multiple users. Examples include classical settings in network information theory such as Slepian-Wolf source coding and multiple access channels, classical settings in statistics such as robust hypothesis testing, and new settings at the intersection of networking and statistics such as distributed estimation problems for sensor networks. Cooperative game theory allows one to understand aspects of all of these problems from a fresh and unifying perspective that treats users as players in a game, sometimes leading to new insights. At the heart of these analyses are fundamental dualities that have long been studied in the context of cooperative games; for information-theoretic purposes, these are dualities between information inequalities on the one hand and properties of rate, capacity, or other resource allocation regions on the other. Comment: 12 pages, published at http://www.hindawi.com/GetArticle.aspx?doi=10.1155/2008/318704 in EURASIP Journal on Wireless Communications and Networking, Special Issue on "Theory and Applications in Multiuser/Multiterminal Communications", April 2008.
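    To make the duality concrete, here is a minimal sketch (an assumed two-source example, not taken from the paper) that treats the Slepian-Wolf conditional-entropy inequalities as the constraints defining the achievable rate region, mirroring how core constraints define the feasible allocations of a cooperative game; the joint distribution and the rate pair are illustrative choices.

```python
# Two-source Slepian-Wolf region: a rate pair is achievable iff it satisfies the
# information inequalities R1 >= H(X1|X2), R2 >= H(X2|X1), R1+R2 >= H(X1,X2).
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def slepian_wolf_constraints(joint):
    """Return (H(X1|X2), H(X2|X1), H(X1,X2)) from a joint pmf matrix."""
    h_joint = entropy(joint.ravel())
    h1 = entropy(joint.sum(axis=1))   # H(X1), marginal over rows
    h2 = entropy(joint.sum(axis=0))   # H(X2), marginal over columns
    return h_joint - h2, h_joint - h1, h_joint

def in_rate_region(r1, r2, joint):
    c1, c2, c12 = slepian_wolf_constraints(joint)
    return r1 >= c1 and r2 >= c2 and r1 + r2 >= c12

# Assumed example: doubly symmetric binary sources with crossover probability 0.1
joint = np.array([[0.45, 0.05],
                  [0.05, 0.45]])
print(slepian_wolf_constraints(joint))   # ~ (0.469, 0.469, 1.469) bits
print(in_rate_region(0.5, 1.0, joint))   # True: the pair lies in the region
```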

    Network Information Flow with Correlated Sources

    In this paper, we consider a network communications problem in which multiple correlated sources must be delivered to a single data collector node over a network of noisy, independent point-to-point channels. We prove that perfect reconstruction of all the sources at the sink is possible if and only if, for every partition of the network nodes into two subsets S and S^c with the sink in S^c, we have H(U_S|U_{S^c}) < \sum_{i\in S, j\in S^c} C_{ij}. Our main finding is that in this setup a general source/channel separation theorem holds, and that Shannon information behaves as a classical network flow, identical in nature to the flow of water in pipes. At first glance, it might seem surprising that separation holds in a fairly general network situation like the one we study. A closer look, however, reveals that the reason is that our model allows only independent point-to-point channels between pairs of nodes, and not multiple-access and/or broadcast channels, for which separation is well known not to hold. This "information as flow" view provides an algorithmic interpretation of our results, among which perhaps the most important is the optimality of implementing codes using a layered protocol stack. Comment: Final version, to appear in the IEEE Transactions on Information Theory; contains (very) minor changes based on the last round of review.
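    The cut condition above can be checked mechanically on small networks. The following sketch (with an assumed toy topology, link capacities, and source entropies, none of which come from the paper) enumerates every partition with the sink in S^c and tests H(U_S|U_{S^c}) < \sum_{i\in S, j\in S^c} C_{ij} across the cut.

```python
# Enumerate all cuts (S, S^c) with the sink in S^c and check the cut condition.
from itertools import chain, combinations

def feasible(nodes, sink, capacities, cond_entropy):
    """capacities[(i, j)] is C_ij; cond_entropy(S, Sc) returns H(U_S | U_{S^c})."""
    others = [v for v in nodes if v != sink]
    subsets = chain.from_iterable(combinations(others, k) for k in range(1, len(others) + 1))
    for S in map(set, subsets):
        Sc = set(nodes) - S                       # the sink is always in S^c
        cut = sum(c for (i, j), c in capacities.items() if i in S and j in Sc)
        if cond_entropy(S, Sc) >= cut:            # H(U_S|U_{S^c}) < cut is violated
            return False
    return True

# Assumed toy example: source nodes {1, 2}, sink 0, and illustrative capacities/entropies
nodes, sink = [0, 1, 2], 0
capacities = {(1, 0): 1.0, (2, 0): 1.0, (1, 2): 0.5, (2, 1): 0.5}
H = {frozenset({1}): 0.7, frozenset({2}): 0.7, frozenset({1, 2}): 1.2}  # assumed entropies
cond = lambda S, Sc: H[frozenset({1, 2})] - (H[frozenset(Sc & {1, 2})] if Sc & {1, 2} else 0.0)
print(feasible(nodes, sink, capacities, cond))    # True for these assumed numbers
```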