Recent Developments in Quantitative Graph Theory: Information Inequalities for Networks
In this article, we tackle a challenging problem in quantitative graph theory: establishing relations between graph entropy measures that represent the structural information content of networks. In particular, we prove formal relations between quantitative network measures based on Shannon's entropy in order to study how these measures are related. To establish such information inequalities for graphs, we focus on graph entropy measures based on information functionals. To prove the relations, we use known graph classes whose instances have proven useful in various scientific areas. Our results extend foregoing work on information inequalities for graphs.
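As a hedged illustration of the measures this abstract discusses: the sketch below computes a Shannon-type graph entropy induced by a vertex information functional. The degree functional and all function names are illustrative choices of mine, not the paper's.

```python
import math

def graph_entropy(adjacency, f=None):
    """Shannon-type graph entropy from a vertex information functional.

    Each vertex v gets a weight f(v); the induced probabilities
    p_v = f(v) / sum_u f(u) give H(G) = -sum_v p_v * log2(p_v).
    Here f defaults to the vertex degree, one common illustrative choice.
    """
    if f is None:
        f = lambda v: len(adjacency[v])  # degree-based functional
    weights = {v: f(v) for v in adjacency}
    total = sum(weights.values())
    return -sum((w / total) * math.log2(w / total)
                for w in weights.values() if w > 0)

# Path graph P3: degrees (1, 2, 1) -> p = (1/4, 1/2, 1/4) -> H = 1.5 bits
p3 = {0: [1], 1: [0, 2], 2: [1]}
print(graph_entropy(p3))  # 1.5
```

Different functionals (e.g., based on eccentricity or distance counts) induce different entropies on the same graph, which is what makes relations between them nontrivial.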
Information inequalities and Generalized Graph Entropies
In this article, we discuss the problem of establishing relations between
information measures assessed for network structures. Two types of entropy-based
measures, namely the Shannon entropy and its generalization, the Rényi entropy,
are considered in this study. Our main results establish formal relationships,
in the form of implicit inequalities, between these two kinds of measures when
defined for graphs. Further, we also state and
prove inequalities connecting the classical partition-based graph entropies and
the functional-based entropy measures. In addition, several explicit
inequalities are derived for special classes of graphs.
Comment: A preliminary version. To be submitted to a journal.
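A minimal sketch of the two entropy families compared in this abstract, computed on an ordinary probability vector (for a graph, this vector would come from a partition or an information functional). The function names are mine; the Rényi entropy recovers the Shannon entropy in the limit as its order approaches 1 and is nonincreasing in the order.

```python
import math

def shannon(p):
    """Shannon entropy in bits of a probability vector p."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def renyi(p, alpha):
    """Rényi entropy of order alpha (alpha > 0, alpha != 1).

    As alpha -> 1 this converges to the Shannon entropy; for alpha > 1
    it lower-bounds the Shannon entropy.
    """
    return math.log2(sum(x ** alpha for x in p)) / (1 - alpha)

p = [0.5, 0.25, 0.25]
print(shannon(p))         # 1.5
print(renyi(p, 2))        # collision entropy, <= Shannon entropy
print(renyi(p, 1.001))    # close to the Shannon limit
```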
Minimum cost mirror sites using network coding: Replication vs. coding at the source nodes
Content distribution over networks is often achieved by using mirror sites
that hold copies of files or portions thereof to avoid congestion and delay
issues arising from excessive demands to a single location. Accordingly, there
are distributed storage solutions that divide the file into pieces and place
copies of the pieces (replication) or coded versions of the pieces (coding) at
multiple source nodes. We consider a network which uses network coding for
multicasting the file. There is a set of source nodes that contains either
subsets or coded versions of the pieces of the file. The cost of a given
storage solution is defined as the sum of the storage cost and the cost of the
flows required to support the multicast. Our interest is in finding the storage
capacities and flows at minimum combined cost. We formulate the corresponding
optimization problems by using the theory of information measures. In
particular, we show that when there are two source nodes, there is no loss in
considering subset sources. For three source nodes, we derive a tight upper
bound on the cost gap between the coded and uncoded cases. We also present
algorithms for determining the content of the source nodes.
Comment: IEEE Trans. on Information Theory (to appear), 201
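The storage-plus-flow cost objective described above can be illustrated with a toy brute-force sketch. Everything here is hypothetical (a two-piece file, two sources, two terminals, made-up unit costs); the paper's actual formulation optimizes network-coded multicast flows via information measures, which this simple unicast-style model does not implement.

```python
from itertools import product

# Toy instance: a file split into two pieces must be recoverable by every
# terminal. Each source stores a subset of the pieces; total cost is
# storage cost plus the cost of pulling each needed piece from the
# cheapest source that holds it.
PIECES = ("p1", "p2")
STORAGE_COST = {"s1": 1.0, "s2": 2.0}  # per stored piece (hypothetical)
FLOW_COST = {("s1", "t1"): 1.0, ("s2", "t1"): 0.5,
             ("s1", "t2"): 0.5, ("s2", "t2"): 1.0}
TERMINALS = ("t1", "t2")

def total_cost(placement):
    """placement maps each source to the tuple of pieces it stores."""
    storage = sum(STORAGE_COST[s] * len(ps) for s, ps in placement.items())
    flow = 0.0
    for t in TERMINALS:
        for piece in PIECES:
            holders = [s for s, ps in placement.items() if piece in ps]
            if not holders:
                return float("inf")  # infeasible: piece unrecoverable
            flow += min(FLOW_COST[(s, t)] for s in holders)
    return storage + flow

subsets = [(), ("p1",), ("p2",), ("p1", "p2")]
best = min((dict(zip(("s1", "s2"), c)) for c in product(subsets, repeat=2)),
           key=total_cost)
print(best, total_cost(best))
```

In this toy instance, replicating the whole file at the cheap source beats splitting the pieces across sources; with network coding and more sources, the trade-off is what the paper's bounds quantify.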
Cores of Cooperative Games in Information Theory
Cores of cooperative games are ubiquitous in information theory, and arise
most frequently in the characterization of fundamental limits in various
scenarios involving multiple users. Examples include classical settings in
network information theory such as Slepian-Wolf source coding and multiple
access channels, classical settings in statistics such as robust hypothesis
testing, and new settings at the intersection of networking and statistics such
as distributed estimation problems for sensor networks. Cooperative game theory
allows one to understand aspects of all of these problems from a fresh and
unifying perspective that treats users as players in a game, sometimes leading
to new insights. At the heart of these analyses are fundamental dualities that
have been long studied in the context of cooperative games; for information
theoretic purposes, these are dualities between information inequalities on the
one hand and properties of rate, capacity or other resource allocation regions
on the other.
Comment: 12 pages, published at
http://www.hindawi.com/GetArticle.aspx?doi=10.1155/2008/318704 in EURASIP
Journal on Wireless Communications and Networking, Special Issue on "Theory
and Applications in Multiuser/Multiterminal Communications", April 200
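The core mentioned above can be checked mechanically: an allocation is in the core of a profit game if the grand coalition's value is exactly shared and no coalition could do better on its own. A minimal sketch with a made-up 3-player characteristic function (in a Slepian-Wolf setting, the coalition values would instead come from conditional entropies):

```python
from itertools import combinations

def in_core(players, v, x, tol=1e-9):
    """Check whether allocation x (a dict) lies in the core of the
    profit game with characteristic function v: the grand coalition's
    value is shared exactly, and every coalition S satisfies
    x(S) >= v(S) so that no group prefers to defect."""
    grand = sum(x[p] for p in players)
    if abs(grand - v(frozenset(players))) > tol:
        return False
    for r in range(1, len(players)):
        for S in combinations(players, r):
            if sum(x[p] for p in S) < v(frozenset(S)) - tol:
                return False
    return True

# Hypothetical game: singletons earn 0, pairs earn 1, the grand
# coalition earns 3; the equal split lies in the core.
def v(S):
    return {1: 0.0, 2: 1.0, 3: 3.0}[len(S)]

print(in_core((1, 2, 3), v, {1: 1.0, 2: 1.0, 3: 1.0}))  # True
```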
On the Distribution of Random Geometric Graphs
Random geometric graphs (RGGs) are commonly used to model networked systems
that depend on the underlying spatial embedding. We concern ourselves with the
probability distribution of an RGG, which is crucial for studying its random
topology, properties (e.g., connectedness), or Shannon entropy as a measure of
the graph's topological uncertainty (or information content). Moreover, the
distribution is also relevant for determining average network performance or
designing protocols. However, a major impediment in deducing the graph
distribution is that it requires the joint probability distribution of the
distances between nodes randomly distributed in a bounded
domain. As no such result exists in the literature, we make progress by
obtaining the joint distribution of the distances between three nodes confined
in a disk in $\mathbb{R}^2$. This enables the calculation of the probability
distribution and entropy of a three-node graph. For arbitrary $n$, we derive a
series of upper bounds on the graph entropy; in particular, the bound involving
the entropy of a three-node graph is tighter than the existing bound which
assumes distances are independent. Finally, we provide numerical results on
graph connectedness and the tightness of the derived entropy bounds.
Comment: submitted to the IEEE International Symposium on Information Theory,
201
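A minimal Monte Carlo sketch of one quantity the abstract studies: estimating the connectedness probability of an RGG whose nodes are placed uniformly in a unit disk. The sampling scheme and function names are illustrative, not taken from the paper.

```python
import math
import random

def random_point_in_unit_disk(rng):
    """Uniform point in the unit disk via the sqrt-radius trick."""
    r, theta = math.sqrt(rng.random()), 2 * math.pi * rng.random()
    return r * math.cos(theta), r * math.sin(theta)

def is_connected(points, radius):
    """DFS over the geometric graph: edge iff distance <= radius."""
    n = len(points)
    seen, stack = {0}, [0]
    while stack:
        i = stack.pop()
        for j in range(n):
            if j not in seen and math.dist(points[i], points[j]) <= radius:
                seen.add(j)
                stack.append(j)
    return len(seen) == n

def connectivity_probability(n, radius, trials=2000, seed=0):
    """Monte Carlo estimate of P(the n-node RGG is connected)."""
    rng = random.Random(seed)
    hits = sum(
        is_connected([random_point_in_unit_disk(rng) for _ in range(n)],
                     radius)
        for _ in range(trials))
    return hits / trials

# Connection radius 2 spans the whole unit disk, so the graph is
# always connected.
print(connectivity_probability(5, 2.0, trials=200))  # 1.0
```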
Brascamp-Lieb Inequality and Its Reverse: An Information Theoretic View
We generalize a result by Carlen and Cordero-Erausquin on the equivalence
between the Brascamp-Lieb inequality and the subadditivity of relative entropy
by allowing for random transformations (a broadcast channel). This leads to a
unified perspective on several functional inequalities that have been gaining
popularity in the context of proving impossibility results. We demonstrate that
the information theoretic dual of the Brascamp-Lieb inequality is a convenient
setting for proving properties such as data processing, tensorization,
convexity and Gaussian optimality. Consequences of the latter include an
extension of the Brascamp-Lieb inequality allowing for Gaussian random
transformations, the determination of the multivariate Wyner common information
for Gaussian sources, and a multivariate version of Nelson's hypercontractivity
theorem. Finally we present an information theoretic characterization of a
reverse Brascamp-Lieb inequality involving a random transformation (a multiple
access channel).
Comment: 5 pages; to be presented at ISIT 201
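For reference, the classical (deterministic-map) Brascamp-Lieb inequality that the abstract generalizes can be stated as follows, with linear surjections $B_j:\mathbb{R}^n \to \mathbb{R}^{n_j}$, exponents $c_j \ge 0$, nonnegative integrable $f_j$, and $C$ the best constant:

```latex
\[
  \int_{\mathbb{R}^n} \prod_{j=1}^{m} f_j(B_j x)^{c_j} \, dx
  \;\le\;
  C \prod_{j=1}^{m} \left( \int_{\mathbb{R}^{n_j}} f_j(y)\, dy \right)^{c_j}.
\]
```

The information theoretic dual referred to in the abstract relates the best constant $C$ to an entropy inequality; the paper's contribution is to allow random transformations in place of the $B_j$.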