680 research outputs found

    "Graph Entropy, Network Coding and Guessing games"

    We introduce the (private) entropy of a directed graph (in a new network coding sense) as well as a number of related concepts. We show that the entropy of a directed graph is identical to its guessing number and can be bounded from below by the number of vertices minus the size of the graph’s shortest index code. We show that the network coding solvability of each specific multiple unicast network is completely determined by the entropy (as well as by the shortest index code) of the directed graph that arises by identifying each source node with its corresponding target node. Shannon’s information inequalities can be used to calculate upper bounds on a graph’s entropy as well as the size of the minimal index code. Recently, a number of new families of so-called non-Shannon-type information inequalities have been discovered. It has been shown that there exist communication networks with a capacity strictly less than required for solvability, but where this fact cannot be derived using Shannon’s classical information inequalities. Based on this result we show that there exist graphs whose entropy cannot be calculated using only Shannon’s classical information inequalities, and that better estimates can be obtained by using certain non-Shannon-type information inequalities.
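
    Stated compactly, in notation chosen here for illustration (writing $\mathrm{gn}(G)$ for the guessing number of a directed graph $G$ on $n$ vertices and $\beta(G)$ for the length of its shortest index code), the relations above read

    $H(G) = \mathrm{gn}(G) \ge n - \beta(G).$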

    Conditional graph entropy as an alternating minimization problem

    Conditional graph entropy is known to be the minimal rate for a natural functional compression problem with side information at the receiver. In this paper we show that it can be formulated as an alternating minimization problem, which gives rise to a simple iterative algorithm for numerically computing (conditional) graph entropy. This also leads to a new formula which shows that conditional graph entropy is part of a more general framework: the solution of an optimization problem over a convex corner. In the special case of graph entropy (i.e., the unconditioned version) this was known due to Csiszár, Körner, Lovász, Marton, and Simonyi. In that case the role of the convex corner was played by the so-called vertex packing polytope. In the conditional version it is a more intricate convex body, but the function to minimize is the same. Furthermore, we describe a dual problem that leads to an optimality check and an error bound for the iterative algorithm.
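
    For the unconditioned special case mentioned above, the convex-corner formula is commonly written as

    $H(G,P) = \min_{a \in \mathrm{VP}(G)} \sum_{v \in V(G)} P(v) \log \frac{1}{a_v},$

    where $\mathrm{VP}(G)$ is the vertex packing polytope of $G$ (the convex hull of the indicator vectors of its independent sets); per the abstract, the conditional version replaces this convex corner with a more intricate convex body while the objective stays the same.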

    Entropy and Graphs

    The entropy of a graph is a functional depending both on the graph itself and on a probability distribution on its vertex set. This graph functional originated from the problem of source coding in information theory and was introduced by J. Körner in 1973. Although the notion of graph entropy has its roots in information theory, it was proved to be closely related to some classical and frequently studied graph-theoretic concepts. For example, it provides an equivalent definition for a graph to be perfect and it can also be applied to obtain lower bounds in graph covering problems. In this thesis, we review and investigate three equivalent definitions of graph entropy and its basic properties. Minimum entropy colouring of a graph was proposed by N. Alon in 1996. We study minimum entropy colouring and its relation to graph entropy. We also discuss the relationship between the entropy and the fractional chromatic number of a graph, which was already established in the literature. A graph $G$ is called symmetric with respect to a functional $F_G(P)$ defined on the set of all probability distributions on its vertex set if the distribution $P^*$ maximizing $F_G(P)$ is uniform on $V(G)$. Using the combinatorial definition of the entropy of a graph in terms of its vertex packing polytope and the relationship between graph entropy and the fractional chromatic number, we prove that vertex-transitive graphs are symmetric with respect to graph entropy. Furthermore, we show that a bipartite graph is symmetric with respect to graph entropy if and only if it has a perfect matching. As a generalization of this result, we characterize some classes of symmetric perfect graphs with respect to graph entropy. Finally, we prove that the line graph of every bridgeless cubic graph is symmetric with respect to graph entropy. Comment: 89 pages, 4 figures, MMath Thesis
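
    For orientation, one standard way of writing Körner's definition (whether it appears in exactly this form among the three equivalent definitions treated in the thesis is not specified in the abstract) is

    $H(G,P) = \min_{X \in Y,\ Y \in S(G)} I(X;Y),$

    where $S(G)$ is the family of independent sets of $G$ and the minimum ranges over joint distributions of a vertex $X \sim P$ and an independent set $Y$ of $G$ containing $X$.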

    Towards an approximate graph entropy measure for identifying incidents in network event data

    A key objective of monitoring networks is to identify potential service-threatening outages from events within the network before service is interrupted. Identifying causal events, Root Cause Analysis (RCA), is an active area of research, but current approaches are vulnerable to scaling issues with high event rates. Elimination of noisy events that are not causal is key to ensuring the scalability of RCA. In this paper, we introduce vertex-level measures inspired by Graph Entropy and propose their suitability as a categorization metric to identify nodes that are a priori of more interest as a source of events. We consider a class of measures based on Structural, Chromatic and Von Neumann Entropy. These measures require NP-Hard calculations over the whole graph, an approach which does not scale for the large dynamic graphs that characterise modern networks. In this work we identify and justify a local measure of vertex graph entropy, which behaves in a similar fashion to global measures of entropy when summed across the whole graph. We show, using a real data set, that such measures are correlated with the nodes that generate incidents across a network.
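
    As a point of reference for the global measures named above, the sketch below computes the von Neumann entropy of an undirected graph from the spectrum of its trace-normalised Laplacian; the function name and the choice of the combinatorial Laplacian are assumptions made here for illustration, and the paper's local, per-vertex approximation (not given in the abstract) is not reproduced.

    import numpy as np

    def von_neumann_entropy(adj: np.ndarray) -> float:
        """Von Neumann entropy of an undirected graph, given its 0/1 adjacency matrix."""
        degrees = adj.sum(axis=1)
        laplacian = np.diag(degrees) - adj
        rho = laplacian / np.trace(laplacian)   # density-matrix analogue: PSD with trace 1
        eigvals = np.linalg.eigvalsh(rho)
        eigvals = eigvals[eigvals > 1e-12]      # drop zeros, using the convention 0*log(0) = 0
        return float(-(eigvals * np.log2(eigvals)).sum())

    # Example: path graph on four vertices
    A = np.array([[0, 1, 0, 0],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
    print(von_neumann_entropy(A))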