8 research outputs found

    Entropy and Graphs

    The entropy of a graph is a functional depending both on the graph itself and on a probability distribution on its vertex set. This graph functional originated from the problem of source coding in information theory and was introduced by J. Körner in 1973. Although the notion of graph entropy has its roots in information theory, it was proved to be closely related to some classical and frequently studied graph-theoretic concepts. For example, it provides an equivalent definition for a graph to be perfect, and it can also be applied to obtain lower bounds in graph covering problems. In this thesis, we review and investigate three equivalent definitions of graph entropy and its basic properties. Minimum entropy colouring of a graph was proposed by N. Alon in 1996. We study minimum entropy colouring and its relation to graph entropy. We also discuss the relationship between the entropy and the fractional chromatic number of a graph, which was already established in the literature. A graph $G$ is called \emph{symmetric with respect to a functional $F_G(P)$} defined on the set of all probability distributions on its vertex set if the distribution $P^*$ maximizing $F_G(P)$ is uniform on $V(G)$. Using the combinatorial definition of the entropy of a graph in terms of its vertex packing polytope and the relationship between graph entropy and the fractional chromatic number, we prove that vertex-transitive graphs are symmetric with respect to graph entropy. Furthermore, we show that a bipartite graph is symmetric with respect to graph entropy if and only if it has a perfect matching. As a generalization of this result, we characterize some classes of perfect graphs that are symmetric with respect to graph entropy. Finally, we prove that the line graph of every bridgeless cubic graph is symmetric with respect to graph entropy.

    Comment: 89 pages, 4 figures, MMath Thesis
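    For reference, the combinatorial definition via the vertex packing polytope mentioned in the abstract can be written in its standard form, with $\mathrm{VP}(G)$ the convex hull of the characteristic vectors of the independent sets of $G$:

```latex
% Graph entropy of a probabilistic graph (G, P), defined over the
% vertex packing polytope VP(G) (convex hull of characteristic
% vectors of independent sets of G):
H(G, P) \;=\; \min_{a \in \mathrm{VP}(G)} \; \sum_{v \in V(G)} P(v) \, \log \frac{1}{a_v}
```

    The minimizing point $a$ weights each vertex by how heavily it can be covered by independent sets, which is what links this definition to the fractional chromatic number discussed in the thesis.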

    A Distributed Computationally Aware Quantizer Design via Hyper Binning

    We design a distributed function-aware quantization scheme for distributed functional compression. We consider two correlated sources $X_1$ and $X_2$ and a destination that seeks the outcome of a continuous function $f(X_1, X_2)$. We develop a compression scheme called hyper binning in order to quantize $f$ by minimizing the entropy of the joint source partitioning. Hyper binning is a natural generalization of Cover's random code construction for the asymptotically optimal Slepian-Wolf encoding scheme, which makes use of orthogonal binning. The key idea behind this approach is to use linear discriminant analysis in order to characterize different source feature combinations. This scheme captures the correlation between the sources and the function's structure as a means of dimensionality reduction. We investigate the performance of hyper binning for different source distributions, and identify which classes of sources entail more partitioning to achieve better function approximation. Our approach brings an information theory perspective to the traditional vector quantization technique from signal processing.
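    The core idea — partition the source space according to the function's outcome rather than the sources themselves, and measure the entropy of the induced partition — can be illustrated with a small sketch. Everything below (the Gaussian sources, the choice f = x1 + x2, and the bin count) is an illustrative assumption, not the paper's actual construction:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two correlated sources (illustrative; the paper's model is more general).
x1 = rng.normal(size=10_000)
x2 = 0.8 * x1 + 0.6 * rng.normal(size=10_000)

# The continuous function the destination wants (assumed here for the demo).
f = x1 + x2

# Function-aware partition: bin the *function values*, not the sources,
# so that cells group source pairs mapping to similar outcomes of f.
n_bins = 8
edges = np.quantile(f, np.linspace(0, 1, n_bins + 1))
labels = np.clip(np.searchsorted(edges, f, side="right") - 1, 0, n_bins - 1)

# Entropy of the induced partition -- the quantity a function-aware
# quantizer trades off against approximation error.
p = np.bincount(labels, minlength=n_bins) / labels.size
entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
print(round(float(entropy), 3))
```

    With quantile edges the cells are near-equiprobable, so the partition entropy sits close to log2(8) = 3 bits; a coarser or finer partition trades rate against how well f can be reconstructed.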

    Functional compression : theory and application

    Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science; and (S.M. in Technology and Policy)--Massachusetts Institute of Technology, Engineering Systems Division, Technology and Policy Program, 2008. Includes bibliographical references (p. 75-77).

    We consider the problem of functional compression. The objective is to separately compress possibly correlated discrete sources such that an arbitrary deterministic function of those sources can be computed given the compressed data from each source. This is motivated by problems in sensor networks and database privacy. Our architecture gives a quantitative definition of privacy for database statistics. Further, we show that it can provide significant coding gains in sensor networks. We consider both the lossless and lossy computation of a function. Specifically, we present results of the rate regions for three instances of the problem where there are two sources: 1) lossless computation where one source is available at the decoder, 2) under a special condition, lossless computation where both sources are separately encoded, and 3) lossy computation where one source is available at the decoder. Wyner and Ziv (1976) considered the third problem for the special case f(X, Y) = X and derived a rate distortion function. Yamamoto (1982) extended this result to a general function. Both of these results are in terms of an auxiliary random variable. Orlitsky and Roche (2001), for the zero distortion case, gave this variable a precise interpretation in terms of the properties of the characteristic graph; this led to a particular coding scheme. We extend that result by providing an achievability scheme that is based on the coloring of the characteristic graph. This suggests a layered architecture where the functional layer controls the coloring scheme, and the data layer uses existing distributed source coding schemes. We extend this graph coloring method to provide algorithms and rates for all three problems.

    by Vishal D. Doshi. S.M.
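    The characteristic-graph construction behind this coloring scheme can be sketched concretely. The toy alphabets and function below are hypothetical; only the construction rule follows the standard definition: connect two source symbols whenever some side-information value makes the function disagree on them.

```python
from itertools import combinations

# Toy setup (hypothetical, not from the thesis): X, Y in {0, 1, 2},
# all pairs possible, and the decoder wants f(x, y) = (x + y) % 2.
X_vals = Y_vals = [0, 1, 2]
f = lambda x, y: (x + y) % 2
possible = lambda x, y: True  # full support, for simplicity

# Characteristic graph on X: x and x' are adjacent when some y occurs
# with both and f distinguishes them -- then their codes must differ.
edges = set()
for x, xp in combinations(X_vals, 2):
    if any(possible(x, y) and possible(xp, y) and f(x, y) != f(xp, y)
           for y in Y_vals):
        edges.add((x, xp))

# Greedy proper coloring: any valid coloring is a code, since the
# decoder can recover f from the color of x together with y.
color = {}
for x in X_vals:
    used = {color[u] for (u, v) in edges if v == x and u in color} | \
           {color[v] for (u, v) in edges if u == x and v in color}
    color[x] = min(c for c in range(len(X_vals)) if c not in used)

print(edges, color)
```

    Here 0 and 2 always yield the same parity, so they share a color: two colors (one bit) suffice instead of describing the three-symbol source exactly, which is the coding gain the coloring-based achievability scheme exploits.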

    On Minimum Entropy Graph Colorings

    We study properties of graph colorings that minimize the quantity of color information with respect to a given probability distribution on the vertices. The minimum entropy of any coloring is the chromatic entropy. Applications of the chromatic entropy are found in coding with side information and digital image partition coding. We show that minimum entropy colorings are hard to compute even if a minimum cardinality coloring is given, the distribution is uniform, and the graph is planar. We also consider the minimum number of colors in a minimum entropy coloring, and show that this number can be arbitrarily larger than the chromatic number, even for restricted families of uniformly weighted graphs.

    Proc. IEEE International Symposium on Information Theory 2004 (ISIT 2004), Chicago (IL), United States, 27 June–2 July 2004.

    On Minimum Entropy Graph Colorings

    Abstract — We study properties of graph colorings that minimize the quantity of color information with respect to a given probability distribution on the vertices. The minimum entropy of any coloring is the chromatic entropy. Applications of the chromatic entropy are found in coding with side information and digital image partition coding. We show that minimum entropy colorings are hard to compute even if a minimum cardinality coloring is given, the distribution is uniform, and the graph is planar. We also consider the minimum number of colors in a minimum entropy coloring, and show that this number can be arbitrarily larger than the chromatic number, even for restricted families of uniformly weighted graphs.

    I. Definitions

    A coloring of a graph G is an assignment of colors to the vertices of G such that any two adjacent vertices have different colors. The well-known graph coloring problem is to find a coloring with as few colors as possible. The minimum number of colors in a coloring of G is the chromatic number of G, denoted by χ(G). A probabilistic graph is a graph equipped with a probability distribution on its vertices. Let (G, P) be a probabilistic graph, and let X be any random variable over the vertex set V(G) of G with distribution P. We define the entropy of a coloring φ as the entropy of the random variable φ(X). In other words, the entropy of φ is the sum over all colors i of −p_i log p_i, where p_i = Σ_{x: φ(x)=i} P(x) is the probability that X has color i. The chromatic entropy H_χ(G, P) of the probabilistic graph (G, P) is the minimum entropy of any of its colorings. It was first defined by Alon and Orlitsky [3] and gives the minimum quantity of color information contained in any coloring of a probabilistic graph. Applications of this definition can be found in zero-error source coding with side information at the receiver [3, 5, 6], and compression of digital image partitions created by segmentation algorithms [1, 2]. In fact, minimum entropy colorings of a characteristic graph directly induce good codes for the two problems.
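    The definitions above translate directly into a brute-force computation of the chromatic entropy on a tiny example. The graph (a path on three vertices) and its vertex distribution below are illustrative assumptions, chosen so the minimum-entropy coloring merges the two non-adjacent vertices:

```python
from itertools import product
from math import log2

# Tiny probabilistic graph (hypothetical example): the path a - b - c.
vertices = ["a", "b", "c"]
edges = [("a", "b"), ("b", "c")]
P = {"a": 0.7, "b": 0.2, "c": 0.1}

def coloring_entropy(phi):
    # Entropy of phi(X): sum over colors i of -p_i log2 p_i,
    # where p_i is the total probability mass assigned color i.
    mass = {}
    for v, c in phi.items():
        mass[c] = mass.get(c, 0.0) + P[v]
    return -sum(p * log2(p) for p in mass.values() if p > 0)

# Chromatic entropy: minimum entropy over all proper colorings.
best = None
for assignment in product(range(len(vertices)), repeat=len(vertices)):
    phi = dict(zip(vertices, assignment))
    if all(phi[u] != phi[v] for u, v in edges):
        h = coloring_entropy(phi)
        best = h if best is None or h < best else best

print(round(best, 4))
```

    The optimum merges a and c into one color class of mass 0.8, giving H(0.8, 0.2) ≈ 0.7219 bits, less than the ≈ 1.157 bits of any coloring that keeps all three vertices apart. On large graphs this enumeration is infeasible, consistent with the paper's hardness result.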