
    Hypergraph Markov Operators, Eigenvalues and Approximation Algorithms

    The celebrated Cheeger's Inequality \cite{am85,a86} establishes a bound on the expansion of a graph via its spectrum. This inequality is central to a rich spectral theory of graphs, based on studying the eigenvalues and eigenvectors of the adjacency matrix (and other related matrices) of graphs. It has remained open to define a suitable spectral model for hypergraphs whose spectra can be used to estimate various combinatorial properties of the hypergraph. In this paper we introduce a new hypergraph Laplacian operator (generalizing the Laplacian matrix of graphs) and study its spectrum. We prove a Cheeger-type inequality for hypergraphs, relating the second smallest eigenvalue of this operator to the expansion of the hypergraph. We bound other hypergraph expansion parameters via higher eigenvalues of this operator. We give bounds on the diameter of the hypergraph as a function of the second smallest eigenvalue of the Laplacian operator. The Markov process underlying the Laplacian operator can be viewed as a dispersion process on the vertices of the hypergraph that might be of independent interest. We bound the {\em mixing time} of this process as a function of the second smallest eigenvalue of the Laplacian operator. All these results are generalizations of the corresponding results for graphs. We show that there can be no linear operator for hypergraphs whose spectrum captures hypergraph expansion in a Cheeger-like manner. For any $k$, we give a polynomial-time algorithm to compute an approximation to the $k^{\mathrm{th}}$ smallest eigenvalue of the operator. We show that this approximation factor is optimal under the SSE hypothesis (introduced by \cite{rs10}) for constant values of $k$. Finally, using a factor-preserving reduction from vertex expansion in graphs to hypergraph expansion, we show that all our results for hypergraphs extend to vertex expansion in graphs.
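    For context, the graph-level statement this abstract generalizes relates the conductance $\phi(G)$ of a graph $G$ to the second smallest eigenvalue $\lambda_2$ of its normalized Laplacian:

    \[
      \frac{\lambda_2}{2} \;\le\; \phi(G) \;\le\; \sqrt{2\,\lambda_2}.
    \]

    The hypergraph result replaces the matrix eigenvalue $\lambda_2$ with the second smallest eigenvalue of the (non-linear) hypergraph Laplacian operator introduced in the paper.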

    Random Walks on Hypergraphs with Edge-Dependent Vertex Weights

    Hypergraphs are used in machine learning to model higher-order relationships in data. While spectral methods for graphs are well-established, spectral theory for hypergraphs remains an active area of research. In this paper, we use random walks to develop a spectral theory for hypergraphs with edge-dependent vertex weights: hypergraphs where every vertex $v$ has a weight $\gamma_e(v)$ for each incident hyperedge $e$ that describes the contribution of $v$ to the hyperedge $e$. We derive a random walk-based hypergraph Laplacian, and bound the mixing time of random walks on such hypergraphs. Moreover, we give conditions under which random walks on such hypergraphs are equivalent to random walks on graphs. As a corollary, we show that current machine learning methods that rely on Laplacians derived from random walks on hypergraphs with edge-independent vertex weights do not utilize higher-order relationships in the data. Finally, we demonstrate the advantages of hypergraphs with edge-dependent vertex weights on ranking applications using real-world datasets.
    Comment: Accepted to ICML 201
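    A minimal sketch of the two-stage random walk the abstract describes: from a vertex $v$, first pick an incident hyperedge $e$, then move to a vertex $u \in e$ with probability proportional to its edge-dependent weight $\gamma_e(u)$. The rule for choosing the hyperedge (proportional to a per-edge weight `omega[e]`) is an assumption for illustration; the function and variable names are hypothetical, not from the paper.

    ```python
    from collections import defaultdict

    def transition_probs(v, edges, gamma, omega):
        """One step of a two-stage hypergraph random walk: from vertex v,
        pick an incident hyperedge e (here with probability proportional
        to omega[e] -- an assumed selection rule), then move to a vertex
        u in e with probability proportional to gamma[e][u]."""
        incident = [e for e, members in edges.items() if v in members]
        z = sum(omega[e] for e in incident)          # normalizer for edge choice
        probs = defaultdict(float)
        for e in incident:
            p_edge = omega[e] / z                    # chance of walking along e
            g = sum(gamma[e][u] for u in edges[e])   # normalizer within e
            for u in edges[e]:
                probs[u] += p_edge * gamma[e][u] / g
        return dict(probs)

    # Toy hypergraph: two hyperedges sharing vertices "a" and "c".
    edges = {"e1": {"a", "b", "c"}, "e2": {"a", "c"}}
    gamma = {"e1": {"a": 1, "b": 2, "c": 1}, "e2": {"a": 1, "c": 3}}
    omega = {"e1": 1.0, "e2": 1.0}
    p = transition_probs("a", edges, gamma, omega)   # {"a": 0.25, "b": 0.25, "c": 0.5}
    ```

    Note that when every $\gamma_e(v)$ is independent of $e$, this walk collapses to a walk on a weighted graph, which is the degenerate case the paper's corollary identifies.
    
    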