    Hypergraph Learning with Line Expansion

    Previous hypergraph expansions are carried out solely on either the vertex level or the hyperedge level, thereby missing the symmetric nature of data co-occurrence and resulting in information loss. To address this problem, this paper treats vertices and hyperedges equally and proposes a new hypergraph formulation named the line expansion (LE) for hypergraph learning. The new expansion bijectively induces a homogeneous structure from the hypergraph by treating vertex-hyperedge pairs as "line nodes". By reducing the hypergraph to a simple graph, the proposed line expansion makes existing graph learning algorithms compatible with the higher-order structure and is proven to be a unifying framework for various hypergraph expansions. We evaluate the proposed line expansion on five hypergraph datasets; the results show that our method beats SOTA baselines by a significant margin.
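
    A minimal sketch of the line-expansion construction described above, under the assumption that two line nodes are adjacent exactly when they share either the vertex or the hyperedge; the function name and toy hypergraph are illustrative, not the authors' reference code:

        from itertools import combinations

        import networkx as nx

        def line_expansion(hyperedges):
            """Map a hypergraph to its line-expansion graph.

            Each incidence pair (vertex, hyperedge index) becomes a "line node";
            two line nodes are connected when they share either the vertex or
            the hyperedge. Sketch based on the abstract, not the paper's code.
            """
            line_nodes = [(v, i) for i, e in enumerate(hyperedges) for v in e]
            g = nx.Graph()
            g.add_nodes_from(line_nodes)
            for (v1, e1), (v2, e2) in combinations(line_nodes, 2):
                if v1 == v2 or e1 == e2:  # shared vertex or shared hyperedge
                    g.add_edge((v1, e1), (v2, e2))
            return g

        # Toy hypergraph: each hyperedge is a set of vertices.
        H = [{"a", "b", "c"}, {"b", "d"}, {"c", "d", "e"}]
        LE = line_expansion(H)
        print(LE.number_of_nodes(), LE.number_of_edges())

    Existing graph learning algorithms can then be run directly on the resulting simple graph, which is how the expansion makes them compatible with the higher-order structure.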

    Hypergraphs with Edge-Dependent Vertex Weights: p-Laplacians and Spectral Clustering

    We study p-Laplacians and spectral clustering for a recently proposed hypergraph model that incorporates edge-dependent vertex weights (EDVW). These weights can reflect the differing importance of vertices within a hyperedge, thus giving the hypergraph model higher expressivity and flexibility. By constructing submodular EDVW-based splitting functions, we convert hypergraphs with EDVW into submodular hypergraphs, for which the spectral theory is better developed. In this way, existing concepts and theorems such as p-Laplacians and Cheeger inequalities proposed under the submodular hypergraph setting can be directly extended to hypergraphs with EDVW. For submodular hypergraphs with EDVW-based splitting functions, we propose an efficient algorithm to compute the eigenvector associated with the second smallest eigenvalue of the hypergraph 1-Laplacian. We then utilize this eigenvector to cluster the vertices, achieving higher clustering accuracy than traditional spectral clustering based on the 2-Laplacian. More broadly, the proposed algorithm works for all submodular hypergraphs that are graph reducible. Numerical experiments on real-world data demonstrate the effectiveness of combining spectral clustering based on the 1-Laplacian with EDVW.
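
    For contrast with the proposed 1-Laplacian method, the sketch below shows the traditional 2-Laplacian baseline the abstract refers to, applied after reducing an EDVW hypergraph to a weighted graph. The product-of-weights reduction and all names here are illustrative assumptions, not the paper's splitting functions or its 1-Laplacian algorithm:

        import numpy as np
        from sklearn.cluster import KMeans

        def edvw_clique_reduction(n_vertices, hyperedges):
            """Reduce an EDVW hypergraph to a weighted graph.

            `hyperedges` is a list of dicts {vertex: gamma_e(vertex)} giving the
            edge-dependent vertex weights. The product-of-weights rule below is
            one simple illustrative reduction, not the paper's submodular
            splitting functions.
            """
            W = np.zeros((n_vertices, n_vertices))
            for gamma in hyperedges:
                verts = list(gamma)
                for i, u in enumerate(verts):
                    for v in verts[i + 1:]:
                        W[u, v] += gamma[u] * gamma[v]
                        W[v, u] = W[u, v]
            return W

        def spectral_clustering_2laplacian(W, k=2):
            """Classical spectral clustering with L = D - W (the 2-Laplacian)."""
            L = np.diag(W.sum(axis=1)) - W
            _, vecs = np.linalg.eigh(L)      # eigenvalues in ascending order
            embedding = vecs[:, :k]          # k smallest eigenvectors
            return KMeans(n_clusters=k, n_init=10).fit_predict(embedding)

        # Toy EDVW hypergraph on 5 vertices; weights are made up for illustration.
        hyperedges = [{0: 1.0, 1: 2.0, 2: 1.0}, {2: 1.0, 3: 1.5}, {3: 1.0, 4: 1.0}]
        labels = spectral_clustering_2laplacian(edvw_clique_reduction(5, hyperedges))
        print(labels)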

    Grassmann Integral Representation for Spanning Hyperforests

    Given a hypergraph G, we introduce a Grassmann algebra over the vertex set and show that a class of Grassmann integrals permits an expansion in terms of spanning hyperforests. Special cases provide the generating functions for rooted and unrooted spanning (hyper)forests and spanning (hyper)trees. All these results are generalizations of Kirchhoff's matrix-tree theorem. Furthermore, we show that the class of integrals describing unrooted spanning (hyper)forests is induced by a theory with an underlying OSP(1|2) supersymmetry. Comment: 50 pages, uses some LaTeX macros. Accepted for publication in J. Phys.
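
    For reference, the classical weighted matrix-tree theorem that these Grassmann-integral formulas generalize can be stated for a weighted graph G as follows (the notation is chosen here for illustration):

        % Kirchhoff's weighted matrix-tree theorem: the sum over spanning trees
        % equals any principal cofactor of the weighted graph Laplacian.
        \[
          \sum_{T \in \mathcal{T}(G)} \prod_{e \in T} w_e
          \;=\;
          \det\!\big( L \big|_{\hat{\imath}\hat{\imath}} \big),
          \qquad
          L_{uv} =
          \begin{cases}
            \sum_{e \ni u} w_e, & u = v,\\
            -w_{\{u,v\}}, & u \neq v,
          \end{cases}
        \]
        % where \mathcal{T}(G) is the set of spanning trees of G and
        % L|_{\hat\imath\hat\imath} is the Laplacian with row and column i deleted.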

    Community Detection in Hypergraphs with Application to Partitioning
