
    Equivariant Hypergraph Diffusion Neural Operators

    Hypergraph neural networks (HNNs), which use neural networks to encode hypergraphs, provide a promising way to model higher-order relations in data and to solve prediction tasks built upon such relations. However, higher-order relations in practice contain complex patterns and are often highly irregular, so it is challenging to design an HNN that is expressive enough to capture those relations while remaining computationally efficient. Inspired by hypergraph diffusion algorithms, this work proposes a new HNN architecture named ED-HNN, which provably represents any continuous equivariant hypergraph diffusion operator, a class that can model a wide range of higher-order relations. ED-HNN can be implemented efficiently by combining star expansions of hypergraphs with standard message passing neural networks. ED-HNN also shows clear advantages in processing heterophilic hypergraphs and in constructing deep models. We evaluate ED-HNN for node classification on nine real-world hypergraph datasets. ED-HNN uniformly outperforms the best baselines over these nine datasets and achieves a gain of more than 2% in prediction accuracy on four of them. Comment: Code: https://github.com/Graph-COM/ED-HN
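    As a rough illustration of the star-expansion implementation mentioned in the abstract, the sketch below runs one node-to-hyperedge-to-node message-passing step in plain PyTorch. The layer name, MLP shapes, and residual update are assumptions made for illustration only; they are not the authors' actual ED-HNN architecture (see the linked repository for that).

```python
# Minimal sketch: message passing over the star expansion of a hypergraph,
# i.e. a bipartite graph between nodes and hyperedges. All names below
# (StarExpansionLayer, the MLPs) are illustrative, not the authors' API.
import torch
import torch.nn as nn

class StarExpansionLayer(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.node_to_edge = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))
        self.edge_to_node = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, x, node_idx, edge_idx, num_edges):
        # x: (num_nodes, dim) node features.
        # node_idx, edge_idx: incidence pairs, node_idx[k] belongs to hyperedge edge_idx[k].
        msg = self.node_to_edge(x)[node_idx]                     # per-incidence node messages
        edge_feat = torch.zeros(num_edges, x.size(1), device=x.device)
        edge_feat.index_add_(0, edge_idx, msg)                   # aggregate messages into hyperedges
        back = edge_feat[edge_idx]                               # broadcast hyperedge states back
        upd = torch.zeros_like(x)
        upd.index_add_(0, node_idx,
                       self.edge_to_node(torch.cat([x[node_idx], back], dim=-1)))
        return x + upd                                           # residual, diffusion-style update

# Toy usage: 4 nodes, 2 hyperedges {0,1,2} and {2,3}.
layer = StarExpansionLayer(dim=8)
x = torch.randn(4, 8)
node_idx = torch.tensor([0, 1, 2, 2, 3])
edge_idx = torch.tensor([0, 0, 0, 1, 1])
out = layer(x, node_idx, edge_idx, num_edges=2)
```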

    Learning on graphs with high-order relations: spectral methods, optimization and applications

    Learning on graphs is an important problem in machine learning, computer vision, and data mining. Traditional algorithms for learning on graphs primarily take into account only low-order connectivity patterns described at the level of individual vertices and edges. In many applications, however, high-order relations among vertices are necessary to properly model a real-life problem, and in contrast to the low-order case, in-depth algorithmic and analytic studies supporting such high-order relations are still lacking. To address this problem, we introduce a new family of mathematical models, termed inhomogeneous hypergraphs, which captures high-order relations among vertices in an extensive and flexible way. Specifically, as opposed to classic hypergraphs that treat all vertices within a high-order structure uniformly, inhomogeneous hypergraphs allow one to model the fact that different subsets of vertices within a high-order relation may have different structural importance. We propose a series of algorithms and relevant analytic results for this new model. First, after introducing the formal definitions and some preliminaries, we propose clustering algorithms over inhomogeneous hypergraphs. The first clustering method is based on projection: we use graphs with pairwise relations to approximate the high-order relations and then directly apply spectral clustering to the resulting graphs. For this type of method, we provide a provable performance guarantee that holds for a sub-class of inhomogeneous hypergraphs imposing additional constraints on the internal structure of high-order relations. Since these constraints are related to submodular functions, we term this sub-class submodular hypergraphs. We then study Laplacian operators for these hypergraphs and generalize many important results in spectral theory to this setting, including Cheeger inequalities and discrete nodal domain theorems. Based on these new results, we develop clustering algorithms with tighter approximation guarantees than the projection methods. Second, we propose optimization algorithms for inhomogeneous hypergraphs. We first show that min-cut problems over submodular hypergraphs are closely related to an extensively studied optimization problem termed decomposable submodular function minimization (DSFM), and our contribution is to leverage hypergraph structure to accelerate canonical solvers for DSFM problems. We then connect PageRank approaches to submodular hypergraphs and propose a new optimization problem termed quadratic decomposable submodular function minimization (QDSFM). For this new problem, we propose algorithms with the first provable linear convergence guarantees and identify new relevant applications.
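    The projection method described in the abstract can be sketched in a few lines: approximate each hyperedge by weighted pairwise edges (a clique-style expansion) and then apply off-the-shelf spectral clustering to the projected graph. The helper name and the uniform pair_weight default below are assumptions for illustration; an inhomogeneous hypergraph would replace pair_weight with a non-uniform (e.g. submodular) cost that reflects the structural importance of vertex subsets.

```python
# Minimal sketch: project a hypergraph onto a weighted pairwise graph,
# then run standard spectral clustering on the projected affinity matrix.
import numpy as np
from sklearn.cluster import SpectralClustering

def project_hypergraph(num_nodes, hyperedges,
                       pair_weight=lambda e, u, v: 1.0 / (len(e) - 1)):
    """Build a pairwise affinity matrix from hyperedges.

    hyperedges: list of tuples of vertex indices, e.g. [(0, 1, 2), (2, 3)].
    pair_weight: weight given to the pair (u, v) inside hyperedge e; the
    uniform default is a placeholder for an inhomogeneous weighting.
    """
    A = np.zeros((num_nodes, num_nodes))
    for e in hyperedges:
        for i, u in enumerate(e):
            for v in e[i + 1:]:
                w = pair_weight(e, u, v)
                A[u, v] += w
                A[v, u] += w
    return A

# Toy usage: two loosely connected groups of vertices.
edges = [(0, 1, 2), (1, 2, 3), (3, 4), (4, 5, 6), (5, 6, 7)]
A = project_hypergraph(8, edges)
labels = SpectralClustering(n_clusters=2, affinity="precomputed",
                            random_state=0).fit_predict(A)
print(labels)
```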