Hypergraph Neural Networks
In this paper, we present a hypergraph neural network (HGNN) framework for
data representation learning, which can encode high-order data correlation in a
hypergraph structure. Confronting the challenges of learning representations for
complex real-world data, we propose to model such data with a hypergraph,
which offers more flexible data modeling, especially when dealing
with complex data. In this method, a hyperedge convolution operation is
designed to handle the data correlation during representation learning. In this
way, the traditional hypergraph learning procedure can be carried out efficiently
using hyperedge convolution operations. HGNN learns hidden-layer
representations that account for the high-order data structure, providing a
general framework for modeling complex data correlations. We have conducted
experiments on citation network classification and visual object recognition
tasks and compared HGNN with graph convolutional networks and other traditional
methods. Experimental results demonstrate that the proposed HGNN method
outperforms recent state-of-the-art methods. The results also show that the
proposed HGNN is superior to existing methods when dealing with multi-modal
data.

Comment: Accepted in AAAI'201
Stability and Generalization of Hypergraph Collaborative Networks
Graph neural networks have been shown to be very effective in utilizing
pairwise relationships across samples. Recently, there have been several
successful proposals to generalize graph neural networks to hypergraph neural
networks to exploit more complex relationships. In particular, the hypergraph
collaborative networks yield superior results compared to other hypergraph
neural networks for various semi-supervised learning tasks. The collaborative
network can provide high-quality vertex embeddings and hyperedge embeddings
together by formulating them as a joint optimization problem and by using their
consistency in reconstructing the given hypergraph. In this paper, we aim to
establish the algorithmic stability of the core layer of the collaborative
network and provide generalization guarantees. The analysis sheds light on the
design of hypergraph filters in collaborative networks, for instance, how the
data and hypergraph filters should be scaled to achieve uniform stability of
the learning process. Some experimental results on real-world datasets are
presented to illustrate the theory.
HyperMagNet: A Magnetic Laplacian based Hypergraph Neural Network
In data science, hypergraphs are natural models for data exhibiting multi-way
relations, whereas graphs capture only pairwise relations. Nonetheless, many proposed
hypergraph neural networks effectively reduce hypergraphs to undirected graphs
via symmetrized matrix representations, potentially losing important
information. We propose an alternative approach to hypergraph neural networks
in which the hypergraph is represented as a non-reversible Markov chain. We use
this Markov chain to construct a complex Hermitian Laplacian matrix - the
magnetic Laplacian - which serves as the input to our proposed hypergraph
neural network. We study HyperMagNet on the task of node classification and
demonstrate its effectiveness over graph-reduction based hypergraph neural
networks.

Comment: 9 pages, 1 figure
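HyperMagNet derives its Laplacian from a non-reversible Markov chain on the hypergraph; as a simplified sketch of the building block named in the abstract, the following constructs the magnetic Laplacian of a directed adjacency matrix. The charge parameter q = 0.25 and the toy 3-cycle are illustrative assumptions.

```python
import numpy as np

def magnetic_laplacian(A, q=0.25):
    """Magnetic Laplacian of a directed (weighted) adjacency matrix A.
    The symmetrized weights A_s record that an edge exists; the complex
    phase exp(i*Theta) records its direction. The result is Hermitian and
    positive semidefinite, so it has a real spectrum suitable for a
    spectral neural-network layer."""
    A_s = 0.5 * (A + A.T)                    # symmetrized weights
    Theta = 2.0 * np.pi * q * (A - A.T)      # antisymmetric phase matrix
    H = A_s * np.exp(1j * Theta)             # Hermitian "adjacency"
    D = np.diag(A_s.sum(axis=1))             # degree matrix of A_s
    return D - H

# Hypothetical toy digraph: directed 3-cycle 0 -> 1 -> 2 -> 0.
A = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]], dtype=float)
L = magnetic_laplacian(A, q=0.25)
print(np.allclose(L, L.conj().T))              # Hermitian: True
print(np.linalg.eigvalsh(L).min() >= -1e-9)    # positive semidefinite: True
```

HyperMagNet applies this construction to the transition matrix of a Markov chain built from the hypergraph rather than to a raw adjacency matrix; that step is omitted here.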
Directed hypergraph neural network
To deal with irregular data structures, graph convolutional neural networks have
been developed by many data scientists. However, these efforts have
concentrated primarily on deep neural network methods for undirected
graphs. In this paper, we present a novel neural network method for
directed hypergraphs. In other words, we develop not only a novel
directed hypergraph neural network method but also a novel directed
hypergraph based semi-supervised learning method. These methods are employed to
solve the node classification task. The two datasets used in the
experiments are the Cora and CiteSeer datasets. Among the classic directed
graph based semi-supervised learning method, the novel directed hypergraph
based semi-supervised learning method, and the novel directed hypergraph neural
network method used to solve this node classification task, we
find that the novel directed hypergraph neural network achieves the
highest accuracy.
Hypergraph Learning with Line Expansion
Previous hypergraph expansions are solely carried out on either vertex level
or hyperedge level, thereby missing the symmetric nature of data co-occurrence,
and resulting in information loss. To address the problem, this paper treats
vertices and hyperedges equally and proposes a new hypergraph formulation named
the \emph{line expansion (LE)} for hypergraph learning. The new expansion
bijectively induces a homogeneous structure from the hypergraph by treating
vertex-hyperedge pairs as "line nodes". By reducing the hypergraph to a simple
graph, the proposed \emph{line expansion} makes existing graph learning
algorithms compatible with the higher-order structure and has been proven as a
unifying framework for various hypergraph expansions. We evaluate the proposed
line expansion on five hypergraph datasets; the results show that our method
beats SOTA baselines by a significant margin.
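The construction described above (vertex-hyperedge incidence pairs as "line nodes") can be sketched directly. This is a minimal illustration of the expansion itself, not the full learning pipeline; the toy hypergraph is hypothetical.

```python
from itertools import combinations

def line_expansion(hyperedges):
    """Line expansion of a hypergraph: each incidence pair (vertex, hyperedge)
    becomes a 'line node'; two line nodes are adjacent iff they share either
    the vertex or the hyperedge. Returns (line_nodes, edge_index_pairs)."""
    line_nodes = [(v, e) for e, members in enumerate(hyperedges)
                  for v in sorted(members)]
    edges = [(a, b) for a, b in combinations(range(len(line_nodes)), 2)
             if line_nodes[a][0] == line_nodes[b][0]      # same vertex
             or line_nodes[a][1] == line_nodes[b][1]]     # same hyperedge
    return line_nodes, edges

# Hypothetical toy hypergraph: e0 = {0, 1, 2}, e1 = {1, 3}.
nodes, edges = line_expansion([{0, 1, 2}, {1, 3}])
print(len(nodes))   # 5 incidence pairs
print(len(edges))   # 5 adjacencies in the induced simple graph
```

Any off-the-shelf graph learning algorithm can then be run on the resulting simple graph, which is the compatibility property the abstract claims.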
Equivariant Hypergraph Diffusion Neural Operators
Hypergraph neural networks (HNNs), which use neural networks to encode
hypergraphs, provide a promising way to model higher-order relations in data
and further solve relevant prediction tasks built upon such higher-order relations.
However, higher-order relations in practice contain complex patterns and are
often highly irregular. So, it is often challenging to design an HNN that
suffices to express those relations while keeping computational efficiency.
Inspired by hypergraph diffusion algorithms, this work proposes a new HNN
architecture named ED-HNN, which provably represents any continuous equivariant
hypergraph diffusion operators that can model a wide range of higher-order
relations. ED-HNN can be implemented efficiently by combining star expansions
of hypergraphs with standard message passing neural networks. ED-HNN further
shows great superiority in processing heterophilic hypergraphs and constructing
deep models. We evaluate ED-HNN for node classification on nine real-world
hypergraph datasets. ED-HNN uniformly outperforms the best baselines across
these nine datasets and improves prediction accuracy by more than 2% on four
of them.

Comment: Code: https://github.com/Graph-COM/ED-HN
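The abstract says ED-HNN combines star expansions with standard message passing. As a hedged illustration of that idea only, the following sketches one bipartite message-passing round over the star expansion (vertex -> hyperedge -> vertex); the weight matrices W1/W2 stand in for the learned equivariant MLPs of the actual architecture, and all data is hypothetical.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def star_expansion_layer(X, H, W1, W2):
    """One bipartite message-passing round on the star expansion:
    vertex features are mean-pooled into each hyperedge, transformed,
    mean-pooled back to the vertices, and combined with the input.
    H is the vertex-by-hyperedge incidence matrix."""
    de_inv = 1.0 / H.sum(axis=0)            # 1 / hyperedge sizes
    dv_inv = 1.0 / H.sum(axis=1)            # 1 / vertex degrees
    E = (H.T @ X) * de_inv[:, None]         # vertex -> hyperedge mean
    M = relu(E @ W1)                        # hyperedge update
    V = (H @ M) * dv_inv[:, None]           # hyperedge -> vertex mean
    return relu(np.concatenate([X, V], axis=1) @ W2)

# Hypothetical toy hypergraph: 4 vertices, 2 hyperedges, 3-dim features.
H = np.array([[1, 0], [1, 1], [0, 1], [1, 1]], dtype=float)
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))
W1 = rng.normal(size=(3, 3))
W2 = rng.normal(size=(6, 3))
out = star_expansion_layer(X, H, W1, W2)
print(out.shape)                            # (4, 3)
```

The mean-pooling in both directions is permutation-equivariant over vertices, which is the property the diffusion-operator view of ED-HNN builds on; the paper's actual operator family is richer than this single round.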