16 research outputs found
Size-Aware Hypergraph Motifs
Complex systems frequently exhibit multi-way, rather than pairwise,
interactions. These group interactions cannot be faithfully modeled as
collections of pairwise interactions using graphs, and instead require
hypergraphs. However, methods that analyze hypergraphs directly, rather than
via lossy graph reductions, remain limited. Hypergraph motif mining holds
promise in this regard, as motif patterns serve as building blocks for larger
group interactions which are inexpressible by graphs. Recent work has focused
on categorizing and counting hypergraph motifs based on the existence of nodes
in hyperedge intersection regions. Here, we argue that the relative sizes of
hyperedge intersections within motifs contain varied and valuable information.
We propose a suite of efficient algorithms for finding triplets of hyperedges
based on optimizing the sizes of these intersection patterns. This formulation
uncovers interesting local patterns of interaction, finding hyperedge triplets
that either (1) are the least correlated with each other, (2) have the highest
pairwise but not groupwise correlation, or (3) are the most correlated with
each other. We formalize this as a combinatorial optimization problem and
design efficient algorithms based on filtering hyperedges. Our experimental
evaluation shows that the resulting hyperedge triplets yield insightful
information on real-world hypergraphs. Our approach is also orders of magnitude
faster than a naive baseline implementation.
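As a rough illustration of the intersection-size idea, a brute-force search for the "least correlated" pattern can be sketched as follows. This is an illustrative O(m^3) enumeration with hypothetical function names, not the paper's filtering-based algorithms, which are designed precisely to avoid this cost:

```python
from itertools import combinations

def triplet_profile(e1, e2, e3):
    """Sizes of the smallest pairwise intersection and of the three-way
    intersection of three hyperedges (given as sets of node ids)."""
    e1, e2, e3 = set(e1), set(e2), set(e3)
    pair = min(len(e1 & e2), len(e1 & e3), len(e2 & e3))
    triple = len(e1 & e2 & e3)
    return pair, triple

def most_disjoint_triplet(hyperedges):
    """Brute-force search for the hyperedge triplet whose LARGEST pairwise
    intersection is smallest -- one reading of 'least correlated'."""
    best, best_score = None, None
    for a, b, c in combinations(hyperedges, 3):
        sa, sb, sc = set(a), set(b), set(c)
        score = max(len(sa & sb), len(sa & sc), len(sb & sc))
        if best_score is None or score < best_score:
            best, best_score = (a, b, c), score
    return best, best_score
```

The other two patterns from the abstract would swap the objective: maximizing pairwise overlap while minimizing the three-way intersection, or maximizing the three-way intersection.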
HyperMagNet: A Magnetic Laplacian based Hypergraph Neural Network
In data science, hypergraphs are natural models for data exhibiting multi-way
relations, whereas graphs capture only pairwise relations. Nonetheless, many proposed
hypergraph neural networks effectively reduce hypergraphs to undirected graphs
via symmetrized matrix representations, potentially losing important
information. We propose an alternative approach to hypergraph neural networks
in which the hypergraph is represented as a non-reversible Markov chain. We use
this Markov chain to construct a complex Hermitian Laplacian matrix - the
magnetic Laplacian - which serves as the input to our proposed hypergraph
neural network. We study HyperMagNet for the task of node classification, and
demonstrate its effectiveness over graph-reduction based hypergraph neural
networks.
Comment: 9 pages, 1 figure
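For intuition, a magnetic Laplacian can be built from any asymmetric (e.g. non-reversible transition) matrix: edge magnitudes are symmetrized, while directional information is preserved in complex phases. The sketch below is the generic charge-parameter construction, not HyperMagNet's specific hypergraph-derived Markov chain:

```python
import numpy as np

def magnetic_laplacian(A, q=0.25):
    """Magnetic Laplacian of a (possibly asymmetric) weighted adjacency
    matrix A. Symmetrized weights carry magnitudes; the antisymmetric
    part of A is encoded as complex phases scaled by the charge q."""
    A = np.asarray(A, dtype=float)
    A_s = (A + A.T) / 2.0                 # symmetrized weights
    theta = 2.0 * np.pi * q * (A - A.T)   # antisymmetric phase matrix
    H = A_s * np.exp(1j * theta)          # Hermitian "magnetic" adjacency
    D = np.diag(A_s.sum(axis=1))          # degree matrix of A_s
    return D - H
```

Because H is Hermitian by construction, the resulting Laplacian has real eigenvalues, which is what makes it usable as a spectral operator in a neural network despite encoding directed structure.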
Fast Parallel Tensor Times Same Vector for Hypergraphs
Hypergraphs are a popular paradigm to represent complex real-world networks
exhibiting multi-way relationships of varying sizes. Mining centrality in
hypergraphs via symmetric adjacency tensors has only recently become
computationally feasible for large and complex datasets. To enable scalable
computation of these and related hypergraph analytics, here we focus on the
Sparse Symmetric Tensor Times Same Vector (STTVc) operation. We introduce
the Compound Compressed Sparse Symmetric (CCSS) format, an extension of the
compact CSS format to hypergraphs of varying hyperedge sizes, and present a
shared-memory parallel algorithm to compute STTVc. We experimentally show that
STTVc computation using the CCSS format achieves better performance than
the naive baseline, and consequently accelerates hypergraph
eigenvector centrality computation.
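To illustrate what a tensor-times-same-vector operation computes, here is a naive sequential edge-by-edge sketch for hypergraphs of varying hyperedge sizes. Edge weights and the permutation-count scaling that arises from symmetrizing the adjacency tensor are omitted, and neither the CCSS layout nor the parallelization is shown:

```python
import math

def ttsv1(hyperedges, x):
    """One step of (symmetric) tensor times same vector for a hypergraph:
    for each node v, sum over hyperedges containing v the product of x
    over the remaining nodes of that hyperedge. Hyperedges are iterables
    of distinct node indices into x."""
    b = [0.0] * len(x)
    for e in hyperedges:
        for v in e:
            # product over e \ {v}; O(|e|^2) per edge, fine for a sketch
            b[v] += math.prod(x[u] for u in e if u != v)
    return b
```

Repeatedly applying this map and normalizing is the power-iteration-style loop underlying tensor-based hypergraph eigenvector centrality, which is why a fast STTVc kernel matters.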