Laplacian Features for Learning with Hyperbolic Space
Due to its geometric properties, hyperbolic space can support high-fidelity
embeddings of tree- and graph-structured data. As a result, various hyperbolic
networks have been developed which outperform Euclidean networks on many tasks:
e.g. hyperbolic graph convolutional networks (GCN) can outperform vanilla GCN
on some graph learning tasks. However, most existing hyperbolic networks are
complicated, computationally expensive, and numerically unstable -- and they
cannot scale to large graphs due to these shortcomings. As more and more
hyperbolic networks are proposed, it is becoming less and less clear which
components are actually necessary for a model to perform well. In this paper, we propose
HyLa, a simple and minimal approach to using hyperbolic space in networks: HyLa
maps once from a hyperbolic-space embedding to Euclidean space via the
eigenfunctions of the Laplacian operator in the hyperbolic space. We evaluate
HyLa on graph learning tasks including node classification and text
classification, where HyLa can be used together with any graph neural network.
When used with a linear model, HyLa shows significant improvements over
hyperbolic networks and other baselines.
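The mapping HyLa describes admits a compact sketch. Below is a minimal illustration of the idea, assuming embeddings in the Poincaré ball and the standard Busemann-function form of hyperbolic Laplacian eigenfunctions; the sampling distributions and scales here are assumptions, not the paper's exact recipe.

```python
import numpy as np

def busemann(x, omega):
    """Busemann-type pairing <x, omega> on the Poincare ball:
    log((1 - |x|^2) / |x - omega|^2), with omega an ideal boundary point."""
    num = 1.0 - np.sum(x * x, axis=-1)
    den = np.sum((x - omega) ** 2, axis=-1)
    return np.log(num / den)

def hyla_features(X, n_features=64, lam_scale=1.0, seed=0):
    """Map Poincare-ball embeddings X of shape (n_points, d) to Euclidean
    features built from eigenfunctions of the hyperbolic Laplacian."""
    n, d = X.shape
    rng = np.random.default_rng(seed)
    omegas = rng.normal(size=(n_features, d))
    omegas /= np.linalg.norm(omegas, axis=1, keepdims=True)  # ideal points on the boundary sphere
    lams = rng.normal(scale=lam_scale, size=n_features)      # eigenvalue parameters (scale is an assumption)
    phases = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    rho = (d - 1) / 2.0
    B = np.stack([busemann(X, w) for w in omegas], axis=-1)  # (n_points, n_features)
    return np.exp(rho * B) * np.cos(lams * B + phases)
```

The resulting Euclidean feature matrix can then be fed to a linear model or any GNN, which is the "map once, then stay Euclidean" simplification the abstract emphasizes.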
Hyperbolic Graph Representation Learning: A Tutorial
Graph-structured data are widespread in real-world applications, such as
social networks, recommender systems, knowledge graphs, chemical molecules etc.
Despite the success of Euclidean space for graph-related learning tasks, its
ability to model complex patterns is essentially constrained by its
polynomially growing capacity. Recently, hyperbolic spaces have emerged as a
promising alternative for processing graph data with tree-like structure or
power-law distribution, owing to their exponential growth property. Unlike
Euclidean space, which expands polynomially, hyperbolic space grows
exponentially, which gives it a natural advantage in representing tree-like or
scale-free graphs with hierarchical organization.
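The growth contrast can be made precise with a standard fact from Riemannian geometry (not specific to this tutorial): the volume of a geodesic ball of radius r grows polynomially with r in Euclidean space but exponentially in hyperbolic space.

```latex
\mathrm{Vol}_{\mathbb{R}^n}(B_r) = \frac{\pi^{n/2}}{\Gamma(\frac{n}{2}+1)}\, r^{n},
\qquad
\mathrm{Vol}_{\mathbb{H}^n}(B_r) = \Omega_{n-1} \int_0^r \sinh^{n-1}(t)\, dt
\;\sim\; C_n\, e^{(n-1) r} \quad (r \to \infty),
```

where \Omega_{n-1} is the surface area of the unit (n-1)-sphere. This is why a tree with branching factor b, whose number of nodes grows like b^depth, can be embedded with low distortion in hyperbolic space of fixed dimension but not in Euclidean space.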
In this tutorial, we aim to give an introduction to this emerging field of
graph representation learning with the express purpose of being accessible to
all audiences. We first give a brief introduction to graph representation
learning as well as some preliminary Riemannian and hyperbolic geometry. We
then comprehensively revisit the hyperbolic embedding techniques, including
hyperbolic shallow models and hyperbolic neural networks. In addition, we
introduce the technical details of the current hyperbolic graph neural networks
by unifying them into a general framework and summarizing the variants of each
component. Moreover, we further introduce a series of related applications in a
variety of fields. In the last part, we discuss several advanced topics about
hyperbolic geometry for graph representation learning, which potentially serve
as guidelines for further flourishing the non-Euclidean graph learning
community.
Comment: Accepted as ECML-PKDD 2022 Tutorial
Discrete-time Temporal Network Embedding via Implicit Hierarchical Learning in Hyperbolic Space
Representation learning over temporal networks has drawn considerable
attention in recent years. Efforts are mainly focused on modeling structural
dependencies and temporal evolving regularities in Euclidean space which,
however, underestimates the inherently complex and hierarchical properties of
many real-world temporal networks, leading to sub-optimal embeddings. To
explore these properties of a complex temporal network, we propose a hyperbolic
temporal graph network (HTGN) that fully takes advantage of the exponential
capacity and hierarchical awareness of hyperbolic geometry. More specifically,
HTGN maps the temporal graph into hyperbolic space, and incorporates hyperbolic
graph neural network and hyperbolic gated recurrent neural network, to capture
the evolving behaviors and implicitly preserve hierarchical information
simultaneously. Furthermore, in the hyperbolic space, we propose two important
modules that enable HTGN to successfully model temporal networks: (1)
hyperbolic temporal contextual self-attention (HTA) module to attend to
historical states and (2) hyperbolic temporal consistency (HTC) module to
ensure stability and generalization. Experimental results on multiple
real-world datasets demonstrate the superiority of HTGN for temporal graph
embedding, as it consistently outperforms competing methods by significant
margins in various temporal link prediction tasks. Specifically, HTGN achieves
AUC improvement up to 9.98% for link prediction and 11.4% for new link
prediction. Moreover, the ablation study further validates the representational
ability of hyperbolic geometry and the effectiveness of the proposed HTA and
HTC modules.
Comment: KDD 2021
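The HTC idea of stabilizing consecutive states can be sketched compactly. A minimal reading, assuming hidden states live in the Poincaré ball: penalize the geodesic distance between successive snapshot representations. The exact loss in the paper may differ.

```python
import torch

def poincare_dist(u, v, eps=1e-6):
    """Geodesic distance in the Poincare ball (curvature -1):
    arcosh(1 + 2|u-v|^2 / ((1-|u|^2)(1-|v|^2)))."""
    sq = torch.sum((u - v) ** 2, dim=-1)
    uu = torch.clamp(1.0 - torch.sum(u * u, dim=-1), min=eps)
    vv = torch.clamp(1.0 - torch.sum(v * v, dim=-1), min=eps)
    return torch.acosh(torch.clamp(1.0 + 2.0 * sq / (uu * vv), min=1.0 + eps))

def htc_loss(h_prev, h_curr):
    """Temporal-consistency penalty: discourage large jumps between
    consecutive snapshot embeddings."""
    return poincare_dist(h_prev, h_curr).mean()
```

Added to the task loss with a small weight, such a term smooths the trajectory of node states across snapshots, which is the stability-and-generalization role the abstract attributes to HTC.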
Neural Embeddings of Graphs in Hyperbolic Space
Neural embeddings have been used with great success in Natural Language
Processing (NLP). They provide compact representations that encapsulate word
similarity and attain state-of-the-art performance in a range of linguistic
tasks. The success of neural embeddings has prompted significant amounts of
research into applications in domains other than language. One such domain is
graph-structured data, where embeddings of vertices can be learned that
encapsulate vertex similarity and improve performance on tasks including edge
prediction and vertex labelling. For both NLP and graph based tasks, embeddings
have been learned in high-dimensional Euclidean spaces. However, recent work
has shown that the appropriate isometric space for embedding complex networks
is not the flat Euclidean space, but negatively curved, hyperbolic space. We
present a new concept that exploits these recent insights and propose learning
neural embeddings of graphs in hyperbolic space. We provide experimental
evidence that embedding graphs in their natural geometry significantly improves
performance on downstream tasks for several real-world public datasets.Comment: 7 pages, 5 figure
A Unification Framework for Euclidean and Hyperbolic Graph Neural Networks
Hyperbolic neural networks can capture the inherent hierarchy of graph
datasets, and are consequently a powerful choice of GNN. However, they
entangle multiple incongruent (gyro-)vector spaces within a layer, which makes
them limited in terms of generalization and scalability. In this work, we
propose to use the Poincar\'e disk model as our search space and to apply all
approximations on the disk (as if the disk were a tangent space derived from
the origin), thereby eliminating all inter-space transformations. Such an
approach enables us to propose a hyperbolic normalization layer, and to further
simplify the entire hyperbolic model to a Euclidean model cascaded with our
hyperbolic normalization layer. We applied our proposed nonlinear hyperbolic
normalization to the current state-of-the-art homogeneous and multi-relational
graph networks. We demonstrate that the model not only leverages the strengths
of Euclidean networks, such as interpretability and efficient execution of
various model components, but also outperforms both Euclidean and hyperbolic
counterparts in our benchmarks.
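One plausible reading of such a cascade, offered as a hedged sketch: ordinary Euclidean layers followed by a normalization that maps activations into the Poincaré ball via the exponential map at the origin. The paper's actual normalization layer may be defined differently.

```python
import torch
import torch.nn as nn

class HyperbolicNorm(nn.Module):
    """Map Euclidean features into the Poincare ball (curvature -1) via the
    exponential map at the origin: exp_0(v) = tanh(|v|) * v / |v|."""
    def forward(self, x, eps=1e-6):
        norm = x.norm(dim=-1, keepdim=True).clamp_min(eps)
        return torch.tanh(norm) * x / norm

# a Euclidean model cascaded with the hyperbolic normalization layer
model = nn.Sequential(nn.Linear(16, 16), nn.ReLU(), nn.Linear(16, 8), HyperbolicNorm())
```

Because tanh maps norms into [0, 1), the output always lies strictly inside the ball, so downstream hyperbolic distances and losses stay well defined without any inter-space transformations inside the layers themselves.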