Generalization of graph network inferences in higher-order probabilistic graphical models
Probabilistic graphical models provide a powerful tool to describe complex
statistical structure, with many real-world applications in science and
engineering from controlling robotic arms to understanding neuronal
computations. A major challenge for these graphical models is that inferences
such as marginalization are intractable for general graphs. These inferences
are often approximated by a distributed message-passing algorithm such as
Belief Propagation, which does not always perform well on graphs with cycles,
nor can it always be easily specified for complex continuous probability
distributions. Such difficulties arise frequently in expressive graphical
models that include intractable higher-order interactions. In this paper we
construct iterative message-passing algorithms using Graph Neural Networks
defined on factor graphs to achieve fast approximate inference on graphical
models that involve many-variable interactions. Experimental results on several
families of graphical models demonstrate the out-of-distribution generalization
capability of our method to different sized graphs, and indicate the domain in
which our method gains advantage over Belief Propagation.
Comment: 9 pages, 2 figures
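To illustrate the idea (not the authors' actual architecture), one round of GNN-style message passing on a bipartite factor graph can be sketched as follows, where plain sums stand in for the trained MLP message and update functions; all names are hypothetical:

```python
def gnn_round(var_h, fac_h, edges):
    """One round of message passing on a bipartite factor graph.
    var_h / fac_h: node-id -> scalar feature; edges: (variable, factor) pairs.
    Plain sums stand in for the learned message/update networks."""
    # Variable -> factor messages, aggregated at each factor node.
    fac_in = {f: 0.0 for f in fac_h}
    for v, f in edges:
        fac_in[f] += var_h[v]
    new_fac = {f: fac_h[f] + fac_in[f] for f in fac_h}
    # Factor -> variable messages, aggregated at each variable node.
    var_in = {v: 0.0 for v in var_h}
    for v, f in edges:
        var_in[v] += new_fac[f]
    new_var = {v: var_h[v] + var_in[v] for v in var_h}
    return new_var, new_fac
```

Iterating such rounds propagates information between many-variable factors and their variables, mirroring the schedule of Belief Propagation while leaving the message functions learnable.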
H2TNE: Temporal Heterogeneous Information Network Embedding in Hyperbolic Spaces
Temporal heterogeneous information network (temporal HIN) embedding, which aims
to represent various types of nodes at different timestamps in low-dimensional
spaces while preserving structural and semantic information, is of vital
importance in diverse real-life tasks. Researchers have made great efforts on
temporal HIN embedding in Euclidean spaces and achieved considerable progress.
However, a fundamental conflict remains: many real-world networks exhibit
hierarchical structure and power-law distributions, and are therefore not
isometric to Euclidean space. Recently, representation learning in hyperbolic
spaces has proved effective for data with hierarchical and power-law structure.
Motivated by this property, we propose a hyperbolic heterogeneous temporal
network embedding (H2TNE) model for temporal HINs.
Specifically, we leverage a temporally and heterogeneously double-constrained
random walk strategy to capture the structural and semantic information, and
then calculate the embedding by exploiting hyperbolic distance in proximity
measurement. Experimental results show that our method has superior performance
on temporal link prediction and node classification compared with SOTA models.
Comment: arXiv admin note: text overlap with arXiv:1705.08039 by other authors
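The hyperbolic proximity measurement referred to above is commonly the Poincaré ball distance. A minimal sketch using the standard unit-ball closed form (an illustration of the general technique, not the paper's implementation):

```python
import math

def poincare_distance(u, v):
    """Geodesic distance between two points inside the unit Poincare ball:
    d(u, v) = arcosh(1 + 2|u-v|^2 / ((1-|u|^2)(1-|v|^2)))."""
    diff2 = sum((a - b) ** 2 for a, b in zip(u, v))
    nu2 = sum(a * a for a in u)
    nv2 = sum(b * b for b in v)
    return math.acosh(1 + 2 * diff2 / ((1 - nu2) * (1 - nv2)))
```

Because the denominator shrinks as points approach the boundary, distances grow exponentially there, which is what lets the ball embed hierarchies with low distortion.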
Poincaré ResNet
This paper introduces an end-to-end residual network that operates entirely
on the Poincaré ball model of hyperbolic space. Hyperbolic learning has
recently shown great potential for visual understanding, but is currently only
performed in the penultimate layer(s) of deep networks. All visual
representations are still learned through standard Euclidean networks. In this
paper we investigate how to learn hyperbolic representations of visual data
directly from the pixel-level. We propose Poincaré ResNet, a hyperbolic
counterpart of the celebrated residual network, starting from Poincaré 2D
convolutions up to Poincaré residual connections. We identify three
roadblocks for training convolutional networks entirely in hyperbolic space and
propose a solution for each: (i) Current hyperbolic network initializations
collapse to the origin, limiting their applicability in deeper networks. We
provide an identity-based initialization that preserves norms over many layers.
(ii) Residual networks rely heavily on batch normalization, which comes with
expensive Fréchet mean calculations in hyperbolic space. We introduce
Poincaré midpoint batch normalization as a faster and equally effective
alternative. (iii) Due to the many intermediate operations in Poincaré
layers, we lastly find that the computation graphs of deep learning libraries
blow up, limiting our ability to train on deep hyperbolic networks. We provide
manual backward derivations of core hyperbolic operations to maintain
manageable computation graphs.
Comment: International Conference on Computer Vision 2023
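The midpoint operation mentioned in (ii) is usually computed via the Klein model: map the batch from the Poincaré ball to the Klein ball, take a Lorentz-factor-weighted average, and map back. This is an illustrative sketch of that gyromidpoint under curvature -1, not the paper's batch-normalization code:

```python
import math

def poincare_midpoint(points):
    """Fast stand-in for the Frechet mean of points in the unit Poincare ball,
    computed through the Klein model."""
    # Poincare -> Klein: x_K = 2x / (1 + |x|^2)
    klein = []
    for p in points:
        n2 = sum(a * a for a in p)
        klein.append([2 * a / (1 + n2) for a in p])
    # Lorentz-factor (gamma) weighted average in the Klein model.
    gammas = [1.0 / math.sqrt(1 - sum(a * a for a in k)) for k in klein]
    total = sum(gammas)
    dim = len(points[0])
    mid_k = [sum(g * k[i] for g, k in zip(gammas, klein)) / total
             for i in range(dim)]
    # Klein -> Poincare: x_P = x / (1 + sqrt(1 - |x|^2))
    n2 = sum(a * a for a in mid_k)
    return [a / (1 + math.sqrt(1 - n2)) for a in mid_k]
```

The averaging step is a single pass over the batch, which is why this midpoint is so much cheaper than an iterative Fréchet-mean solver.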
Matching Biomedical Ontologies via a Hybrid Graph Attention Network
Biomedical ontologies have been used extensively to formally define and organize biomedical terminologies, and these ontologies are typically created manually by biomedical experts. With more biomedical ontologies being built independently, matching them to address the problem of heterogeneity and interoperability has become a critical challenge in many biomedical applications. Existing matching methods have mostly focused on capturing terminological, structural, and contextual semantic features in ontologies. However, these feature-engineering-based techniques are not only labor-intensive but also ignore the hidden semantic relations in ontologies. In this study, we propose an alternative biomedical ontology-matching framework, BioHAN, built on a hybrid graph attention network, which consists of three techniques. First, we propose an effective ontology-enriching method that refines and enriches the ontologies through axioms and external resources. Subsequently, we use hyperbolic graph attention layers to encode hierarchical concepts in a unified hyperbolic space. Finally, we aggregate the features of both direct and distant neighbors with a graph attention network. Experimental results on real-world biomedical ontologies demonstrate that BioHAN is competitive with state-of-the-art ontology matching methods.
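The neighbor-aggregation step of a graph attention layer, as used in the final stage above, can be shown in miniature. Dot-product scoring stands in here for the learned attention mechanism, and all names are hypothetical (this is not BioHAN's code):

```python
import math

def attention_aggregate(h_self, neighbor_feats):
    """Aggregate neighbor features with softmax attention weights.
    Dot-product scores stand in for the learned attention function."""
    scores = [sum(a * b for a, b in zip(h_self, n)) for n in neighbor_feats]
    m = max(scores)                      # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    weights = [e / z for e in exps]
    dim = len(h_self)
    return [sum(w * n[i] for w, n in zip(weights, neighbor_feats))
            for i in range(dim)]
```

Neighbors whose features align with the center node receive larger weights, so the aggregated vector is pulled toward semantically similar concepts.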
Discrete-time Temporal Network Embedding via Implicit Hierarchical Learning in Hyperbolic Space
Representation learning over temporal networks has drawn considerable
attention in recent years. Efforts are mainly focused on modeling structural
dependencies and temporal evolving regularities in Euclidean space which,
however, underestimates the inherent complex and hierarchical properties in
many real-world temporal networks, leading to sub-optimal embeddings. To
explore these properties of a complex temporal network, we propose a hyperbolic
temporal graph network (HTGN) that fully takes advantage of the exponential
capacity and hierarchical awareness of hyperbolic geometry. More specifically,
HTGN maps the temporal graph into hyperbolic space and incorporates a hyperbolic
graph neural network and a hyperbolic gated recurrent neural network to capture
the evolving behaviors and implicitly preserve hierarchical information
simultaneously. Furthermore, in the hyperbolic space, we propose two important
modules that enable HTGN to successfully model temporal networks: (1)
hyperbolic temporal contextual self-attention (HTA) module to attend to
historical states and (2) hyperbolic temporal consistency (HTC) module to
ensure stability and generalization. Experimental results on multiple
real-world datasets demonstrate the superiority of HTGN for temporal graph
embedding, as it consistently outperforms competing methods by significant
margins in various temporal link prediction tasks. Specifically, HTGN achieves
AUC improvement up to 9.98% for link prediction and 11.4% for new link
prediction. Moreover, the ablation study further validates the representational
ability of hyperbolic geometry and the effectiveness of the proposed HTA and
HTC modules.
Comment: KDD 2021
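Mapping graph features into hyperbolic space, as HTGN does, typically goes through the exponential and logarithmic maps at the origin of the Poincaré ball. A minimal sketch using the standard unit-ball (curvature -1) formulas; names are hypothetical and this is not the HTGN implementation:

```python
import math

def exp0(v):
    """Exponential map at the origin: tangent vector -> Poincare ball point."""
    n = math.sqrt(sum(a * a for a in v))
    if n == 0:
        return list(v)
    return [math.tanh(n) * a / n for a in v]

def log0(x):
    """Logarithmic map at the origin (inverse of exp0)."""
    n = math.sqrt(sum(a * a for a in x))
    if n == 0:
        return list(x)
    return [math.atanh(n) * a / n for a in x]
```

Euclidean network layers can then operate in the tangent space (after `log0`) while distances and attention are computed in the ball (after `exp0`).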
κHGCN: Tree-likeness Modeling via Continuous and Discrete Curvature Learning
The prevalence of tree-like structures, encompassing hierarchical structures
and power law distributions, exists extensively in real-world applications,
including recommendation systems, ecosystems, financial networks, social
networks, etc. Recently, the exploitation of hyperbolic space for tree-likeness
modeling has garnered considerable attention owing to its exponential growth
volume. Compared to the flat Euclidean space, the curved hyperbolic space
provides a more amenable embedding space, especially for datasets
exhibiting implicit tree-like architectures. However, the intricate nature of
real-world tree-like data presents a considerable challenge, as it frequently
displays a heterogeneous composition of tree-like, flat, and circular regions.
The direct embedding of such heterogeneous structures into a homogeneous
embedding space (i.e., hyperbolic space) inevitably leads to heavy distortions.
To mitigate this shortcoming, this study explores the curvature connecting the
discrete structure and the continuous learning space, aiming to encode the
message conveyed by the network topology in the learning process and thereby
improve tree-likeness modeling. To this end, we propose a curvature-aware
hyperbolic graph convolutional neural network, κHGCN, which utilizes the
curvature to guide message passing and improve long-range
propagation. Extensive experiments on node classification and link prediction
tasks verify the superiority of the proposal as it consistently outperforms
various competitive models by a large margin.
Comment: KDD 2023
Hyperbolic Translation-Based Sequential Recommendation
The goal of sequential recommendation algorithms is to predict personalized sequential behaviors of users (i.e., next-item recommendation). Learning representations of entities (i.e., users and items) from sparse interaction behaviors and capturing the relationships between entities are the main challenges for sequential recommendation. However, most sequential recommendation algorithms model relationships among entities in Euclidean space, where it is difficult to capture hierarchical relationships among entities. Moreover, most of them utilize independent components to model the user preferences and the sequential behaviors, ignoring the correlation between them. To simultaneously capture the hierarchical structural relationships and model the user preferences and the sequential behaviors in a unified framework, we propose a general hyperbolic translation-based sequential recommendation framework, namely HTSR. Specifically, we first measure the distance between entities in hyperbolic space. Then, we utilize personalized hyperbolic translation operations to model the third-order relationships among a user, his/her latest visited item, and the next item to consume. In addition, we instantiate two hyperbolic translation-based sequential recommendation models, namely Poincaré translation-based sequential recommendation (PoTSR) and Lorentzian translation-based sequential recommendation (LoTSR). PoTSR and LoTSR utilize the Poincaré distance and Lorentzian distance to measure similarities between entities, respectively. Moreover, we utilize the tangent space optimization method to determine optimal model parameters. Experimental results on five real-world datasets show that our proposed hyperbolic translation-based sequential recommendation methods outperform the state-of-the-art sequential recommendation algorithms.
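The translation idea above can be sketched with Möbius addition followed by the Poincaré distance: translate the latest item by a user-specific vector and score each candidate by negative distance. This uses the standard curvature -1 formulas and is a hypothetical simplification, not the actual HTSR model:

```python
import math

def mobius_add(u, v):
    """Mobius addition on the unit Poincare ball (curvature -1)."""
    uv = sum(a * b for a, b in zip(u, v))
    nu2 = sum(a * a for a in u)
    nv2 = sum(b * b for b in v)
    denom = 1 + 2 * uv + nu2 * nv2
    return [((1 + 2 * uv + nv2) * a + (1 - nu2) * b) / denom
            for a, b in zip(u, v)]

def poincare_distance(u, v):
    """Geodesic distance in the unit Poincare ball."""
    diff2 = sum((a - b) ** 2 for a, b in zip(u, v))
    nu2 = sum(a * a for a in u)
    nv2 = sum(b * b for b in v)
    return math.acosh(1 + 2 * diff2 / ((1 - nu2) * (1 - nv2)))

def score(user, last_item, cand_item):
    """Higher score = user-translated last item is closer to the candidate."""
    translated = mobius_add(last_item, user)  # user embedding as a translation
    return -poincare_distance(translated, cand_item)
```

Ranking candidates by this score realizes the "user + last item ≈ next item" third-order relationship in hyperbolic rather than Euclidean geometry.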
Hyperbolic Deep Neural Networks: A Survey
Recently, there has been a rising surge of momentum for deep representation
learning in hyperbolic spaces due to their high capacity for modeling data with
hierarchical structure, such as knowledge graphs or synonym hierarchies. We
refer to such a model as a hyperbolic deep neural network in this paper. Such a
hyperbolic neural architecture potentially leads to drastically more compact
models with much more physical interpretability than their counterparts in
Euclidean space. To stimulate future research, this paper presents a coherent
and comprehensive review of the literature around the neural components in the
construction of hyperbolic deep neural networks, as well as the generalization
of the leading deep approaches to hyperbolic space. It also presents current
applications around various machine learning tasks on several publicly
available datasets, together with insightful observations, and identifies open
questions and promising future directions.