Complex Logical Reasoning over Knowledge Graphs using Large Language Models
Reasoning over knowledge graphs (KGs) is a challenging task that requires a
deep understanding of the complex relationships between entities and the
underlying logic of their relations. Current approaches rely on learning
geometries to embed entities in vector space for logical query operations, but
they suffer from subpar performance on complex queries and dataset-specific
representations. In this paper, we propose a novel decoupled approach,
Language-guided Abstract Reasoning over Knowledge graphs (LARK), that
formulates complex KG reasoning as a combination of contextual KG search and
abstract logical query reasoning, to leverage the strengths of graph extraction
algorithms and large language models (LLMs), respectively. Our experiments
demonstrate that the proposed approach outperforms state-of-the-art KG
reasoning methods on standard benchmark datasets across several logical query
constructs, with significant performance gain for queries of higher complexity.
Furthermore, we show that the performance of our approach improves
proportionally to the increase in size of the underlying LLM, enabling the
integration of the latest advancements in LLMs for logical reasoning over KGs.
Our work presents a new direction for addressing the challenges of complex KG
reasoning and paves the way for future research in this area.
Comment: Code available at https://github.com/Akirato/LLM-KG-Reasonin
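The decoupled pipeline this abstract describes can be sketched in two stages: a contextual KG search that retrieves a query-relevant neighborhood, followed by handing the abstracted logical query plus that context to an LLM. This is a minimal illustration; the toy KG, entity names, prompt format, and hop-based retrieval are assumptions, not the paper's exact method.

```python
# Hedged sketch of a decoupled KG-reasoning pipeline: (1) graph extraction
# retrieves a local neighborhood, (2) the logical query is posed to an LLM
# over that serialized context. All names and the toy KG are illustrative.
from collections import defaultdict

def neighborhood(kg_edges, anchors, hops=1):
    """Collect triples within `hops` of the anchor entities (contextual KG search)."""
    adj = defaultdict(list)
    for h, r, t in kg_edges:
        adj[h].append((h, r, t))
        adj[t].append((h, r, t))
    frontier, seen = set(anchors), set()
    for _ in range(hops):
        nxt = set()
        for e in frontier:
            for h, r, t in adj[e]:
                seen.add((h, r, t))
                nxt.update((h, t))
        frontier = nxt - frontier
    return sorted(seen)

def build_prompt(context, logical_query):
    """Serialize the retrieved facts plus an abstract logical query for an LLM."""
    facts = "\n".join(f"{h} --{r}--> {t}" for h, r, t in context)
    return f"Facts:\n{facts}\n\nQuery: {logical_query}\nAnswer:"

# Toy KG and a two-step projection query.
edges = [("Einstein", "bornIn", "Ulm"), ("Ulm", "locatedIn", "Germany")]
prompt = build_prompt(neighborhood(edges, ["Einstein"], hops=2),
                      "?x : bornIn(Einstein, ?c) AND locatedIn(?c, ?x)")
```

The prompt string would then be passed to whichever LLM backs the reasoning stage, which is the piece that scales with model size.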
Graph-based Multi-ODE Neural Networks for Spatio-Temporal Traffic Forecasting
There is a recent surge in the development of spatio-temporal forecasting
models in the transportation domain. Long-range traffic forecasting, however,
remains a challenging task due to the intricate and extensive spatio-temporal
correlations observed in traffic networks. Current works primarily rely on road
networks with graph structures and learn representations using graph neural
networks (GNNs), but this approach suffers from the over-smoothing problem in deep
architectures. To tackle this problem, recent methods introduced the
combination of GNNs with residual connections or neural ordinary differential
equations (ODE). However, current graph ODE models face two key limitations in
feature extraction: (1) they lean towards global temporal patterns, overlooking
local patterns that are important for unexpected events; and (2) they lack
dynamic semantic edges in their architectural design. In this paper, we propose
a novel architecture called Graph-based Multi-ODE Neural Networks (GRAM-ODE)
which is designed with multiple connective ODE-GNN modules to learn better
representations by capturing different views of complex local and global
dynamic spatio-temporal dependencies. We also incorporate techniques such as
shared weights and divergence constraints into the intermediate layers of the
distinct ODE-GNN modules to further improve their communication on the
forecasting task. Our extensive experiments on six real-world datasets
demonstrate the superior performance of GRAM-ODE compared with state-of-the-art
baselines as well as the contribution of different components to the overall
performance. The code is available at https://github.com/zbliu98/GRAM-ODE
Comment: Published at Transactions on Machine Learning Research, 202
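The continuous-time message passing underlying a graph-ODE module, and the divergence constraint coupling two parallel modules, can be sketched as follows. This is a simplified numpy illustration under assumed forms (an Euler solver, dH/dt = A·H·W dynamics, and a mean-squared divergence penalty), not the GRAM-ODE implementation.

```python
# Minimal sketch of a graph-ODE layer: node features evolve as
# dH/dt = A @ H @ W (continuous-time message passing), and two parallel
# modules (e.g. global- vs local-pattern views) are coupled by a
# divergence penalty on their outputs. All sizes/forms are illustrative.
import numpy as np

def graph_ode_step(H, A, W, dt=0.1, steps=10):
    """Euler-integrate dH/dt = A @ H @ W for steps*dt time units."""
    for _ in range(steps):
        H = H + dt * (A @ H @ W)
    return H

rng = np.random.default_rng(0)
n, d = 4, 3
A = np.eye(n)                                  # toy normalized adjacency
H0 = rng.standard_normal((n, d))               # initial node features
W_global = rng.standard_normal((d, d)) * 0.1   # global-pattern module weights
W_local = rng.standard_normal((d, d)) * 0.1    # local-pattern module weights

H_g = graph_ode_step(H0, A, W_global)
H_l = graph_ode_step(H0, A, W_local)
# Divergence constraint: penalize the two modules' outputs drifting apart.
divergence = np.mean((H_g - H_l) ** 2)
```

In training, a term like `divergence` would be added to the forecasting loss so the modules stay in communication while still capturing different views.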
Hyperbolic Graph Neural Networks at Scale: A Meta Learning Approach
Progress in research on hyperbolic neural networks (HNNs) is hindered by
the absence of inductive bias mechanisms, which are essential for
generalizing to new tasks and facilitating scalable learning over large
datasets. In this paper, we aim to alleviate these issues by learning
generalizable inductive biases from the nodes' local subgraphs and transferring them
for faster learning over new subgraphs with a disjoint set of nodes, edges, and
labels in a few-shot setting. We introduce a novel method, Hyperbolic GRAph
Meta Learner (H-GRAM), that, for the tasks of node classification and link
prediction, learns transferable information from a set of support local
subgraphs in the form of hyperbolic meta gradients and label hyperbolic
protonets to enable faster learning over a query set of new tasks dealing with
disjoint subgraphs. Furthermore, we show that an extension of our meta-learning
framework also mitigates the scalability challenges that existing HNN
approaches face. Our comparative analysis shows that H-GRAM effectively
learns and transfers information in multiple challenging few-shot settings
compared to other state-of-the-art baselines. Additionally, we demonstrate
that, unlike standard HNNs, our approach is able to scale over large graph
datasets and improve performance over its Euclidean counterparts.
Comment: Accepted to NeurIPS 2023. 14 pages of main paper, 5 pages of
supplementary material
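The few-shot classification step built on hyperbolic label prototypes can be sketched as below. The Poincaré-ball distance is standard; forming each class prototype as a Euclidean mean clipped back inside the unit ball is a simplifying assumption for illustration, not the paper's exact construction.

```python
# Hedged sketch of few-shot node classification with hyperbolic class
# prototypes: embed support nodes on the Poincaré ball, form one prototype
# per class, and assign a query node to the nearest prototype by geodesic
# distance. Embeddings here are hand-picked toy values.
import numpy as np

def poincare_dist(u, v, eps=1e-9):
    """Geodesic distance on the Poincaré ball."""
    uu, vv = np.sum(u * u), np.sum(v * v)
    duv = np.sum((u - v) ** 2)
    x = 1 + 2 * duv / ((1 - uu) * (1 - vv) + eps)
    return np.arccosh(np.clip(x, 1.0, None))

def prototype(support):
    """Crude prototype: Euclidean mean, clipped to stay inside the unit ball."""
    p = np.mean(support, axis=0)
    norm = np.linalg.norm(p)
    return p / (norm + 1e-9) * min(norm, 0.9)

support = {  # class label -> support-node embeddings inside the ball
    0: np.array([[0.1, 0.0], [0.2, 0.1]]),
    1: np.array([[-0.3, -0.2], [-0.4, -0.1]]),
}
protos = {c: prototype(s) for c, s in support.items()}
query = np.array([0.15, 0.05])
pred = min(protos, key=lambda c: poincare_dist(query, protos[c]))
```

A meta-learner in this spirit would additionally carry gradients learned on the support subgraphs over to the disjoint query tasks, rather than classifying from prototypes alone.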