NQE: N-ary Query Embedding for Complex Query Answering over Hyper-relational Knowledge Graphs
Complex query answering (CQA) is an essential task for multi-hop and logical
reasoning on knowledge graphs (KGs). Currently, most approaches are limited to
queries among binary relational facts and pay less attention to n-ary facts
(n > 2) containing more than two entities, which are more prevalent in the real
world. Moreover, previous CQA methods can only make predictions for a few given
types of queries and cannot be flexibly extended to more complex logical
queries, which significantly limits their applications. To overcome these
challenges, in this work, we propose a novel N-ary Query Embedding (NQE) model
for CQA over hyper-relational knowledge graphs (HKGs), which include massive
n-ary facts. NQE utilizes a dual-heterogeneous Transformer encoder and
fuzzy logic theory to satisfy all n-ary FOL queries, including existential
quantifiers, conjunction, disjunction, and negation. We also propose a parallel
processing algorithm that can train or predict arbitrary n-ary FOL queries in a
single batch, regardless of the kind of each query, with good flexibility and
extensibility. In addition, we generate a new CQA dataset WD50K-NFOL, including
diverse n-ary FOL queries over WD50K. Experimental results on WD50K-NFOL and
other standard CQA datasets show that NQE is the state-of-the-art CQA method
over HKGs with good generalization capability. Our code and dataset are
publicly available. Comment: Accepted by the 37th AAAI Conference on Artificial
Intelligence (AAAI-2023).
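The fuzzy-logic handling of FOL connectives that the abstract mentions can be illustrated with a generic sketch: score each query as a fuzzy truth vector over entities and combine sub-queries with a t-norm and its dual t-conorm. This is a common construction in query-embedding CQA methods, not NQE's exact formulation; all names and values below are illustrative.

```python
# Fuzzy-logic connectives over entity "truth" vectors (a generic CQA sketch).
# p[i] in [0, 1] is the fuzzy truth that entity i answers the (sub-)query.

def fuzzy_and(p, q):
    """Conjunction via the product t-norm."""
    return [a * b for a, b in zip(p, q)]

def fuzzy_or(p, q):
    """Disjunction via the probabilistic sum (the dual t-conorm)."""
    return [a + b - a * b for a, b in zip(p, q)]

def fuzzy_not(p):
    """Negation via the standard complement."""
    return [1.0 - a for a in p]

# Toy example with three candidate entities:
p = [0.9, 0.2, 0.5]   # fuzzy answers to sub-query 1
q = [0.8, 0.7, 0.1]   # fuzzy answers to sub-query 2

conj = fuzzy_and(p, q)   # [0.72, 0.14, 0.05]
disj = fuzzy_or(p, q)    # [0.98, 0.76, 0.55]
neg = fuzzy_not(p)       # approx. [0.1, 0.8, 0.5] (up to float rounding)
```

Because all three operators act elementwise, arbitrarily nested FOL queries reduce to a sequence of vector operations, which is what makes batched processing of mixed query types straightforward.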
Shrinking Embeddings for Hyper-Relational Knowledge Graphs
Link prediction on knowledge graphs (KGs) has been extensively studied on
binary relational KGs, wherein each fact is represented by a triple. A
significant amount of important knowledge, however, is represented by
hyper-relational facts, where each fact is composed of a primal triple and a
set of qualifiers, each comprising a key-value pair, which allow for expressing
more complicated semantics. Although some recent works have proposed to embed
hyper-relational KGs, these methods fail to capture essential inference
patterns of hyper-relational facts such as qualifier monotonicity, qualifier
implication, and qualifier mutual exclusion, limiting their generalization
capability. To address this, we present \emph{ShrinkE}, a geometric
hyper-relational KG embedding method aiming to explicitly model these patterns.
ShrinkE models the primal triple as a spatial-functional transformation from
the head into a relation-specific box. Each qualifier ``shrinks'' the box to
narrow down the possible answer set and, thus, realizes qualifier monotonicity.
The spatial relationships between the qualifier boxes allow for modeling core
inference patterns of qualifiers such as implication and mutual exclusion.
Experimental results demonstrate ShrinkE's superiority on three benchmarks of
hyper-relational KGs. Comment: To appear in ACL 202
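The box-shrinking intuition can be made concrete with a toy example using axis-aligned boxes: a qualifier intersects ("shrinks") the answer box, so it can only narrow the answer set (monotonicity), and a disjoint qualifier box empties it (mutual exclusion). This is a hand-built 2-D illustration of the geometry; ShrinkE itself learns relation- and qualifier-specific boxes.

```python
# Toy axis-aligned boxes illustrating qualifier "shrinking".
# All coordinates are made up for illustration; ShrinkE learns them.

class Box:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi            # per-dimension lower/upper bounds

    def shrink(self, other):
        """Intersect with a qualifier box; the result can only get smaller."""
        lo = [max(a, b) for a, b in zip(self.lo, other.lo)]
        hi = [min(a, b) for a, b in zip(self.hi, other.hi)]
        return Box(lo, hi)

    def contains(self, point):
        return all(l <= p <= h for l, p, h in zip(self.lo, point, self.hi))

    def is_empty(self):
        return any(l > h for l, h in zip(self.lo, self.hi))

answer_box = Box([0.0, 0.0], [4.0, 4.0])     # box for the primal triple
q1 = Box([1.0, 0.0], [5.0, 3.0])             # a compatible qualifier
q2 = Box([6.0, 6.0], [7.0, 7.0])             # a qualifier disjoint from q1

shrunk = answer_box.shrink(q1)               # lo = [1, 0], hi = [4, 3]
# Monotonicity: anything inside the shrunk box was inside the original box.
assert shrunk.contains([2.0, 1.0]) and answer_box.contains([2.0, 1.0])
# Mutual exclusion: adding a disjoint qualifier empties the answer set.
assert shrunk.shrink(q2).is_empty()
```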
Exploring Link Prediction over Hyper-Relational Temporal Knowledge Graphs Enhanced with Time-Invariant Relational Knowledge
Stemming from traditional knowledge graphs (KGs), hyper-relational KGs (HKGs)
provide additional key-value pairs (i.e., qualifiers) for each KG fact that
help to better restrict the fact validity. In recent years, there has been an
increasing interest in studying graph reasoning over HKGs. In the meantime, due
to the ever-evolving nature of world knowledge, extensive parallel works have
been focusing on reasoning over temporal KGs (TKGs), where each TKG fact can be
viewed as a KG fact coupled with a timestamp (or time period) specifying its
time validity. The existing HKG reasoning approaches do not consider temporal
information because it is not explicitly specified in previous benchmark
datasets. Besides, all the previous TKG reasoning methods only lay emphasis on
temporal reasoning and have no way to learn from qualifiers. To this end, we
aim to fill the gap between TKG reasoning and HKG reasoning. We develop two new
benchmark hyper-relational TKG (HTKG) datasets, i.e., Wiki-hy and YAGO-hy, and
propose an HTKG reasoning model that efficiently models both temporal facts and
qualifiers. We further exploit additional time-invariant relational knowledge
from the Wikidata knowledge base and study its effectiveness in HTKG reasoning.
Time-invariant relational knowledge is knowledge that remains unchanged over
time (e.g., Sasha Obama is the child of Barack Obama), and it has
never been fully explored in previous TKG reasoning benchmarks and approaches.
Experimental results show that our model substantially outperforms previous
related methods on HTKG link prediction and can be enhanced by jointly
leveraging both temporal and time-invariant relational knowledge.
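The data model the abstract describes, a KG fact plus a validity timestamp plus key-value qualifiers, can be sketched as a small record type. The field names below are illustrative and do not reflect the Wiki-hy/YAGO-hy file formats; a missing timestamp marks the time-invariant knowledge the paper additionally exploits.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

# A minimal sketch of a hyper-relational temporal KG (HTKG) fact:
# a primal triple, an optional validity timestamp, and key-value qualifiers.
@dataclass(frozen=True)
class HTKGFact:
    head: str
    relation: str
    tail: str
    timestamp: Optional[str] = None   # None marks time-invariant knowledge
    qualifiers: Tuple = ()            # ((key, value), ...) qualifier pairs

# A temporal fact with a qualifier (values illustrative):
f1 = HTKGFact("Barack Obama", "position held", "President of the United States",
              timestamp="2009", qualifiers=(("replaces", "George W. Bush"),))

# Time-invariant relational knowledge (the abstract's own example):
f2 = HTKGFact("Sasha Obama", "child of", "Barack Obama")
```

Separating the two kinds of facts at the representation level is what lets a reasoning model score them jointly while treating timestamps as optional rather than mandatory.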
Text2NKG: Fine-Grained N-ary Relation Extraction for N-ary relational Knowledge Graph Construction
Beyond traditional binary relational facts, n-ary relational knowledge graphs
(NKGs) consist of n-ary relational facts containing more than two
entities, which are closer to real-world facts with broader applications.
However, the construction of NKGs still significantly relies on manual labor,
and n-ary relation extraction remains at a coarse-grained level, typically
confined to a single schema and a fixed arity of entities. To address these
restrictions, we propose Text2NKG, a novel fine-grained n-ary relation
extraction framework for n-ary relational knowledge graph construction. We
introduce a span-tuple classification approach with hetero-ordered merging to
accomplish fine-grained n-ary relation extraction at different arities.
Furthermore, Text2NKG supports four typical NKG schemas: hyper-relational
schema, event-based schema, role-based schema, and hypergraph-based schema,
with high flexibility and practicality. Experimental results demonstrate that
Text2NKG outperforms the previous state-of-the-art model by nearly 20
percentage points on the fine-grained n-ary relation extraction benchmark under
the hyper-relational schema. Our code and datasets are publicly available.
Comment: Preprint.
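The four NKG schemas the abstract names can be illustrated on a single n-ary fact, "Marie Curie received the Nobel Prize in Physics in 1903". The serializations below are a hedged sketch of each schema's shape, not Text2NKG's internal format; every key and identifier is made up for illustration.

```python
# One n-ary fact rendered in the four NKG schemas (illustrative encodings).

# 1) Hyper-relational schema: a primal triple plus key-value qualifiers.
hyper_relational = (("Marie Curie", "received", "Nobel Prize in Physics"),
                    {"point in time": "1903"})

# 2) Event-based schema: a reified event node linked to every argument.
event_based = {"event": "award_event_1",
               "triples": [("award_event_1", "agent", "Marie Curie"),
                           ("award_event_1", "award", "Nobel Prize in Physics"),
                           ("award_event_1", "time", "1903")]}

# 3) Role-based schema: one relation with named role-value pairs.
role_based = ("award_received", {"recipient": "Marie Curie",
                                 "prize": "Nobel Prize in Physics",
                                 "time": "1903"})

# 4) Hypergraph-based schema: a single hyperedge over all entities.
hypergraph_based = ("received",
                    ("Marie Curie", "Nobel Prize in Physics", "1903"))
```

All four carry the same information; they differ in which element (a triple, an event node, a role frame, or a hyperedge) anchors the extra arguments, which is why an extraction framework benefits from emitting any of them.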