Topology-Aware Correlations Between Relations for Inductive Link Prediction in Knowledge Graphs
Inductive link prediction -- where entities during training and inference
stages can be different -- has been shown to be promising for completing
continuously evolving knowledge graphs. Existing models of inductive reasoning
mainly focus on predicting missing links by learning logical rules. However,
most of these approaches overlook the semantic correlations between
relations that are common in real-world knowledge graphs. To address
this challenge, we propose a novel inductive reasoning approach, namely TACT,
which can effectively exploit Topology-Aware CorrelaTions between relations in
an entity-independent manner. TACT is inspired by the observation that the
semantic correlation between two relations is highly correlated to their
topological structure in knowledge graphs. Specifically, we categorize all
relation pairs into several topological patterns, and then propose a Relational
Correlation Network (RCN) to learn the importance of the different patterns for
inductive link prediction. Experiments demonstrate that TACT can effectively
model semantic correlations between relations, and significantly outperforms
existing state-of-the-art methods on benchmark datasets for the inductive link
prediction task.
Comment: Accepted to AAAI 202
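The idea of categorizing relation pairs by their topological connection can be illustrated with a minimal sketch. The pattern names and the pairwise check below are simplified assumptions for illustration, not TACT's actual taxonomy:

```python
def topological_patterns(pairs_r1, pairs_r2):
    """Classify how two relations connect, given the (head, tail)
    entity pairs of their triples. Returns the set of observed
    patterns. Pattern names are illustrative, not TACT's exact set."""
    patterns = set()
    for h1, t1 in pairs_r1:
        for h2, t2 in pairs_r2:
            if h1 == h2 and t1 == t2:
                patterns.add("PARA")   # parallel: same head and same tail
            elif h1 == h2:
                patterns.add("H-H")    # the two relations share a head entity
            elif t1 == t2:
                patterns.add("T-T")    # the two relations share a tail entity
            elif t1 == h2:
                patterns.add("T-H")    # r1's tail is r2's head (a chain)
            elif h1 == t2:
                patterns.add("H-T")    # r1's head is r2's tail
    return patterns
```

A relational correlation module such as RCN could then learn a per-pattern weight for how strongly each pattern signals semantic correlation between the two relations.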
Inductive Relation Prediction from Relational Paths and Context with Hierarchical Transformers
Relation prediction on knowledge graphs (KGs) is a key research topic.
Dominant embedding-based methods mainly focus on the transductive setting and
lack the inductive ability to generalize to new entities for inference.
Existing methods for inductive reasoning mostly mine the connections between
entities, i.e., relational paths, without considering the nature of head and
tail entities contained in the relational context. This paper proposes a novel
method that captures both connections between entities and the intrinsic nature
of entities, by simultaneously aggregating RElational Paths and cOntext with a
unified hieRarchical Transformer framework, namely REPORT. REPORT relies solely
on relation semantics and can naturally generalize to the fully-inductive
setting, where KGs for training and inference have no common entities. In the
experiments, REPORT performs consistently better than all baselines on almost
all eight version subsets of the two fully-inductive datasets. Moreover, REPORT
is interpretable by providing each element's contribution to the prediction
results.
Comment: Accepted by ICASSP 2023 (Oral)
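Because REPORT reasons over relational paths rather than entity identities, a natural first step is enumerating the relation sequences that connect a head-tail pair. The depth-first enumeration below is a hedged illustration of that step, not REPORT's implementation:

```python
from collections import defaultdict

def relational_paths(triples, head, tail, max_len=3):
    """Enumerate relation sequences along directed paths from head to
    tail, up to max_len hops. Entity names are discarded from the
    result, so the features transfer to unseen entities."""
    adj = defaultdict(list)
    for h, r, t in triples:
        adj[h].append((r, t))
    paths = []

    def dfs(node, rels, visited):
        if node == tail and rels:
            paths.append(tuple(rels))
        if len(rels) == max_len:
            return
        for r, nxt in adj[node]:
            if nxt not in visited:
                dfs(nxt, rels + [r], visited | {nxt})

    dfs(head, [], {head})
    return paths
```

A hierarchical encoder could then encode each relation sequence first, and aggregate the path encodings together with the relational context in a second stage.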
RAILD: Towards Leveraging Relation Features for Inductive Link Prediction In Knowledge Graphs
Due to the open world assumption, Knowledge Graphs (KGs) are never complete.
To address this issue, various Link Prediction (LP) methods have been
proposed. Some of these are inductive LP models, which are capable of
learning representations for entities not seen during training.
However, to the best of our knowledge, none of the existing inductive LP models
focus on learning representations for unseen relations. In this work, a novel
method, Relation Aware Inductive Link preDiction (RAILD), is proposed for KG
completion, which learns representations for both unseen entities and unseen relations. In
addition to leveraging textual literals associated with both entities and
relations by employing language models, RAILD also introduces a novel
graph-based approach to generate features for relations. Experiments are
conducted with different existing and newly created challenging benchmark
datasets and the results indicate that RAILD leads to performance improvement
over the state-of-the-art models. Moreover, since there are no existing
inductive LP models which learn representations for unseen relations, we have
created our own baselines and the results obtained with RAILD also outperform
these baselines.
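One hedged reading of a graph-based approach to relation features is to build a graph over the relations themselves, so that even a relation unseen at training time obtains features from its structural neighborhood. The co-occurrence construction below is an assumption for illustration, not RAILD's actual algorithm:

```python
from collections import defaultdict
from itertools import combinations

def relation_graph(triples):
    """Build a weighted relation-relation graph: two relations are
    linked when they share at least one entity (as head or tail),
    with edge weight equal to the number of shared entities."""
    ents = defaultdict(set)  # relation -> entities it touches
    for h, r, t in triples:
        ents[r].update((h, t))
    edges = {}
    for r1, r2 in combinations(sorted(ents), 2):
        shared = len(ents[r1] & ents[r2])
        if shared:
            edges[(r1, r2)] = shared
    return edges
```

Node features for relations could then be produced by running any graph embedding method over this relation-relation graph, alongside language-model encodings of the relations' textual literals.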
Incorporating Structured Sentences with Time-enhanced BERT for Fully-inductive Temporal Relation Prediction
Temporal relation prediction in incomplete temporal knowledge graphs (TKGs)
is a popular temporal knowledge graph completion (TKGC) problem in both
transductive and inductive settings. Traditional embedding-based TKGC models
(TKGE) rely on structured connections and can only handle a fixed set of
entities, i.e., the transductive setting. In the inductive setting where test
TKGs contain emerging entities, the latest methods are based on symbolic rules
or pre-trained language models (PLMs). However, they suffer from being
inflexible and not time-specific, respectively. In this work, we extend the
fully-inductive setting, where entities in the training and test sets are
totally disjoint, to TKGs and take a further step towards a more flexible and
time-sensitive temporal relation prediction approach, SST-BERT, which
incorporates Structured Sentences with Time-enhanced BERT. Our model can obtain the entity
history and implicitly learn rules in the semantic space by encoding structured
sentences, solving the problem of inflexibility. We propose to use a time
masking MLM task to pre-train BERT in a corpus rich in temporal tokens
specially generated for TKGs, enhancing the time sensitivity of SST-BERT. To
compute the probability of occurrence of a target quadruple, we aggregate all
its structured sentences from both temporal and semantic perspectives into a
score. Experiments on the transductive datasets and newly generated
fully-inductive benchmarks show that SST-BERT successfully improves over
state-of-the-art baselines.
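The time-masking MLM idea can be sketched as a preprocessing step: temporal tokens in a structured sentence are hidden so the encoder must predict them from context. Treating 4-digit numbers as time tokens, and masking all of them, are simplifying assumptions; a real MLM setup would mask a random fraction:

```python
def mask_time_tokens(tokens, mask_token="[MASK]"):
    """Replace every temporal token (here: any 4-digit number, a
    hypothetical convention) with the mask token, returning the
    masked sequence and the prediction labels for the MLM loss."""
    masked, labels = [], []
    for tok in tokens:
        if tok.isdigit() and len(tok) == 4:
            masked.append(mask_token)
            labels.append(tok)   # the model must recover this token
        else:
            masked.append(tok)
            labels.append(None)  # no loss on unmasked positions
    return masked, labels
```

Pre-training on a corpus rich in such temporal tokens is what would push the encoder towards time sensitivity, as the abstract describes.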
Evaluating Hybrid AI For Prediction Over Lung Cancer Knowledge Graphs
Link prediction is of great importance in the field of knowledge graphs, as it plays a key role in facilitating knowledge discovery and supporting decision-making, especially in healthcare. Although knowledge graphs provide a structured representation of data, challenges arise from data integration and quality assurance issues. The presence of inaccuracies, outdated information and inconsistencies poses a threat to data quality, requiring ongoing efforts to address incomplete or missing data.
The challenges posed by data quality issues are multifaceted and contribute to an overall reduction in the reliability of information. In the era of big data and artificial intelligence, dealing with incomplete information and missing data is a challenge. Inductive learning, a form of machine learning that involves making generalizations based on specific examples, can be a valuable approach for link prediction to overcome some obstacles associated with knowledge graphs in healthcare.
In response to these challenges, link prediction is emerging as a valuable technique to improve the quality of knowledge graphs by filling in missing links. The state of the art proposes various approaches for knowledge graph completion and link prediction, involving the evaluation of different embedding and symbolic learning models. Experimental benchmarks are designed to evaluate different models and relation types and provide insights into their effectiveness.
This research aims to develop a framework for the evaluation of hybrid AI models over a lung cancer knowledge graph. The primary objectives include a comparative analysis of embedding and symbolic learning models, investigation of the impact of data modelling, exploration of the influence of relation types, and evaluation of the impact of knowledge graph enhancement.
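Benchmarks for link prediction over knowledge graphs typically report mean reciprocal rank (MRR) and Hits@k over the ranks the model assigns to the true entities. A minimal sketch of these two standard metrics (not tied to any particular model in this evaluation):

```python
def mrr_and_hits(ranks, k=10):
    """Rank-based link-prediction evaluation: `ranks` holds the rank
    of each true entity among all candidate entities (1 = best).
    Returns (mean reciprocal rank, fraction of ranks <= k)."""
    mrr = sum(1.0 / r for r in ranks) / len(ranks)
    hits = sum(1 for r in ranks if r <= k) / len(ranks)
    return mrr, hits
```

Comparing embedding and symbolic learning models on the same filtered ranks makes the two families directly comparable per relation type.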
Communicative Message Passing for Inductive Relation Reasoning
Relation prediction for knowledge graphs aims at predicting missing
relationships between entities. Despite the importance of inductive relation
prediction, most previous works are limited to a transductive setting and
cannot process previously unseen entities. Recently proposed subgraph-based
relation reasoning models provide an alternative: predicting links inductively
from the subgraph structure surrounding a candidate triplet. However, we
observe that these methods often neglect the directed nature of the extracted
subgraph and weaken the role of relation information in the subgraph modeling.
As a result, they fail to effectively handle the asymmetric/anti-symmetric
triplets and produce insufficient embeddings for the target triplets. To this
end, we introduce a Communicative Message Passing neural network for
Inductive reLation rEasoning, CoMPILE, which reasons over local directed
subgraph structures and has a strong inductive bias to process entity-independent
semantic relations. In contrast to existing models, CoMPILE strengthens the
message interactions between edges and entities through a communicative kernel
and enables a sufficient flow of relation information. Moreover, we demonstrate
that CoMPILE can naturally handle asymmetric/anti-symmetric relations without
the need for explosively increasing the number of model parameters by
extracting the directed enclosing subgraphs. Extensive experiments show
substantial performance gains in comparison to state-of-the-art methods on
commonly used benchmark datasets with various inductive settings.
Comment: Accepted by AAAI-202
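Extracting a directed enclosing subgraph around a candidate triplet, as opposed to an undirected one, can be sketched as intersecting the k-hop out-neighborhood of the head with the k-hop in-neighborhood of the tail. This is a simplified assumption about the extraction rule, not CoMPILE's exact procedure:

```python
from collections import defaultdict

def directed_enclosing_subgraph(triples, head, tail, k=2):
    """Keep the triples whose endpoints lie in the intersection of
    the k-hop out-neighborhood of `head` (following edge direction)
    and the k-hop in-neighborhood of `tail` (against edge direction)."""
    out_adj, in_adj = defaultdict(set), defaultdict(set)
    for h, _, t in triples:
        out_adj[h].add(t)
        in_adj[t].add(h)

    def hop(adj, start):
        seen, frontier = {start}, {start}
        for _ in range(k):
            frontier = {n for f in frontier for n in adj[f]} - seen
            seen |= frontier
        return seen

    nodes = hop(out_adj, head) & hop(in_adj, tail) | {head, tail}
    return [(h, r, t) for h, r, t in triples if h in nodes and t in nodes]
```

Because edge direction is preserved, an asymmetric relation and its inverse induce different subgraphs, which is what lets a directed model distinguish them without extra parameters.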
Iteratively Learning Representations for Unseen Entities with Inter-Rule Correlations
Recent work on knowledge graph completion (KGC) has focused on learning
embeddings of entities and relations in knowledge graphs. These embedding
methods require that all test entities are observed at training time, resulting
in a time-consuming retraining process for out-of-knowledge-graph (OOKG)
entities. To address this issue, current inductive knowledge embedding methods
employ graph neural networks (GNNs) to represent unseen entities by aggregating
information of known neighbors. They face three important challenges: (i) data
sparsity, (ii) the presence of complex patterns in knowledge graphs (e.g.,
inter-rule correlations), and (iii) the presence of interactions among rule
mining, rule inference, and embedding. In this paper, we propose a virtual
neighbor network with inter-rule correlations (VNC) that consists of three
stages: (i) rule mining, (ii) rule inference, and (iii) embedding. In the rule
mining process, to identify complex patterns in knowledge graphs, both logic
rules and inter-rule correlations are extracted from knowledge graphs based on
operations over relation embeddings. To reduce data sparsity, virtual neighbors
for OOKG entities are predicted and assigned soft labels by optimizing a
rule-constrained problem. We also devise an iterative framework to capture the
underlying relations between rule learning and embedding learning. In our
experiments, results on both link prediction and triple classification tasks
show that the proposed VNC framework achieves state-of-the-art performance on
four widely-used knowledge graphs. Further analysis reveals that VNC is robust
to the proportion of unseen entities and effectively mitigates data sparsity.
Comment: Accepted at CIKM 202
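The rule-inference stage that produces virtual neighbors can be illustrated with length-2 chain rules: if a mined rule r1(x, y) ∧ r2(y, z) → r(x, z) holds with some confidence, the inferred triple becomes a soft-labeled virtual fact. The rule format and scoring below are illustrative assumptions, not VNC's actual formulation:

```python
def infer_virtual_triples(triples, rules):
    """Apply length-2 chain rules {(r1, r2): (r_head, confidence)}:
    whenever (x, r1, y) and (y, r2, z) are observed, emit the virtual
    triple (x, r_head, z) soft-labeled with the rule's confidence
    (keeping the maximum if several rules fire on the same triple)."""
    facts = set(triples)
    virtual = {}
    for (r1, r2), (rh, conf) in rules.items():
        for x, ra, y in triples:
            if ra != r1:
                continue
            for y2, rb, z in triples:
                if rb == r2 and y2 == y and (x, rh, z) not in facts:
                    key = (x, rh, z)
                    virtual[key] = max(virtual.get(key, 0.0), conf)
    return virtual
```

In an iterative framework, these soft-labeled virtual triples would feed the embedding stage, and the updated relation embeddings would in turn refine the mined rules and their confidences.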