Diachronic Embedding for Temporal Knowledge Graph Completion
Knowledge graphs (KGs) typically contain temporal facts indicating
relationships among entities at different times. Due to their incompleteness,
several approaches have been proposed to infer new facts for a KG based on the
existing ones, a problem known as KG completion. KG embedding approaches have
proved effective for KG completion; however, they have been developed mostly
for static KGs. Developing temporal KG embedding models is an increasingly
important problem. In this paper, we build novel models for temporal KG
completion by equipping static models with a diachronic entity embedding
function that provides the characteristics of entities at any point in time.
This is in contrast to the existing temporal KG embedding approaches where only
static entity features are provided. The proposed embedding function is
model-agnostic and can be potentially combined with any static model. We prove
that combining it with SimplE, a recent model for static KG embedding, results
in a fully expressive model for temporal KG completion. Our experiments
indicate the superiority of our proposal compared to existing baselines.
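The diachronic entity embedding described above can be sketched roughly as follows. This is a minimal illustration, not the paper's exact formulation: it assumes a sine activation for the temporal features and a fixed fraction of temporal dimensions, and all parameter names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
d, gamma = 8, 0.5            # embedding size and temporal fraction (assumed values)
n_temp = int(gamma * d)

# Per-entity learnable parameters, randomly initialised here for illustration.
a = rng.normal(size=d)       # amplitudes; the last d - n_temp entries act as static features
w = rng.normal(size=n_temp)  # frequencies for the temporal features
b = rng.normal(size=n_temp)  # phase shifts for the temporal features

def diachronic_embedding(t: float) -> np.ndarray:
    """Entity features at time t: the first n_temp dimensions vary with
    time through a sine activation; the rest stay static."""
    z = a.copy()
    z[:n_temp] = a[:n_temp] * np.sin(w * t + b)
    return z
```

Because the function is defined for any t, the resulting entity vectors can be plugged into any static scoring model (e.g. SimplE), which is what makes the construction model-agnostic.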
Towards A Question Answering System over Temporal Knowledge Graph Embedding
Question Answering (QA) over knowledge graphs is a vital topic within information retrieval. Questions with temporal intent are a special case of questions for QA systems that have received only limited attention so far. In this paper, we study the use of temporal knowledge graph embeddings (TKGEs) for temporal QA. First, we propose a microservice-based architecture for building temporal QA systems on pre-trained TKGE models. Second, we present a Bayesian model average (BMA) ensemble method, in which the results of several link prediction tasks on separate TKGE models are combined to find better answers. Within a system built using the microservice-based architecture, experiments on two benchmark datasets show that BMA provides better results than the individual models.
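The BMA ensemble step can be sketched as follows. This is a hedged illustration of the general idea, not the paper's implementation: it assumes each TKGE model produces raw link-prediction scores over the same candidate answers, normalises them into probabilities with a softmax, and averages them under given model weights (which BMA would derive from posterior model probabilities).

```python
import numpy as np

def bma_combine(scores, weights):
    """Combine per-model candidate-answer scores by model averaging:
    softmax-normalise each model's scores into a probability vector,
    then return the weight-averaged probability per candidate."""
    probs = []
    for s in scores:
        e = np.exp(s - np.max(s))   # stable softmax
        probs.append(e / e.sum())
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                 # normalise model weights
    return np.tensordot(w, np.stack(probs), axes=1)
```

The answer returned to the user would then be the candidate with the highest combined probability, e.g. `np.argmax(bma_combine(scores, weights))`.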
Temporal Knowledge Graph Completion: A Survey
Knowledge graph completion (KGC) can predict missing links and is crucial for
real-world knowledge graphs, which widely suffer from incompleteness. KGC
methods assume the knowledge graph is static, which may lead to inaccurate
predictions because many facts in the knowledge graph change over time.
Recently, emerging methods have shown improved predictive results by further
incorporating the timestamps of facts; namely, temporal knowledge graph
completion (TKGC). With this temporal information, TKGC methods can learn the
dynamic evolution of the knowledge graph that KGC methods fail to capture. In
this paper, for the first time, we summarize the recent advances in TKGC
research. First, we detail the background of TKGC, including the problem
definition, benchmark datasets, and evaluation metrics. Then, we summarize
existing TKGC methods based on how timestamps of facts are used to capture the
temporal dynamics. Finally, we conclude the paper and present future research
directions of TKGC.
ChronoR: Rotation Based Temporal Knowledge Graph Embedding
Despite the importance and abundance of temporal knowledge graphs, most of
the current research has been focused on reasoning on static graphs. In this
paper, we study the challenging problem of inference over temporal knowledge
graphs, in particular the task of temporal link prediction. In general, this is
a difficult task due to data non-stationarity, data heterogeneity, and complex
temporal dependencies. We propose Chronological Rotation embedding
(ChronoR), a novel model for learning representations for entities, relations,
and time. Learning dense representations is frequently used as an efficient and
versatile method to perform reasoning on knowledge graphs. The proposed model
learns a k-dimensional rotation transformation parametrized by relation and
time, such that after each fact's head entity is transformed using the
rotation, it falls near its corresponding tail entity. By using high
dimensional rotation as its transformation operator, ChronoR captures rich
interaction between the temporal and multi-relational characteristics of a
Temporal Knowledge Graph. Experimentally, we show that ChronoR is able to
outperform many of the state-of-the-art methods on the benchmark datasets for
temporal knowledge graph link prediction.
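The rotation-based scoring idea can be sketched with complex-valued embeddings, where a rotation is a unit-modulus phase factor. This is a simplified, hypothetical reading of the abstract (a RotatE-style 2D rotation per dimension, with the phase split into relation and time parts), not ChronoR's exact k-dimensional operator.

```python
import numpy as np

def chronor_score(head, tail, rel_phase, time_phase):
    """Rotate the (complex-valued) head entity by a phase determined
    jointly by the relation and the timestamp, then score the triple
    by negative distance to the tail: a good fact scores near 0."""
    rotation = np.exp(1j * (rel_phase + time_phase))
    return -np.linalg.norm(head * rotation - tail)
```

Under this scoring, the model is trained so that for each true fact the rotated head lands near its tail, while corrupted facts are pushed to lower (more negative) scores.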
A Survey on Temporal Knowledge Graph Completion: Taxonomy, Progress, and Prospects
Temporal characteristics are prominently evident in a substantial volume of
knowledge, which underscores the pivotal role of Temporal Knowledge Graphs
(TKGs) in both academia and industry. However, TKGs often suffer from
incompleteness for three main reasons: the continuous emergence of new
knowledge, the weakness of the algorithm for extracting structured information
from unstructured data, and the lack of information in the source dataset.
Thus, the task of Temporal Knowledge Graph Completion (TKGC) has attracted
increasing attention, aiming to predict missing items based on the available
information. In this paper, we provide a comprehensive review of TKGC methods
and their details. Specifically, this paper mainly consists of three
components: 1) Background, which covers the preliminaries of TKGC methods, the
loss functions required for training, and the datasets and evaluation
protocols; 2) Interpolation, which estimates missing elements, or sets of
elements, from the relevant available information, and categorizes related
TKGC methods based on how they process temporal information;
3) Extrapolation, which typically focuses on continuous TKGs and predicts
future events, and classifies all extrapolation methods based on the
algorithms they utilize. We further pinpoint the challenges and discuss
future research directions of TKGC.
Re-Temp: Relation-Aware Temporal Representation Learning for Temporal Knowledge Graph Completion
Temporal Knowledge Graph Completion (TKGC) under the extrapolation setting
aims to predict the missing entity from a fact in the future, posing a
challenge that aligns more closely with real-world prediction problems.
Existing research mostly encodes entities and relations using sequential graph
neural networks applied to recent snapshots. However, these approaches tend to
overlook the ability to skip irrelevant snapshots according to entity-related
relations in the query and disregard the importance of explicit temporal
information. To address this, we propose our model, Re-Temp (Relation-Aware
Temporal Representation Learning), which leverages explicit temporal embedding
as input and incorporates skip information flow after each timestamp to skip
unnecessary information for prediction. Additionally, we introduce a two-phase
forward propagation method to prevent information leakage. Through the
evaluation on six TKGC (extrapolation) datasets, we demonstrate that our model
outperforms all eight recent state-of-the-art models by a significant margin.
Comment: Findings of EMNLP 202
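The "skip information flow after each timestamp" can be sketched as a learned gate that decides, per dimension, whether to take the current snapshot's representation or carry the previous one forward. This is a hypothetical reading of the mechanism, with invented names, not Re-Temp's actual architecture.

```python
import numpy as np

def skip_gate(h_prev, h_curr, g):
    """Gated skip connection between consecutive snapshot encodings:
    g in [0, 1] per dimension keeps the current representation where
    the snapshot is relevant (g near 1) and skips it where it is not
    (g near 0), passing the previous state through unchanged."""
    return g * h_curr + (1.0 - g) * h_prev
```

In a full model, g would itself be predicted from the query relation, which is what makes the skipping relation-aware.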