Knowledge graph embedding (KGE) focuses on representing the entities and
relations of a knowledge graph (KG) in continuous vector spaces; the learned
embeddings can then be employed to predict missing triples for knowledge graph
completion (KGC). However, KGE models often learn only superficial structural
correlations from triple data, and their embeddings can be misled by trivial
patterns and noisy links in real-world KGs. To address this issue, we formulate
a new paradigm of KGE in the context of causality and embedding disentanglement.
We further propose a Causality-enhanced knowledge graph Embedding (CausE)
framework. CausE employs causal intervention to estimate the causal effect of
the confounder embeddings and designs new training objectives to make stable
predictions. Experimental results demonstrate that CausE outperforms the
baseline models and achieves state-of-the-art KGC performance. We release our
code at https://github.com/zjukg/CausE.

Comment: Accepted by CCKS 2023 as a research paper.