MEKER: Memory Efficient Knowledge Embedding Representation for Link Prediction and Question Answering
Knowledge Graphs (KGs) are symbolically structured stores of facts. KG
embeddings provide a compact representation of this data for NLP tasks that
require implicit knowledge about the real world. However, KGs large enough to
be useful in practical NLP tasks are enormous, and training embeddings over
them raises memory-cost issues. We represent a KG as a third-order binary
tensor and move beyond the standard CP decomposition by using a data-specific
generalized version of it. Generalizing the standard CP-ALS algorithm allows
us to obtain optimization gradients without a backpropagation mechanism. This
reduces the memory needed during training while providing computational
benefits. We propose MEKER, a memory-efficient KG embedding model that yields
SOTA-comparable performance on link prediction tasks and KG-based Question
Answering.
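To make the setup concrete, the following is a minimal sketch of the underlying idea: a toy KG encoded as a third-order binary tensor X[h, r, t] and factorized with classical CP-ALS in NumPy. This illustrates plain CP-ALS only; MEKER's data-specific generalization and its gradient derivation are not reproduced here, and all names and the toy data below are illustrative, not the authors' code.

```python
import numpy as np

def khatri_rao(A, B):
    """Column-wise Kronecker product: (I, R) x (J, R) -> (I*J, R)."""
    I, R = A.shape
    J, _ = B.shape
    return (A[:, None, :] * B[None, :, :]).reshape(I * J, R)

def unfold(X, mode):
    """Mode-n unfolding, consistent with NumPy's row-major reshape."""
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

def cp_als(X, rank, n_iter=200, seed=0):
    """Fit X ~ sum_r a_r (outer) b_r (outer) c_r by alternating least squares."""
    rng = np.random.default_rng(seed)
    factors = [rng.standard_normal((dim, rank)) for dim in X.shape]
    for _ in range(n_iter):
        for mode in range(3):
            others = [factors[m] for m in range(3) if m != mode]
            kr = khatri_rao(others[0], others[1])  # (prod of other dims, R)
            gram = (others[0].T @ others[0]) * (others[1].T @ others[1])
            # Least-squares update of this mode's factor matrix
            factors[mode] = unfold(X, mode) @ kr @ np.linalg.pinv(gram)
    return factors

# Toy KG: (head, relation, tail) triples over 4 entities and 2 relations,
# stored as a binary tensor with X[h, r, t] = 1 iff the fact holds.
triples = [(0, 0, 1), (1, 0, 2), (2, 1, 3), (0, 1, 3)]
X = np.zeros((4, 2, 4))
for h, r, t in triples:
    X[h, r, t] = 1.0

E_h, R, E_t = cp_als(X, rank=4)
X_hat = np.einsum('ir,jr,kr->ijk', E_h, R, E_t)
print("relative error:", np.linalg.norm(X - X_hat) / np.linalg.norm(X))
```

The rows of the factor matrices play the role of entity and relation embeddings: a triple's score is the trilinear product of its head, relation, and tail vectors, which is the basis for link prediction with such factorization models.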