The task of completing knowledge triplets has broad downstream applications.
Both structural and semantic information play important roles in knowledge
graph completion. Unlike previous approaches that rely on either the structure
or the semantics of the knowledge graph, we propose to jointly embed the
semantics in the natural-language descriptions of knowledge triplets together
with their structural information. Our method embeds knowledge graphs for the completion
task via fine-tuning pre-trained language models with respect to a
probabilistic structured loss, where the forward pass of the language models
captures semantics and the loss reconstructs structures. Our extensive
experiments on a variety of knowledge graph benchmarks have demonstrated the
state-of-the-art performance of our method. We also show that our method
significantly improves performance in low-resource regimes, thanks to its
better use of semantics. The code and datasets are available at
https://github.com/pkusjh/LASS.

Comment: COLING 202
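The core idea — scoring triplets by feeding text through a semantic encoder and training against a probabilistic structural loss — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the deterministic random-projection `encode` function is a toy stand-in for a fine-tuned language model's forward pass, and the TransE-style distance score with a logistic log-likelihood is one plausible instantiation of a "probabilistic structured loss".

```python
import hashlib

import numpy as np


def encode(text: str, dim: int = 16) -> np.ndarray:
    """Toy semantic encoder: a deterministic random projection of the text.

    In the actual method this would be a pre-trained language model
    encoding the natural-language description of an entity or relation.
    """
    seed = int.from_bytes(hashlib.sha256(text.encode()).digest()[:4], "big")
    return np.random.default_rng(seed).normal(size=dim)


def score(head: str, relation: str, tail: str) -> float:
    """TransE-style structural score: higher means more plausible.

    The loss below 'reconstructs structure' by pushing the embeddings
    of true triplets to satisfy head + relation ~ tail.
    """
    return -float(np.linalg.norm(encode(head) + encode(relation) - encode(tail)))


def structured_nll(pos: tuple[str, str, str], neg: tuple[str, str, str]) -> float:
    """Probabilistic structured loss: logistic negative log-likelihood
    of the true triplet versus a corrupted (negative) triplet."""
    def log_sigmoid(x: float) -> float:
        return -np.logaddexp(0.0, -x)  # numerically stable log(sigmoid(x))

    return -log_sigmoid(score(*pos)) - log_sigmoid(-score(*neg))


loss = structured_nll(
    ("Paris", "capital_of", "France"),   # observed triplet
    ("Paris", "capital_of", "Spain"),    # corrupted tail as a negative sample
)
```

In the full method, gradients of this loss would flow back through the language model's parameters, so that fine-tuning jointly shapes the semantic embeddings to respect graph structure; here the encoder is frozen and the sketch only shows the forward computation of the loss.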