Matching Words and Knowledge Graph Entities with Meta-Embeddings

Abstract

Word vectors are a key component for matching distinct textual units semantically. However, they are not directly applicable to matching text with structured data, for which graph embeddings exist. In this work, we propose a flexible method to map graph embedding representations to word embedding representations. We can thus improve word embeddings by taking a weighted average with the mapped graph embeddings. We evaluate our models on the task of matching natural language questions and SPARQL queries, and significantly improve query-matching accuracy. We also evaluate word meta-embeddings intrinsically and show improvements over previous models.
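A minimal sketch of the idea described in the abstract, assuming a linear mapping learned from anchor pairs (words that also correspond to knowledge graph entities) and a simple weighted average; the least-squares mapping, function names, and dimensions below are illustrative assumptions, not the paper's exact method.

```python
import numpy as np

def learn_mapping(graph_vecs: np.ndarray, word_vecs: np.ndarray) -> np.ndarray:
    """Learn a linear map M so that graph_vecs @ M approximates word_vecs.

    graph_vecs: (n, d_g) graph embeddings of anchor entities (assumption: anchors exist).
    word_vecs:  (n, d_w) word embeddings of the corresponding words.
    """
    # Ordinary least squares: M = argmin_M ||G M - W||_F^2
    M, *_ = np.linalg.lstsq(graph_vecs, word_vecs, rcond=None)
    return M  # shape (d_g, d_w)

def meta_embedding(word_vec: np.ndarray,
                   graph_vec: np.ndarray,
                   M: np.ndarray,
                   alpha: float = 0.5) -> np.ndarray:
    """Weighted average of a word embedding and its mapped graph embedding."""
    mapped = graph_vec @ M                     # project into word-embedding space
    return alpha * word_vec + (1.0 - alpha) * mapped

# Toy usage: random vectors stand in for real word / graph embedding pairs.
rng = np.random.default_rng(0)
G = rng.normal(size=(1000, 100))   # graph embeddings (d_g = 100)
W = rng.normal(size=(1000, 300))   # word embeddings  (d_w = 300)
M = learn_mapping(G, W)
meta = meta_embedding(W[0], G[0], M, alpha=0.6)
print(meta.shape)                  # (300,)
```

The weight alpha controls how much the knowledge-graph signal contributes to the final meta-embedding; the abstract does not specify how it is chosen, so treat it as a tunable hyperparameter here.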