Multi-Relational Hyperbolic Word Embeddings from Natural Language Definitions
Natural language definitions possess a recursive, self-explanatory semantic
structure that can support representation learning methods able to preserve
explicit conceptual relations and constraints in the latent space. This paper
presents a multi-relational model that explicitly leverages such a structure to
derive word embeddings from definitions. By automatically extracting the
relations linking defined and defining terms from dictionaries, we demonstrate
how the problem of learning word embeddings can be formalised via a
translational framework in Hyperbolic space and used as a proxy to capture the
global semantic structure of definitions. An extensive empirical analysis
demonstrates that the framework can help impose the desired structural
constraints while preserving the semantic mapping required for controllable and
interpretable traversal. Moreover, the experiments reveal the superiority of
the Hyperbolic word embeddings over the Euclidean counterparts and demonstrate
that the multi-relational approach can obtain competitive results when compared
to state-of-the-art neural models, with the advantage of being intrinsically
more efficient and interpretable.

Comment: Accepted at the 18th Conference of the European Chapter of the
Association for Computational Linguistics (EACL 2024), camera-ready
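The abstract describes scoring definition-derived relations with a translational model in hyperbolic space but does not give the exact formulation. A minimal sketch of one common instantiation, assuming a Poincaré-ball geometry with Möbius addition as the "translation" (the dimensions and random embeddings here are purely illustrative, not the paper's trained vectors):

```python
import numpy as np

def mobius_add(x, y):
    # Möbius addition in the Poincaré ball (curvature -1):
    # the hyperbolic analogue of vector translation x + y
    xy = np.dot(x, y)
    x2, y2 = np.dot(x, x), np.dot(y, y)
    num = (1 + 2 * xy + y2) * x + (1 - x2) * y
    return num / (1 + 2 * xy + x2 * y2)

def poincare_dist(u, v):
    # Geodesic distance between two points inside the unit ball
    diff = np.dot(u - v, u - v)
    u2, v2 = np.dot(u, u), np.dot(v, v)
    return np.arccosh(1 + 2 * diff / ((1 - u2) * (1 - v2)))

def translational_score(head, relation, tail):
    # Lower score = the relation better "translates" the defined
    # term (head) onto the defining term (tail)
    return poincare_dist(mobius_add(head, relation), tail)

# Toy embeddings for (defined term, relation, defining term),
# scaled to lie well inside the unit ball
rng = np.random.default_rng(0)
h, r, t = (0.1 * rng.standard_normal(5) for _ in range(3))
print(translational_score(h, r, t))
```

Optimizing such a distance-based score pulls terms linked by a definitional relation together along hyperbolic geodesics, which is what lets the latent space preserve explicit conceptual relations.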
Multi-task Neural Network for Non-discrete Attribute Prediction in Knowledge Graphs
Many popular knowledge graphs such as Freebase, YAGO or DBPedia maintain a
list of non-discrete attributes for each entity. Intuitively, these attributes
such as height, price or population count are able to richly characterize
entities in knowledge graphs. This additional source of information may help to
alleviate the inherent sparsity and incompleteness problems that are prevalent
in knowledge graphs. Unfortunately, many state-of-the-art relational learning
models ignore this information due to the challenging nature of dealing with
non-discrete data types in the inherently binary-natured knowledge graphs. In
this paper, we propose a novel multi-task neural network approach for both
encoding and prediction of non-discrete attribute information in a relational
setting. Specifically, we train a neural network for triplet prediction along
with a separate network for attribute value regression. Via multi-task
learning, we are able to learn representations of entities, relations and
attributes that encode information about both tasks. Moreover, such attributes
are central to many predictive tasks not only as an information source but also
as a prediction target; models that can encode, incorporate and predict such
information in a relational learning context are therefore highly
attractive. We show that our approach outperforms many state-of-the-art
methods for the tasks of relational triplet classification and attribute value
prediction.

Comment: Accepted at CIKM 201
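The abstract describes two networks trained jointly over shared representations: one for triplet prediction, one for attribute value regression. A minimal NumPy sketch of that multi-task coupling, using a TransE-style triplet score and a linear regression head (the dimensions, scoring functions, target value, and loss weight are illustrative assumptions, not the paper's architecture):

```python
import numpy as np

rng = np.random.default_rng(0)
n_entities, n_relations, n_attrs, dim = 100, 10, 5, 16

# Shared entity embeddings feed both tasks -- this is the
# multi-task coupling that lets each task inform the other
E = 0.1 * rng.standard_normal((n_entities, dim))
R = 0.1 * rng.standard_normal((n_relations, dim))
A = 0.1 * rng.standard_normal((n_attrs, dim))

def triplet_score(h, r, t):
    # TransE-style plausibility: smaller distance = more likely triple,
    # so we negate the norm to make higher scores better
    return -np.linalg.norm(E[h] + R[r] - E[t])

W = 0.1 * rng.standard_normal(2 * dim)
b = 0.0

def attribute_value(e, a):
    # Regression head: predict a non-discrete attribute (e.g. height)
    # from the concatenated entity and attribute embeddings
    x = np.concatenate([E[e], A[a]])
    return float(W @ x + b)

# Joint objective: margin loss on a (positive, corrupted) triple pair
# plus a weighted squared error on the attribute regression
margin_loss = max(0.0, 1.0 - triplet_score(0, 0, 1) + triplet_score(0, 0, 2))
reg_loss = (attribute_value(0, 0) - 1.75) ** 2  # 1.75 = toy regression target
joint_loss = margin_loss + 0.5 * reg_loss
print(joint_loss)
```

In a full implementation both losses would be minimized together by gradient descent, so the shared embeddings absorb signal from triplet structure and attribute values alike.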