Global Textual Relation Embedding for Relational Understanding
Pre-trained embeddings such as word embeddings and sentence embeddings are
fundamental tools facilitating a wide range of downstream NLP tasks. In this
work, we investigate how to learn a general-purpose embedding of textual
relations, defined as the shortest dependency path between entities. Textual
relation embedding provides a level of knowledge between word/phrase level and
sentence level, and we show that it can facilitate downstream tasks requiring
relational understanding of the text. To learn such an embedding, we create the
largest distant supervision dataset by linking the entire English ClueWeb09
corpus to Freebase. We use global co-occurrence statistics between textual and
knowledge base relations as the supervision signal to train the embedding.
Evaluation on two relational understanding tasks demonstrates the usefulness of
the learned textual relation embedding. The data and code can be found at
https://github.com/czyssrs/GloREPlus
Comment: Accepted to ACL 2019. 5 pages, 2 figures
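The abstract defines a textual relation as the shortest dependency path between two entities. A minimal sketch of extracting such a path, using breadth-first search over a hand-written toy dependency parse (the sentence, tokens, and edges are illustrative; a real pipeline would obtain the parse from a dependency parser):

```python
from collections import deque

def shortest_dependency_path(edges, source, target):
    """BFS over an undirected dependency graph given as (head, dependent)
    token-index pairs. Returns the token indices on the shortest path,
    or None if the entities are disconnected."""
    graph = {}
    for head, dep in edges:
        graph.setdefault(head, set()).add(dep)
        graph.setdefault(dep, set()).add(head)
    queue = deque([[source]])
    visited = {source}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == target:
            return path
        for nxt in graph.get(node, ()):
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None

# Toy parse of "Einstein was born in Ulm"
# tokens: 0 Einstein, 1 was, 2 born, 3 in, 4 Ulm
tokens = ["Einstein", "was", "born", "in", "Ulm"]
edges = [(2, 0), (2, 1), (2, 3), (3, 4)]
path = shortest_dependency_path(edges, 0, 4)  # Einstein -> Ulm
print(" -> ".join(tokens[i] for i in path))   # Einstein -> born -> in -> Ulm
```

The path skips syntactically irrelevant tokens ("was"), which is what makes the dependency path a compact surface form of the relation.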
Improving Distant Supervised Relation Extraction by Dynamic Neural Network
Distant Supervised Relation Extraction (DSRE) is usually formulated as a
problem of classifying a bag of sentences that contain two query entities, into
the predefined relation classes. Most existing methods consider those relation
classes as distinct semantic categories while ignoring their potential
connection to query entities. In this paper, we propose to leverage this
connection to improve the relation extraction accuracy. Our key ideas are
twofold: (1) For sentences belonging to the same relation class, the expression
style, i.e., word choice, can vary with the query entities. To account for this
style shift, the model should adjust its parameters in accordance with the
entity types. (2) Some relation classes are semantically similar, and the
entity types that appear in one relation may also appear in others. A model can
therefore share training signal across different relation classes, further
enhancing classes with few samples, i.e., long-tail classes. To unify these two
observations, we develop a novel Dynamic Neural Network for Relation Extraction
(DNNRE). The network adopts a novel dynamic parameter generator that
dynamically generates the network parameters according to the query entity
types and relation classes. By using this mechanism, the network can
simultaneously handle the style shift problem and enhance the prediction
accuracy for long-tail classes. Through our experimental study, we demonstrate
the effectiveness of the proposed method and show that it can achieve superior
performance over the state-of-the-art methods.
Comment: 29 pages, 8 figures
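The core mechanism, a generator that emits classifier parameters conditioned on the query entity types, can be sketched as follows. This is a minimal NumPy illustration of the idea, not the paper's architecture: all dimensions, names, and the single-matrix generator are assumptions for the sake of a small runnable example.

```python
import numpy as np

rng = np.random.default_rng(0)

SENT_DIM, TYPE_DIM, N_CLASSES = 8, 4, 3

# Static parameters of the (hypothetical) generator: they map an entity-type
# embedding to a flattened classifier weight matrix.
G = rng.normal(scale=0.1, size=(TYPE_DIM, N_CLASSES * SENT_DIM))

def dynamic_classifier(sentence_vec, type_vec):
    """Generate entity-conditioned classifier weights, then score relations."""
    W = (type_vec @ G).reshape(N_CLASSES, SENT_DIM)  # dynamically generated
    logits = W @ sentence_vec
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()  # softmax over relation classes

sent = rng.normal(size=SENT_DIM)   # hypothetical sentence/bag encoding
etype = rng.normal(size=TYPE_DIM)  # hypothetical entity-type embedding
probs = dynamic_classifier(sent, etype)
print(probs.shape)
```

Because the classifier weights are a function of the type embedding rather than fixed per class, the same generator is trained across all relation classes, which is how parameter sharing can benefit long-tail classes.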