Revisiting Unsupervised Relation Extraction
Unsupervised relation extraction (URE) extracts relations between named
entities from raw text without manually-labelled data and existing knowledge
bases (KBs). URE methods can be categorised into generative and discriminative
approaches, which rely either on hand-crafted features or surface form.
However, we demonstrate that by using only named entities to induce relation
types, we can outperform existing methods on two popular datasets. We compare
and evaluate our findings against other URE techniques to ascertain the
important features in URE. We conclude that entity types provide a strong
inductive bias for URE.
Comment: 8 pages, 1 figure, 2 tables. Accepted in ACL 202
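As a rough illustration of what "using only named entities to induce relation types" can mean, below is a minimal Python sketch that groups entity pairs by their (head type, tail type) signature and treats each signature as one induced, unlabelled relation cluster. The EntityPair structure, type labels, and example sentences are illustrative assumptions, not the paper's actual pipeline.

```python
# Minimal sketch: inducing relation clusters purely from named-entity type
# pairs, i.e. the inductive bias the abstract highlights. All names and
# example data below are illustrative assumptions.
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class EntityPair:
    head: str        # surface form of the head entity
    head_type: str   # e.g. "PERSON", "ORG", "LOC"
    tail: str
    tail_type: str
    sentence: str


def induce_relation_clusters(pairs):
    """Group entity pairs by their (head_type, tail_type) signature.

    Each signature acts as one induced (unlabelled) relation type; no
    hand-crafted features or surface forms are used.
    """
    clusters = defaultdict(list)
    for p in pairs:
        clusters[(p.head_type, p.tail_type)].append(p)
    return clusters


if __name__ == "__main__":
    data = [
        EntityPair("Marie Curie", "PERSON", "Warsaw", "LOC",
                   "Marie Curie was born in Warsaw."),
        EntityPair("Alan Turing", "PERSON", "London", "LOC",
                   "Alan Turing was born in London."),
        EntityPair("Google", "ORG", "Mountain View", "LOC",
                   "Google is headquartered in Mountain View."),
    ]
    for signature, members in induce_relation_clusters(data).items():
        print(signature, "->", [m.sentence for m in members])
```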
Cross-relation Cross-bag Attention for Distantly-supervised Relation Extraction
Distant supervision leverages knowledge bases to automatically label
instances, thus allowing us to train a relation extractor without human
annotations. However, the generated training data typically contain massive
noise and may result in poor performance with vanilla supervised learning. In
this paper, we propose to conduct multi-instance learning with a novel
Cross-relation Cross-bag Selective Attention (CSA), which leads to
noise-robust training for distantly supervised relation extractors.
Specifically, we employ sentence-level selective attention to reduce the
effect of noisy or mismatched sentences, while the correlation among relations
is captured to improve the quality of attention weights. Moreover, instead of
treating all entity pairs equally, we pay more attention to entity pairs of
higher quality, adopting the same selective attention mechanism to achieve
this goal. Experiments with two types of relation extractors demonstrate the
superiority of the proposed approach over the state of the art, while further
ablation studies verify our intuitions and demonstrate the effectiveness of
the two proposed techniques.
Comment: AAAI 201
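Below is a minimal sketch of the selective-attention idea described in this abstract, assuming a generic sentence encoder has already produced sentence vectors: sentences within one bag (all mentions of an entity pair) are weighted by a relation query vector so that noisy or mismatched sentences receive low weight, and whole bags are weighted in turn so that higher-quality entity pairs count more. The dimensions, query vector, and function names are illustrative assumptions, and the cross-relation correlation term of CSA is omitted; this is not the paper's exact model.

```python
# Minimal sketch of sentence-level and bag-level selective attention for
# distantly supervised relation extraction. All shapes, names, and data
# below are illustrative assumptions.
import numpy as np


def softmax(x):
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()


def bag_representation(sentence_vecs, relation_query):
    """Attend over the encoded sentences of one bag with a relation query.

    sentence_vecs: (n_sentences, dim) encoded sentences for one entity pair
    relation_query: (dim,) query vector for the candidate relation
    Returns the attention-weighted bag vector and the weights, so noisy or
    mismatched sentences receive low weight instead of dominating training.
    """
    scores = sentence_vecs @ relation_query          # (n_sentences,)
    weights = softmax(scores)
    return weights @ sentence_vecs, weights


def cross_bag_weights(bag_vecs, relation_query):
    """Weight whole bags by how well they match the relation query, so
    higher-quality entity pairs contribute more (a rough analogue of the
    bag-level attention described above)."""
    return softmax(bag_vecs @ relation_query)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dim = 8
    relation_query = rng.normal(size=dim)
    bags = [rng.normal(size=(n, dim)) for n in (3, 5)]   # two entity pairs
    bag_vecs, per_bag_weights = [], []
    for bag in bags:
        vec, w = bag_representation(bag, relation_query)
        bag_vecs.append(vec)
        per_bag_weights.append(w)
    bag_level = cross_bag_weights(np.stack(bag_vecs), relation_query)
    print("sentence weights per bag:", [w.round(2) for w in per_bag_weights])
    print("bag weights:", bag_level.round(2))
```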