Multi-hop Evidence Retrieval for Cross-document Relation Extraction
Relation Extraction (RE) has been extended to cross-document scenarios
because many relations are not simply described in a single document. This
inevitably brings the challenge of efficient open-space evidence retrieval to
support the inference of cross-document relations, along with the challenge of
multi-hop reasoning on top of entities and evidence scattered in an open set of
documents. To combat these challenges, we propose MR.COD (Multi-hop evidence
retrieval for Cross-document relation extraction), which is a multi-hop
evidence retrieval method based on evidence path mining and ranking. We explore
multiple variants of retrievers to show evidence retrieval is essential in
cross-document RE. We also propose a contextual dense retriever for this
setting. Experiments on CodRED show that evidence retrieval with MR.COD
effectively acquires cross-document evidence and boosts end-to-end RE
performance in both closed and open settings.
Comment: ACL 2023 (Findings)
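The core idea of evidence path mining, retrieving chains of passages that connect the head and tail entities through shared bridge entities, can be sketched as follows. This is an illustrative toy, not the MR.COD implementation: the corpus, entity annotations, and the length-based ranking heuristic are all assumptions (a real system would use a dense retriever to score passages).

```python
# Toy corpus: each passage lists the entities it mentions.
passages = {
    "p1": {"text": "A co-founded B.",       "entities": {"A", "B"}},
    "p2": {"text": "B is based in C.",      "entities": {"B", "C"}},
    "p3": {"text": "D wrote about sports.", "entities": {"D"}},
}

def mine_paths(head, tail, max_hops=2):
    """Enumerate passage paths from a head entity to a tail entity.
    Consecutive passages must share at least one bridge entity."""
    start = [p for p, v in passages.items() if head in v["entities"]]
    end = {p for p, v in passages.items() if tail in v["entities"]}
    paths = []
    for p in start:
        if p in end:
            paths.append([p])          # head and tail co-occur: 1-hop path
        if max_hops >= 2:
            for q in passages:
                if (q != p and q in end
                        and passages[p]["entities"] & passages[q]["entities"]):
                    paths.append([p, q])  # bridged 2-hop path
    return paths

def rank_paths(paths):
    # Placeholder scorer: prefer shorter paths; a contextual dense
    # retriever would rank by passage relevance instead.
    return sorted(paths, key=len)

print(rank_paths(mine_paths("A", "C")))  # [['p1', 'p2']]
```

The top-ranked path supplies the evidence context fed to the downstream relation classifier.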
A Relational Triple Extraction Method Based on Feature Reasoning for Technological Patents
Relation triple extraction methods based on table filling can address the
issues of relation overlap and bias propagation. However, most of them build a
separate table of features for each relation, which ignores the implicit
relationships between different entity pairs and between different relation
features. We therefore propose a feature-reasoning relational triple extraction
method based on table filling for technological patents, which integrates
entity recognition with entity-relation modeling to extract relational triples
from multi-source scientific and technological patent data. Compared with
previous methods, our approach has the following advantages: 1) its
table-filling scheme uses less memory, improving the speed and efficiency of
the model; 2) it reasons over implicit relationship features from the existing
token-pair and table-relation features, improving the accuracy of triple
extraction. We evaluated the proposed model on five benchmark datasets. The
results show that our model is effective and performs well on most of them.
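The table-filling formulation behind such methods can be sketched in a few lines: a 2D table over token pairs, where cell (i, j) holds the relation predicted between the entity headed at token i and the entity headed at token j, and triples are read off the filled table. The sentence, labels, and decoding scheme below are illustrative assumptions, not the paper's model.

```python
# Toy sentence and a filled table; cell (i, j) holds the relation
# between the entity starting at token i (head) and the entity
# starting at token j (tail); None means no relation.
tokens = ["Apple", "acquired", "Beats", "in", "Cupertino"]
table = [[None] * len(tokens) for _ in range(len(tokens))]
table[0][2] = "acquired"   # (Apple, acquired, Beats)
table[0][4] = "based_in"   # overlapping head entity: same row, second cell

def decode_triples(tokens, table):
    """Read off (head, relation, tail) triples from the filled table.
    Relation overlap is handled naturally: one head-token row can
    carry several labeled cells."""
    triples = []
    for i, row in enumerate(table):
        for j, rel in enumerate(row):
            if rel is not None:
                triples.append((tokens[i], rel, tokens[j]))
    return triples

print(decode_triples(tokens, table))
# [('Apple', 'acquired', 'Beats'), ('Apple', 'based_in', 'Cupertino')]
```

The proposed method's feature reasoning would additionally share information across cells and relation tables before decoding, rather than treating each table independently.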
DWIE: an entity-centric dataset for multi-task document-level information extraction
This paper presents DWIE, the 'Deutsche Welle corpus for Information
Extraction', a newly created multi-task dataset that combines four main
Information Extraction (IE) annotation subtasks: (i) Named Entity Recognition
(NER), (ii) Coreference Resolution, (iii) Relation Extraction (RE), and (iv)
Entity Linking. DWIE is conceived as an entity-centric dataset that describes
interactions and properties of conceptual entities on the level of the complete
document. This contrasts with currently dominant mention-driven approaches that
start from the detection and classification of named entity mentions in
individual sentences. Further, DWIE presents two main challenges when building
and evaluating IE models for it. First, using traditional mention-level
evaluation metrics for NER and RE on the entity-centric DWIE dataset can
result in measurements dominated by predictions on more frequently mentioned
entities. We tackle this issue by proposing a new entity-driven metric that
takes into account the number of mentions that compose each of the predicted
and ground truth entities. Second, the document-level multi-task annotations
require the models to transfer information between entity mentions located in
different parts of the document, as well as between different tasks, in a joint
learning setting. To realize this, we propose to use graph-based neural message
passing techniques between document-level mention spans. Our experiments show
an improvement of up to 5.5 F1 percentage points when incorporating neural
graph propagation into our joint model. This demonstrates DWIE's potential to
stimulate further research in graph neural networks for representation learning
in multi-task IE. We make DWIE publicly available at
https://github.com/klimzaporojets/DWIE
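The motivation for an entity-level metric can be illustrated with a minimal sketch. This is a simplified variant in which each entity, represented as a type plus the set of its mention strings, counts exactly once; it is an assumption for illustration, not DWIE's exact metric, which additionally accounts for the number of mentions composing each entity.

```python
def entity_f1(pred, gold):
    """F1 over whole entities rather than individual mentions.
    Each entity is a (type, frozenset-of-mentions) pair, so a
    frequently mentioned entity cannot dominate the score."""
    pred, gold = set(pred), set(gold)
    tp = len(pred & gold)
    p = tp / len(pred) if pred else 0.0
    r = tp / len(gold) if gold else 0.0
    return 2 * p * r / (p + r) if p + r else 0.0

# Hypothetical gold and predicted entity clusters.
gold = {("ORG", frozenset({"Deutsche Welle", "DW"})),
        ("LOC", frozenset({"Berlin"}))}
pred = {("ORG", frozenset({"Deutsche Welle", "DW"})),
        ("PER", frozenset({"Merkel"}))}

print(round(entity_f1(pred, gold), 2))  # 0.5
```

Under a mention-level metric, the two-mention ORG entity would contribute twice as much as the single-mention LOC entity; counting at the entity level removes that imbalance.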