Extracting Multiple-Relations in One-Pass with Pre-Trained Transformers
Most approaches to extracting multiple relations from a paragraph require
multiple passes over the paragraph. In practice, multiple passes are
computationally expensive, which makes it difficult to scale to longer
paragraphs and larger text corpora. In this work, we focus on the task of
multiple relation extraction by encoding the paragraph only once (one-pass). We
build our solution on pre-trained self-attentive (Transformer) models: we
first add a structured prediction layer to handle extraction for multiple
entity pairs, and then enhance the paragraph embedding with an entity-aware
attention technique so that it captures the relational information associated
with each entity. We show that our approach is not only scalable but also
achieves state-of-the-art performance on the standard ACE 2005 benchmark.
Comment: 7 pages
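The following is a minimal sketch of the one-pass idea: the paragraph is encoded once and the hidden states are reused to score every entity pair. It assumes PyTorch and HuggingFace Transformers; the bilinear pair scorer stands in for the paper's structured prediction layer, the entity-aware attention is not reproduced, and all names are illustrative.

import torch
import torch.nn as nn
from transformers import AutoModel

class OnePassRelationExtractor(nn.Module):
    def __init__(self, model_name="bert-base-cased", num_relations=7):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        # Bilinear scorer over entity-pair vectors; a stand-in for the
        # paper's structured prediction layer.
        self.scorer = nn.Bilinear(hidden, hidden, num_relations)

    def forward(self, input_ids, attention_mask, entity_spans):
        # Encode the paragraph exactly once; every pair reuses these states.
        states = self.encoder(input_ids=input_ids,
                              attention_mask=attention_mask).last_hidden_state
        # Mean-pool each entity span into one vector (batch size 1 for brevity).
        ents = [states[0, s:e].mean(dim=0) for s, e in entity_spans]
        scores = {}
        for i, head in enumerate(ents):
            for j, tail in enumerate(ents):
                if i != j:
                    scores[(i, j)] = self.scorer(head.unsqueeze(0),
                                                 tail.unsqueeze(0)).squeeze(0)
        return scores

Reusing the single encoding for every pair is what removes the per-pair passes that make multi-pass approaches expensive on long paragraphs.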
The ADAPT Enhanced Dependency Parser at the IWPT 2020 Shared Task
We describe the ADAPT system for the 2020 IWPT Shared Task on parsing
enhanced Universal Dependencies in 17 languages. We implement a pipeline
approach using UDPipe and UDPipe-future to provide initial levels of
annotation. The enhanced dependency graph is either produced by a graph-based
semantic dependency parser or is built from the basic tree using a small set of
heuristics. Our results show that, for the majority of languages, a semantic
dependency parser can be successfully applied to the task of parsing enhanced
dependencies.
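As a hedged illustration of the tree-to-graph fallback (the ADAPT system's actual heuristics, e.g. for conjunction propagation, are not detailed here), the simplest version copies each basic head/deprel pair into the DEPS column of a CoNLL-U sentence:

def basic_to_enhanced(conllu_sentence: str) -> str:
    # Copy each basic edge into the enhanced-dependencies column (DEPS).
    # This trivial heuristic is an assumption, not the system's exact rules.
    out_lines = []
    for line in conllu_sentence.splitlines():
        if line.startswith("#") or not line.strip():
            out_lines.append(line)
            continue
        cols = line.split("\t")
        if len(cols) == 10 and cols[6] != "_":
            # Column 9 (DEPS, index 8) holds "head:deprel" pairs.
            cols[8] = f"{cols[6]}:{cols[7]}"
        out_lines.append("\t".join(cols))
    return "\n".join(out_lines)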
Unfortunately, we did not ensure a connected graph as part of our pipeline
approach, and our competition submission relied on a last-minute fix to pass
the validation script, which significantly harmed our official evaluation scores.
Our submission ranked eighth in the official evaluation with a macro-averaged
coarse ELAS F1 of 67.23 and a treebank average of 67.49. We later implemented
our own graph-connecting fix, which raised the score to 79.53 (language
average) or 79.76 (treebank average) and would have placed fourth in the
competition evaluation.
Comment: Submitted to the 2020 IWPT shared task on parsing Enhanced Universal Dependencies
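The paper does not spell out the graph-connecting fix, but one plausible minimal version (an assumption, not the ADAPT implementation) re-attaches any component that is unreachable from the artificial root:

from collections import defaultdict

def connect_graph(num_tokens, edges, root=0):
    # edges: (head, dependent, label) triples over tokens 1..num_tokens,
    # with 0 as the artificial root node.
    children = defaultdict(list)
    for head, dep, _ in edges:
        children[head].append(dep)

    def mark_reachable(start, seen):
        stack = [start]
        seen.add(start)
        while stack:
            node = stack.pop()
            for child in children[node]:
                if child not in seen:
                    seen.add(child)
                    stack.append(child)

    reachable = set()
    mark_reachable(root, reachable)
    fixed = list(edges)
    for tok in range(1, num_tokens + 1):
        if tok not in reachable:
            # Attach one node of the stranded component to the root; its
            # descendants become reachable through their existing edges.
            fixed.append((root, tok, "root"))
            mark_reachable(tok, reachable)
    return fixed

A connectivity pass like this is what the shared task's validation script checks for; the main design decision is the attachment point (the root versus a nearby content word).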