End-to-end Neural Coreference Resolution
We introduce the first end-to-end coreference resolution model and show that
it significantly outperforms all previous work without using a syntactic parser
or hand-engineered mention detector. The key idea is to directly consider all
spans in a document as potential mentions and learn distributions over possible
antecedents for each. The model computes span embeddings that combine
context-dependent boundary representations with a head-finding attention
mechanism. It is trained to maximize the marginal likelihood of gold antecedent
spans from coreference clusters and is factored to enable aggressive pruning of
potential mentions. Experiments demonstrate state-of-the-art performance, with
a gain of 1.5 F1 on the OntoNotes benchmark, rising to 3.1 F1 with a 5-model
ensemble, despite the fact that this is the first approach to be successfully
trained with no external resources.

Comment: Accepted to EMNLP 2017
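The abstract's two core ideas, a span representation built from boundary states plus head-finding attention, and a training objective that marginalizes over gold antecedents, can be sketched compactly. Below is a minimal, illustrative PyTorch rendering under assumed names: `SpanEmbedder`, `head_scorer`, and `marginal_log_likelihood` are hypothetical, the token states are taken to be BiLSTM-style outputs of size `hidden_dim`, and the single-layer attention scorer is a simplification of the paper's architecture, not its exact configuration.

```python
import torch
import torch.nn as nn

class SpanEmbedder(nn.Module):
    """Sketch of the span representation described in the abstract:
    context-dependent boundary states concatenated with an
    attention-weighted "soft head word" over the span's tokens."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        # One unnormalized score per token; a softmax over the tokens
        # inside a span approximates head-finding attention.
        self.head_scorer = nn.Linear(hidden_dim, 1)

    def forward(self, token_states: torch.Tensor, start: int, end: int) -> torch.Tensor:
        # token_states: (seq_len, hidden_dim), e.g. BiLSTM outputs.
        span = token_states[start : end + 1]                      # (width, hidden_dim)
        attn = torch.softmax(self.head_scorer(span).squeeze(-1), dim=0)
        head = attn @ span                                        # weighted head representation
        # Boundary representations + soft head word -> (3 * hidden_dim,)
        return torch.cat([token_states[start], token_states[end], head], dim=-1)


def marginal_log_likelihood(antecedent_scores: torch.Tensor,
                            gold_mask: torch.Tensor) -> torch.Tensor:
    """Sketch of the training objective: the log marginal likelihood of
    all gold antecedents of a span (negate it to obtain a loss).

    antecedent_scores: (num_candidates,) unnormalized candidate scores.
    gold_mask: boolean mask marking candidates in the same gold cluster.
    """
    log_probs = torch.log_softmax(antecedent_scores, dim=0)
    # Marginalize (log-sum-exp) over every gold antecedent.
    return torch.logsumexp(log_probs[gold_mask], dim=0)
```

In the paper the model is factored so that most candidate spans can be pruned aggressively before antecedent scoring; the sketch above omits pruning and the dummy "no antecedent" option and shows only the span embedding and the marginalized objective.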