Multi-Grained Named Entity Recognition
This paper presents MGNER, a novel framework for Multi-Grained Named Entity Recognition, in which multiple entities or entity mentions in a sentence may be non-overlapping or fully nested. Unlike traditional approaches, which treat NER as a sequential labeling task and annotate entities consecutively, MGNER detects and recognizes entities at multiple granularities: it can recognize named entities without explicitly assuming non-overlapping or fully nested structures. MGNER consists of a Detector that examines all possible word segments and a Classifier that categorizes entities. In addition, contextual information and a self-attention mechanism are used throughout the framework to improve NER performance. Experimental results show that MGNER outperforms current state-of-the-art baselines by up to 4.4% F1 on nested and non-overlapping NER tasks.
Comment: In ACL 2019 as a long paper
Bipartite Flat-Graph Network for Nested Named Entity Recognition
In this paper, we propose a novel bipartite flat-graph network (BiFlaG) for nested named entity recognition (NER), which contains two subgraph modules: a flat NER module for outermost entities and a graph module for all entities located in inner layers. A bidirectional LSTM (BiLSTM) and a graph convolutional network (GCN) are adopted to jointly learn flat entities and their inner dependencies. Unlike previous models, which consider only the unidirectional delivery of information from innermost layers to outer ones (or outside-to-inside), our model effectively captures the bidirectional interaction between them. We first use the entities recognized by the flat NER module to construct an entity graph, which is fed to the graph module. The richer representation learned by the graph module carries the dependencies of inner entities and can be exploited to improve outermost entity predictions. Experimental results on three standard nested NER datasets demonstrate that our BiFlaG outperforms previous state-of-the-art models.
Comment: Accepted by ACL 2020
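The graph module's core operation can be sketched as a single graph-convolution step over an entity graph: node features (here, inner-entity candidates) are updated by aggregating neighbor features. The adjacency matrix, features, and weights below are toy values chosen for illustration, not the paper's configuration.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN step: H' = ReLU(D^-1 (A + I) H W).

    A is an adjacency matrix over entity nodes, H the node features,
    W a learned weight matrix; self-loops and row normalization follow
    the standard GCN recipe (an assumption, not BiFlaG's exact variant).
    """
    A_hat = A + np.eye(A.shape[0])             # add self-loops
    D_inv = np.diag(1.0 / A_hat.sum(axis=1))   # degree normalization
    return np.maximum(D_inv @ A_hat @ H @ W, 0.0)

A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)         # 3 entity nodes in a chain
H = np.eye(3)                                  # one-hot node features
W = np.full((3, 2), 0.5)                       # toy weight matrix
H_new = gcn_layer(A, H, W)                     # shape (3, 2)
```

After one step, each node's representation mixes in its neighbors' features; stacking such layers is what lets inner-entity dependencies propagate back toward the outermost-entity predictions.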
Named Entity Recognition as Dependency Parsing
Named Entity Recognition (NER) is a fundamental task in Natural Language Processing, concerned with identifying spans of text that express references to entities. NER research often focuses on flat entities only (flat NER), ignoring the fact that entity references can be nested, as in [Bank of [China]] (Finkel and Manning, 2009). In this paper, we use ideas from graph-based dependency parsing to give our model a global view of the input via a biaffine model (Dozat and Manning, 2017). The biaffine model scores pairs of start and end tokens in a sentence, which we use to explore all spans, so that the model can predict named entities accurately. We show that the model works well for both nested and flat NER: evaluated on 8 corpora, it achieves SoTA performance on all of them, with accuracy gains of up to 2.2 percentage points.
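The span-scoring idea above can be sketched with a biaffine form in the style of Dozat and Manning (2017): every (start, end) token pair receives one score per entity label via a bilinear product of start and end representations. Shapes, names, and the random features below are assumptions for the example, not the paper's exact parameterization.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, labels = 5, 8, 3                    # tokens, hidden size, entity types
H_start = rng.normal(size=(n, d))         # start-of-span representations
H_end = rng.normal(size=(n, d))           # end-of-span representations
U = rng.normal(size=(labels, d, d))       # biaffine tensor, one slice per label

# scores[l, i, j] = H_start[i] @ U[l] @ H_end[j]
scores = np.einsum("id,ldk,jk->lij", H_start, U, H_end)

# Only spans with start <= end are valid: keep the upper triangle.
valid = np.triu(np.ones((n, n), dtype=bool))
best_label = scores.argmax(axis=0)        # highest-scoring label per span
```

Because every start-end pair is scored independently, nested spans such as (0, 2) and (2, 2) get their own predictions, which is why the same mechanism covers both flat and nested NER.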