Multi-Grained Named Entity Recognition
This paper presents MGNER, a novel framework for Multi-Grained Named Entity Recognition, in which multiple entities or entity mentions in a sentence may be non-overlapping or fully nested. Unlike traditional approaches, which treat NER as a sequence labeling task and annotate entities consecutively, MGNER detects and recognizes entities at multiple granularities: it can recognize named entities without explicitly assuming non-overlapping or fully nested structures. MGNER consists of a Detector that examines all possible word segments and a Classifier that categorizes entities. In addition, contextual information and a self-attention mechanism are used throughout the framework to improve NER performance. Experimental results show that MGNER outperforms current state-of-the-art baselines by up to 4.4% F1 on nested and non-overlapping NER tasks. Comment: In ACL 2019 as a long paper
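The Detector/Classifier decomposition described in the abstract can be illustrated with a minimal sketch. The span enumeration and the two-stage control flow follow the abstract; the `detect` and `classify` callables below are hypothetical stand-ins for MGNER's neural modules, not the paper's actual implementation.

```python
def enumerate_spans(tokens, max_len=4):
    """All candidate word segments up to max_len, as (start, end) inclusive."""
    return [(i, j) for i in range(len(tokens))
            for j in range(i, min(i + max_len, len(tokens)))]

def recognize(tokens, detect, classify):
    """Stage 1 (Detector) filters candidate spans; stage 2 (Classifier)
    labels the survivors. Because each span is scored independently,
    nested and non-overlapping entities can both be returned."""
    entities = []
    for start, end in enumerate_spans(tokens):
        if detect(tokens, start, end):            # is this span an entity?
            label = classify(tokens, start, end)  # if so, which type?
            entities.append((start, end, label))
    return entities

# Toy stand-ins for the learned modules: a lookup table of gold spans,
# including a LOC nested inside an ORG.
tokens = ["University", "of", "Washington"]
gold = {(0, 2): "ORG", (2, 2): "LOC"}
found = recognize(tokens,
                  detect=lambda t, s, e: (s, e) in gold,
                  classify=lambda t, s, e: gold[(s, e)])
# found contains both the outer ORG span and the nested LOC span
```

Enumerating every segment up to a length cap keeps the candidate set quadratic-but-bounded, which is what lets span-level approaches sidestep the flat-only limitation of sequence labeling.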
Bipartite Flat-Graph Network for Nested Named Entity Recognition
In this paper, we propose a novel bipartite flat-graph network (BiFlaG) for nested named entity recognition (NER), which contains two subgraph modules: a flat NER module for outermost entities and a graph module for all entities located in inner layers. A bidirectional LSTM (BiLSTM) and a graph convolutional network (GCN) are adopted to jointly learn flat entities and their inner dependencies. Unlike previous models, which consider only the unidirectional delivery of information from innermost layers to outer ones (or outside-to-inside), our model effectively captures the bidirectional interaction between them. We first use the entities recognized by the flat NER module to construct an entity graph, which is fed to the graph module. The richer representation learned by the graph module carries the dependencies of inner entities and can be exploited to improve outermost entity predictions. Experimental results on three standard nested NER datasets demonstrate that BiFlaG outperforms previous state-of-the-art models. Comment: Accepted by ACL 2020
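The outer-then-inner decomposition in the abstract can be sketched as follows. The module interfaces are hypothetical placeholders for the BiLSTM flat module and the GCN graph module, and the sketch covers only the first direction of information flow; the bidirectional refinement of outermost predictions that distinguishes BiFlaG is not reproduced here.

```python
def nested_ner(tokens, flat_ner, inner_ner):
    """Outer-then-inner sketch: a flat pass finds outermost entities,
    then an inner pass searches within each outermost span.
    (flat_ner and inner_ner are stand-ins for learned modules.)"""
    outer = flat_ner(tokens)                   # [(start, end, label), ...]
    inner = []
    for start, end, _ in outer:
        inner.extend(inner_ner(tokens, start, end))  # entities nested inside
    return outer + inner

# Toy stand-ins for the two modules:
tokens = ["Bank", "of", "New", "York"]
outermost = lambda t: [(0, 3, "ORG")]
within = lambda t, s, e: [(2, 3, "LOC")]       # "New York" nested in the ORG
entities = nested_ner(tokens, outermost, within)
# entities holds the outermost ORG plus the nested LOC
```

In the full model, the inner-layer representations would in turn feed back to sharpen the outermost predictions, which is the bidirectional interaction the paper emphasizes.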