Syntax-aware Neural Semantic Role Labeling with Supertags
We introduce a new syntax-aware model for dependency-based semantic role
labeling that outperforms syntax-agnostic models for English and Spanish. We
use a BiLSTM to tag the text with supertags extracted from dependency parses,
and we feed these supertags, along with words and parts of speech, into a deep
highway BiLSTM for semantic role labeling. Our model combines the strengths of
earlier models that performed SRL on the basis of a full dependency parse with
more recent models that use no syntactic information at all. Our local and
non-ensemble model achieves state-of-the-art performance on the CoNLL 2009
English and Spanish datasets. SRL models benefit from syntactic information,
and we show that supertagging is a simple, powerful, and robust way to
incorporate syntax into a neural SRL system.
Comment: NAACL 2019; added Spanish ELMo result.
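The abstract outlines the architecture only at a high level. The sketch below is a minimal PyTorch illustration of that pipeline under assumed names and hyperparameters (Supertagger, HighwayBiLSTMSRL, embedding sizes, depth), not the authors' implementation: a BiLSTM predicts a supertag per token, and the predicted tags are embedded and concatenated with word and POS embeddings before passing through a stack of BiLSTM layers linked by highway (gated residual) connections that produce per-token role scores.

```python
# Minimal sketch of the described pipeline in PyTorch. All names and
# hyperparameters here are illustrative assumptions, not the authors' code.
import torch
import torch.nn as nn

class Supertagger(nn.Module):
    """BiLSTM tagger that predicts a supertag for every token."""
    def __init__(self, vocab_size, n_supertags, emb_dim=100, hidden=200):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.bilstm = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden, n_supertags)

    def forward(self, word_ids):
        h, _ = self.bilstm(self.emb(word_ids))
        return self.out(h).argmax(dim=-1)  # predicted supertag id per token

class HighwayBiLSTMSRL(nn.Module):
    """Deep BiLSTM with highway (gated residual) connections for SRL."""
    def __init__(self, vocab_size, n_pos, n_supertags, n_roles,
                 emb_dim=100, hidden=300, depth=4):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, emb_dim)
        self.pos_emb = nn.Embedding(n_pos, emb_dim)
        self.tag_emb = nn.Embedding(n_supertags, emb_dim)
        in_dim = 3 * emb_dim
        self.proj = nn.Linear(in_dim, 2 * hidden)  # match dims for the first highway step
        self.lstms = nn.ModuleList(
            nn.LSTM(in_dim if i == 0 else 2 * hidden, hidden,
                    batch_first=True, bidirectional=True)
            for i in range(depth))
        self.gates = nn.ModuleList(nn.Linear(2 * hidden, 2 * hidden) for _ in range(depth))
        self.role_out = nn.Linear(2 * hidden, n_roles)

    def forward(self, word_ids, pos_ids, supertag_ids):
        # Concatenate word, POS, and supertag embeddings for each token.
        x = torch.cat([self.word_emb(word_ids),
                       self.pos_emb(pos_ids),
                       self.tag_emb(supertag_ids)], dim=-1)
        prev = self.proj(x)
        for i, (lstm, gate) in enumerate(zip(self.lstms, self.gates)):
            h, _ = lstm(x if i == 0 else prev)
            g = torch.sigmoid(gate(h))
            prev = g * h + (1 - g) * prev  # highway connection between layers
        return self.role_out(prev)  # per-token semantic role scores
```

In use, the supertagger's predictions would be computed first and passed to the SRL model, mirroring the two-stage setup the abstract describes.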
End-to-end Graph-based TAG Parsing with Neural Networks
We present a graph-based Tree Adjoining Grammar (TAG) parser that uses
BiLSTMs, highway connections, and character-level CNNs. Our best end-to-end
parser, which jointly performs supertagging, POS tagging, and parsing,
outperforms the previously reported best results by more than 2.2 LAS and UAS
points. The graph-based parsing architecture allows for global inference and
rich feature representations for TAG parsing, alleviating the fundamental
trade-off between transition-based and graph-based parsing systems. We also
demonstrate that the proposed parser achieves state-of-the-art performance in
the downstream tasks of Parsing Evaluation using Textual Entailments (PETE) and
Unbounded Dependency Recovery. This provides further support for the claim that
TAG is a viable formalism for problems that require rich structural analysis of
sentences.
Comment: NAACL 2018.
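As with the first abstract, the parser is only outlined in prose. The sketch below shows one way such a joint model could be wired up in PyTorch: character-level CNN features and word embeddings feed a BiLSTM whose states drive separate heads for POS tagging, supertagging, and head-dependent arc scoring over all token pairs. The bilinear arc scorer, dimensions, and class name are assumptions; the paper's exact scoring function and the global decoder over arc scores are not reproduced here.

```python
# Minimal sketch (PyTorch) of a joint graph-based parser in the spirit of the
# abstract. Names, dimensions, and the bilinear arc scorer are assumptions.
import torch
import torch.nn as nn

class GraphTAGParser(nn.Module):
    def __init__(self, vocab_size, n_chars, n_pos, n_supertags,
                 word_dim=100, char_dim=30, hidden=300):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, word_dim)
        self.char_emb = nn.Embedding(n_chars, char_dim)
        self.char_cnn = nn.Conv1d(char_dim, char_dim, kernel_size=3, padding=1)
        self.bilstm = nn.LSTM(word_dim + char_dim, hidden,
                              num_layers=2, batch_first=True, bidirectional=True)
        # Separate heads for the jointly trained tasks.
        self.pos_head = nn.Linear(2 * hidden, n_pos)
        self.supertag_head = nn.Linear(2 * hidden, n_supertags)
        self.arc_bilinear = nn.Bilinear(2 * hidden, 2 * hidden, 1)

    def forward(self, word_ids, char_ids):
        # char_ids: (batch, seq_len, max_word_len)
        b, t, c = char_ids.shape
        chars = self.char_emb(char_ids.view(b * t, c)).transpose(1, 2)
        char_repr = self.char_cnn(chars).max(dim=2).values.view(b, t, -1)
        h, _ = self.bilstm(torch.cat([self.word_emb(word_ids), char_repr], dim=-1))
        # Score every (dependent i, head j) pair; a graph-based decoder would
        # then perform global inference over this full score matrix.
        dep = h.unsqueeze(2).expand(b, t, t, h.size(-1)).contiguous()
        head = h.unsqueeze(1).expand(b, t, t, h.size(-1)).contiguous()
        arc_scores = self.arc_bilinear(dep, head).squeeze(-1)
        return self.pos_head(h), self.supertag_head(h), arc_scores
```

Scoring all head-dependent pairs at once is what distinguishes the graph-based setup from a transition-based parser, which commits to attachments incrementally.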