Concurrent Parsing of Constituency and Dependency
Constituent and dependency representations of syntactic structure share many
linguistic and computational characteristics. This paper therefore makes the
first attempt to introduce a new model capable of parsing constituents and
dependencies at the same time, letting either parser enhance the other. In
particular, we evaluate the effect of different shared network components and
empirically verify that dependency parsing may benefit much more from
constituent parsing structure.
The proposed parser achieves new state-of-the-art performance for both
parsing tasks, constituent and dependency, on the PTB and CTB benchmarks.
Comment: arXiv admin note: text overlap with arXiv:1907.0268
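As a rough illustration of the shared-component idea described in this abstract, the sketch below wires one shared BiLSTM encoder to a constituent span-scoring head and a biaffine-style dependency arc-scoring head. It is a minimal sketch under assumed names and dimensions (JointSyntaxScorer, the scorer layers, all hyperparameters), not the paper's actual architecture.

```python
# Illustrative sketch only: a shared encoder feeding separate constituent-span
# and dependency-arc scoring heads, as one plausible reading of "joint parsing
# with shared components". Names and sizes are hypothetical, not from the paper.
import torch
import torch.nn as nn


class JointSyntaxScorer(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, hidden=200, num_labels=30):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Shared BiLSTM encoder: the component both parsers reuse.
        self.encoder = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        # Constituent head: scores a label for every candidate span (i, j).
        self.span_scorer = nn.Linear(4 * hidden, num_labels)
        # Dependency head: biaffine-style scorer over head/dependent pairs.
        self.arc_head = nn.Linear(2 * hidden, hidden)
        self.arc_dep = nn.Linear(2 * hidden, hidden)
        self.arc_bilinear = nn.Parameter(torch.randn(hidden, hidden) * 0.01)

    def forward(self, token_ids):
        h, _ = self.encoder(self.embed(token_ids))          # (B, T, 2*hidden)
        B, T, _ = h.shape
        # Span representation: concatenation of the two endpoint vectors.
        span_repr = torch.cat(
            [h.unsqueeze(2).expand(B, T, T, -1), h.unsqueeze(1).expand(B, T, T, -1)],
            dim=-1,
        )
        span_scores = self.span_scorer(span_repr)            # (B, T, T, num_labels)
        # Arc scores: score[b, i, j] = plausibility of token j heading token i.
        dep = self.arc_dep(h)                                 # (B, T, hidden)
        head = self.arc_head(h)                               # (B, T, hidden)
        arc_scores = dep @ self.arc_bilinear @ head.transpose(1, 2)  # (B, T, T)
        return span_scores, arc_scores


# Joint training would sum a constituent loss and a dependency loss computed
# from these two score tensors over the same shared encoder states.
model = JointSyntaxScorer(vocab_size=1000)
spans, arcs = model(torch.randint(0, 1000, (2, 7)))
print(spans.shape, arcs.shape)
```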
Head-Driven Phrase Structure Grammar Parsing on Penn Treebank
Head-driven phrase structure grammar (HPSG) enjoys a uniform formalism
representing rich contextual syntactic and even semantic meanings. This paper
makes the first attempt to formulate a simplified HPSG by integrating
constituent and dependency formal representations into head-driven phrase
structure. Two parsing algorithms are then proposed, one for each of the two
converted tree representations: division span and joint span. As HPSG encodes
both constituent and dependency structure information, the proposed HPSG
parsers may be regarded as a sort of joint decoder for both types of structures
and thus are evaluated in terms of extracted or converted constituent and
dependency parsing trees. Our parser achieves new state-of-the-art performance
for both parsing tasks on Penn Treebank (PTB) and Chinese Penn Treebank,
verifying the effectiveness of jointly learning constituent and dependency
structures. In detail, we report 96.33 F1 for constituent parsing and 97.20%
UAS for dependency parsing on PTB.
Comment: Accepted by ACL 201
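One way to picture the joint span idea, under the assumption that each constituent node also records the head word it projects, is the toy data structure below; the field names and the arc-extraction helper are illustrative, not the paper's formulation.

```python
# A minimal sketch of one tree node carrying both kinds of structure: a labeled
# constituent span plus the head word inside it, from which dependency arcs can
# be read off. Field names are illustrative only.
from dataclasses import dataclass, field
from typing import List


@dataclass
class JointSpanNode:
    label: str                      # constituent label, e.g. "NP"
    start: int                      # span start (token index, inclusive)
    end: int                        # span end (token index, exclusive)
    head: int                       # index of the head word inside the span
    children: List["JointSpanNode"] = field(default_factory=list)


def dependency_arcs(node: JointSpanNode):
    """Read dependency arcs off the joint tree: each child's head attaches
    to the parent's head unless they are the same token."""
    arcs = []
    for child in node.children:
        if child.head != node.head:
            arcs.append((node.head, child.head))   # (head, dependent)
        arcs.extend(dependency_arcs(child))
    return arcs


# Toy example: "She reads books" with "reads" (index 1) heading the sentence.
tree = JointSpanNode("S", 0, 3, head=1, children=[
    JointSpanNode("NP", 0, 1, head=0),
    JointSpanNode("VP", 1, 3, head=1, children=[
        JointSpanNode("NP", 2, 3, head=2),
    ]),
])
print(dependency_arcs(tree))  # [(1, 0), (1, 2)]
```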
Mimic and Conquer: Heterogeneous Tree Structure Distillation for Syntactic NLP
Syntax has been shown to be useful for various NLP tasks, yet existing work
mostly encodes a single syntactic tree with one hierarchical neural network.
In this paper, we investigate a simple and effective method, Knowledge
Distillation, to integrate heterogeneous structure knowledge into a unified
sequential LSTM encoder. Experimental results on four typical syntax-dependent
tasks show that our method outperforms tree encoders by effectively integrating
rich heterogeneous structural syntax while reducing error propagation, and also
outperforms ensemble methods in terms of both efficiency and accuracy.
Comment: To appear at EMNLP202
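The distillation recipe the abstract describes can be pictured roughly as follows: a sequential LSTM student is trained against gold labels plus the softened predictions of heterogeneous tree-encoder teachers. The sketch below stubs the teachers out as precomputed logits; the loss weighting, temperature, and names are assumptions rather than the paper's settings.

```python
# Illustrative sketch of generic knowledge distillation into a sequential LSTM
# student. Teachers (e.g. constituency/dependency tree encoders) are represented
# only by their output logits; weights and names are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LSTMStudent(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, hidden=200, num_classes=5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.classify = nn.Linear(2 * hidden, num_classes)

    def forward(self, token_ids):
        h, _ = self.lstm(self.embed(token_ids))
        return self.classify(h.mean(dim=1))       # sentence-level logits


def distillation_loss(student_logits, teacher_logits_list, gold, alpha=0.5, T=2.0):
    """Hard-label loss plus an averaged soft-label (KL) loss against every
    heterogeneous tree-encoder teacher."""
    hard = F.cross_entropy(student_logits, gold)
    soft = sum(
        F.kl_div(
            F.log_softmax(student_logits / T, dim=-1),
            F.softmax(t / T, dim=-1),
            reduction="batchmean",
        ) * (T * T)
        for t in teacher_logits_list
    ) / len(teacher_logits_list)
    return alpha * hard + (1 - alpha) * soft


# Usage with fake teacher outputs standing in for the tree encoders.
student = LSTMStudent(vocab_size=1000)
logits = student(torch.randint(0, 1000, (4, 12)))
teachers = [torch.randn(4, 5), torch.randn(4, 5)]
loss = distillation_loss(logits, teachers, gold=torch.randint(0, 5, (4,)))
loss.backward()
```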
Features for Phrase-Structure Reranking from Dependency Parses
Radically different approaches have proved effective for phrase-structure and dependency parsers over the last decade. Here, we aim to exploit the divergence between these approaches and show the utility of features extracted from automatic dependency parses of sentences for a discriminative phrase-structure parser. Our experiments show a significant improvement over the state-of-the-art German discriminative constituent parser.
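As a loose illustration of how dependency-parse information might feed a phrase-structure reranker, the sketch below computes one simple candidate-level feature: the fraction of a candidate tree's constituent spans that are coherent with the dependency tree. The feature definition and helper names are hypothetical, not taken from the paper.

```python
# Hedged sketch: derive a reranking feature for a candidate phrase-structure
# tree from an automatic dependency parse. The specific feature here (span
# coherence with the dependency tree) is an illustrative assumption.
from typing import List, Tuple

DepParse = List[int]        # heads[i] = index of the head of token i, -1 = root
Span = Tuple[int, int]      # candidate constituent span [start, end)


def external_heads(span: Span, heads: DepParse) -> int:
    """Count tokens inside the span whose dependency head lies outside it."""
    start, end = span
    return sum(1 for i in range(start, end) if not (start <= heads[i] < end))


def coherence_feature(candidate_spans: List[Span], heads: DepParse) -> float:
    """Single feature for one candidate tree: fraction of its spans that are
    dependency-coherent (exactly one head coming from outside the span)."""
    coherent = sum(1 for s in candidate_spans if external_heads(s, heads) == 1)
    return coherent / max(len(candidate_spans), 1)


# Toy usage: "She reads old books", all heads pointing toward "reads" (index 1).
heads = [1, -1, 3, 1]
spans_of_candidate = [(0, 1), (2, 4), (1, 4), (0, 4)]
print(coherence_feature(spans_of_candidate, heads))  # 1.0: fully coherent tree
```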