Exact Inference for Generative Probabilistic Non-Projective Dependency Parsing
We describe a generative model for non-projective dependency parsing based on a simplified version of a transition system that has recently appeared in the literature. We then develop a dynamic programming parsing algorithm for our model, and derive an inside-outside algorithm that can be used for unsupervised learning of non-projective dependency trees.
Elimination of Spurious Ambiguity in Transition-Based Dependency Parsing
We present a novel technique to remove spurious ambiguity from transition systems for dependency parsing. Our technique chooses a canonical sequence of transition operations (computation) for a given dependency tree. It can be applied to a large class of bottom-up transition systems, including for instance Nivre (2004) and Attardi (2006).
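The canonical-sequence idea can be illustrated for the arc-standard system: among the many transition sequences that derive the same projective tree, a static oracle fixes one by a deterministic tie-breaking rule. The sketch below (function name and the "prefer an arc over SHIFT" rule are illustrative assumptions, not necessarily the paper's exact construction) maps a gold tree to its canonical computation.

```python
def canonical_sequence(heads):
    """Canonical arc-standard transition sequence for a projective tree.

    heads[i] is the head of token i (tokens are 1..n); heads[i] == 0
    marks the root. The oracle always prefers an arc action over SHIFT,
    which fixes exactly one computation per tree.
    """
    n = len(heads) - 1
    pending = [0] * (n + 1)              # dependents not yet attached
    for d in range(1, n + 1):
        pending[heads[d]] += 1
    stack, buffer = [0], list(range(1, n + 1))   # node 0 is ROOT
    seq = []
    while buffer or len(stack) > 1:
        if len(stack) >= 2:
            s1, s0 = stack[-2], stack[-1]
            if heads[s0] == s1 and pending[s0] == 0:
                seq.append("RIGHT-ARC")   # attach s0 under s1
                stack.pop()
                pending[s1] -= 1
                continue
            if s1 != 0 and heads[s1] == s0 and pending[s1] == 0:
                seq.append("LEFT-ARC")    # attach s1 under s0
                del stack[-2]
                pending[s0] -= 1
                continue
        seq.append("SHIFT")
        stack.append(buffer.pop(0))
    return seq
```

For heads = [None, 2, 0, 2] (token 2 is the root and heads tokens 1 and 3), the oracle emits SHIFT, SHIFT, LEFT-ARC, SHIFT, RIGHT-ARC, RIGHT-ARC: exactly 2n transitions, as arc-standard requires.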
The CoNLL 2007 shared task on dependency parsing
The Conference on Computational Natural Language Learning features a shared task, in which participants train and test their learning systems on the same data sets. In 2007, as in 2006, the shared task has been devoted to dependency parsing, this year with both a multilingual track and a domain adaptation track. In this paper, we define the tasks of the different tracks and describe how the data sets were created from existing treebanks for ten languages. In addition, we characterize the different approaches of the participating systems, report the test results, and provide a first analysis of these results.
Differentiable Perturb-and-Parse: Semi-Supervised Parsing with a Structured Variational Autoencoder
Human annotation for syntactic parsing is expensive, and large resources are available only for a fraction of languages. A question we ask is whether one can leverage abundant unlabeled texts to improve syntactic parsers, beyond just using the texts to obtain more generalisable lexical features (i.e. beyond word embeddings). To this end, we propose a novel latent-variable generative model for semi-supervised syntactic dependency parsing. As exact inference is intractable, we introduce a differentiable relaxation to obtain approximate samples and compute gradients with respect to the parser parameters. Our method (Differentiable Perturb-and-Parse) relies on differentiable dynamic programming over stochastically perturbed edge scores. We demonstrate the effectiveness of our approach with experiments on English, French and Swedish. Comment: Accepted at ICLR 201
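The sampling step behind perturb-and-parse can be sketched as follows. Each arc score is perturbed with independent Gumbel noise, and the hard argmax over candidate heads is relaxed to a temperature-controlled softmax so gradients flow back to the scores. This simplified sketch (function name and the per-dependent relaxation are assumptions for illustration) relaxes each dependent's head choice independently; the paper instead runs a differentiable dynamic program so the relaxed output respects the tree constraint.

```python
import numpy as np

def perturb_and_parse_soft(scores, rng, temperature=1.0):
    """Relaxed sample of head attachments from perturbed edge scores.

    scores[h, d] scores the arc from head h to dependent d (row 0 is
    the root). Returns a matrix whose columns are soft head
    distributions, one per dependent.
    """
    u = rng.uniform(low=1e-9, high=1.0, size=scores.shape)
    gumbel = -np.log(-np.log(u))                 # Gumbel(0, 1) noise
    perturbed = (scores + gumbel) / temperature
    z = perturbed - perturbed.max(axis=0, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=0, keepdims=True)      # column-wise softmax
```

As temperature approaches 0, each column approaches a one-hot head choice, recovering a hard (Gumbel-max) sample; larger temperatures give smoother, lower-variance relaxations.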