Synchronous Context-Free Grammars and Optimal Linear Parsing Strategies
Synchronous Context-Free Grammars (SCFGs), also known as syntax-directed
translation schemata, are unlike context-free grammars in that they do not have
a binary normal form. In general, parsing with SCFGs takes space and time
polynomial in the length of the input strings, but with the degree of the
polynomial depending on the permutations of the SCFG rules. We consider linear
parsing strategies, which add one nonterminal at a time. We show that for a
given input permutation, the problems of finding a linear parsing strategy
with minimum space complexity and with minimum time complexity are both
NP-hard.
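The abstract's setup can be made concrete with a small sketch. Below, a permutation encodes how source nonterminals map to target positions, a linear strategy is an order in which nonterminals are added, and the cost of a strategy is approximated by the number of contiguous spans the partial analysis covers on each side (a simplified stand-in for the fan-out that drives the polynomial degree; the exact cost measure is an assumption here, not the paper's definition). Brute-forcing over all orders illustrates the optimization problem that the paper shows is NP-hard in general:

```python
from itertools import permutations as orders

def spans(positions):
    """Count maximal contiguous runs in a set of integer positions."""
    s = sorted(positions)
    return sum(1 for i, p in enumerate(s) if i == 0 or p != s[i - 1] + 1)

def strategy_cost(perm, order):
    """Worst intermediate span count (source + target) of a linear
    strategy.  perm[i] is the target position of source nonterminal i;
    `order` adds one nonterminal at a time."""
    src, tgt, worst = set(), set(), 0
    for i in order:
        src.add(i)
        tgt.add(perm[i])
        worst = max(worst, spans(src) + spans(tgt))
    return worst

def best_linear_strategy(perm):
    """Exhaustive search over all linear strategies.  Exponential in the
    rule length -- feasible only for tiny permutations, consistent with
    the NP-hardness of the general problem."""
    return min((strategy_cost(perm, o), o)
               for o in orders(range(len(perm))))
```

For the identity permutation every order keeps both sides contiguous (cost 2), while the permutation (2, 0, 3, 1) forces any strategy through a state with a discontinuity on one side (cost 3).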
Strictly Breadth-First AMR Parsing
AMR parsing is the task of automatically mapping a sentence to an AMR
semantic graph. We focus on the recently proposed breadth-first strategy for
this task, which achieves better performance than other strategies.
However, current models under this strategy only \emph{encourage} the model to
produce the AMR graph in breadth-first order, but \emph{cannot guarantee} this.
To solve this problem, we propose a new architecture that \emph{guarantees}
that the parsing will strictly follow the breadth-first order. In each parsing
step, we introduce a \textbf{focused parent} vertex and use this vertex to
guide the generation. With the help of this new architecture and some other
improvements in the sentence and graph encoder, our model obtains better
performance on both the AMR 1.0 and AMR 2.0 datasets.
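The guarantee described above can be sketched with a plain FIFO queue of focused parents: if the decoder may only attach children to the vertex at the head of the queue, breadth-first order is enforced by construction rather than merely encouraged. The sketch below is not the paper's neural architecture; `expand` is a hypothetical stand-in for the model's decoding step, and reentrant AMR edges are ignored for simplicity:

```python
from collections import deque

def generate_bfs(root, expand):
    """Strictly breadth-first graph generation.

    `expand(parent)` returns the children to attach to the focused
    parent vertex.  Because new vertices may only be attached to the
    vertex popped from the front of the queue, the output order is
    guaranteed to be breadth-first."""
    order, queue = [root], deque([root])
    while queue:
        focus = queue.popleft()   # the focused parent vertex
        for child in expand(focus):
            order.append(child)
            queue.append(child)
    return order
```

For example, with a toy tree `{"want-01": ["b", "g"], "g": ["b2"]}`, `generate_bfs("want-01", lambda v: tree.get(v, []))` emits all children of the root before any grandchild.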