275 research outputs found
Semantic Graph Parsing with Recurrent Neural Network DAG Grammars
Semantic parses are directed acyclic graphs (DAGs), so semantic parsing
should be modeled as graph prediction. But predicting graphs presents difficult
technical challenges, so it is simpler and more common to predict the
linearized graphs found in semantic parsing datasets using well-understood
sequence models. The cost of this simplicity is that the predicted strings may
not be well-formed graphs. We present recurrent neural network DAG grammars, a
graph-aware sequence model that ensures only well-formed graphs while
sidestepping many difficulties in graph prediction. We test our model on the
Parallel Meaning Bank---a multilingual semantic graphbank. Our approach yields
competitive results in English and establishes the first results for German,
Italian, and Dutch. Comment: 9 pages, to appear in EMNLP 2019.
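The abstract's central concern is that a plain sequence model may emit strings that do not decode to well-formed graphs. The validity property at stake can be illustrated with a minimal acyclicity check over a predicted edge list (a hypothetical helper for illustration, not the paper's grammar-based mechanism, which enforces well-formedness by construction):

```python
from collections import defaultdict, deque

def is_well_formed_dag(edges):
    """Return True iff the edge list describes a directed acyclic graph.

    Uses Kahn's algorithm: repeatedly remove nodes with no incoming
    edges; the graph is acyclic iff every node can be removed.
    """
    succs = defaultdict(list)
    indeg = defaultdict(int)
    nodes = set()
    for src, dst in edges:
        succs[src].append(dst)
        indeg[dst] += 1
        nodes.update((src, dst))
    queue = deque(n for n in nodes if indeg[n] == 0)
    removed = 0
    while queue:
        n = queue.popleft()
        removed += 1
        for m in succs[n]:
            indeg[m] -= 1
            if indeg[m] == 0:
                queue.append(m)
    return removed == len(nodes)

# A valid DAG with a reentrant node (shared argument) vs. a cyclic output:
print(is_well_formed_dag([("want", "boy"), ("want", "go"), ("go", "boy")]))  # True
print(is_well_formed_dag([("a", "b"), ("b", "a")]))  # False
```

Note that the first example is a DAG but not a tree ("boy" has two incoming edges), which is exactly why semantic graphbanks need graph-aware rather than tree-based models.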
Neural Combinatory Constituency Parsing
Tokyo Metropolitan University, Doctor of Philosophy (Information Science), doctoral thesis.
Polyglot Semantic Parsing in APIs
Traditional approaches to semantic parsing (SP) work by training individual
models for each available parallel dataset of text-meaning pairs. In this
paper, we explore the idea of polyglot semantic translation, or learning
semantic parsing models that are trained on multiple datasets and natural
languages. In particular, we focus on translating text to code signature
representations using the software component datasets of Richardson and Kuhn
(2017a,b). The advantage of such models is that they can be used for parsing a
wide variety of input natural languages and output programming languages, or
mixed input languages, using a single unified model. To facilitate modeling of
this type, we develop a novel graph-based decoding framework that achieves
state-of-the-art performance on the above datasets, and apply this method to
two other benchmark SP tasks. Comment: accepted for NAACL-2018 (camera ready version).
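The unified-model idea above, one parser serving many input and output languages, is commonly realized by tagging each training example with its language pair, in the style of multilingual translation systems. A minimal, hypothetical sketch (the tag format and function name are assumptions, not taken from the paper):

```python
def tag_example(text, src_lang, out_lang):
    """Prefix an input with an artificial token naming its (natural
    language, output language) pair, so a single model can be trained
    on the concatenation of all datasets.
    Hypothetical helper; the tag scheme is illustrative only."""
    return f"<{src_lang}2{out_lang}> {text}"

# A mixed-language batch feeding one unified model:
batch = [
    tag_example("return the maximum of two integers", "en", "java"),
    tag_example("liefert das Maximum zweier Zahlen", "de", "java"),
]
print(batch[0])  # <en2java> return the maximum of two integers
```

The tag lets the model condition its decoding on the desired output programming language while sharing parameters across all datasets.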
A Systematic Survey on Deep Generative Models for Graph Generation
Graphs are important data representations for describing objects and their
relationships, which appear in a wide diversity of real-world scenarios. As a
critical problem in this area, graph generation considers learning the
distributions of given graphs and generating novel graphs. Owing to its
wide range of applications, generative models for graphs have a rich history,
which, however, were traditionally hand-crafted and capable of modeling only a
few statistical properties of graphs. Recent advances in deep generative models
for graph generation are an important step towards improving the fidelity of
generated graphs and pave the way for new kinds of applications. This article
provides an extensive overview of the literature in the field of deep
generative models for graph generation. First, the formal definition of deep
generative models for graph generation, as well as preliminary knowledge, is
provided. Second, two taxonomies of deep generative models, covering
unconditional and conditional graph generation respectively, are proposed, and
the existing works in each category are compared and analyzed. After that, an overview of
the evaluation metrics in this specific domain is provided. Finally, the
applications that deep graph generation enables are summarized, and five
promising future research directions are highlighted.
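The "traditionally hand-crafted" models the survey contrasts with deep approaches can be illustrated by the simplest case: fitting a single Erdős-Rényi edge probability to a set of observed graphs and sampling new ones, which captures exactly one statistical property (edge density) and nothing else. A minimal sketch (function names are illustrative):

```python
import random

def fit_edge_density(graphs):
    """Estimate a single edge probability p from undirected graphs,
    each given as (num_nodes, edge_list). This is the whole 'training'
    of an Erdos-Renyi model: one scalar parameter."""
    edges = sum(len(e) for _, e in graphs)
    possible = sum(n * (n - 1) // 2 for n, _ in graphs)
    return edges / possible

def sample_graph(n, p, rng=random):
    """Sample a G(n, p) graph: include each possible edge independently
    with probability p."""
    return [(i, j) for i in range(n) for j in range(i + 1, n)
            if rng.random() < p]

observed = [(4, [(0, 1), (1, 2), (2, 3)]), (3, [(0, 1), (0, 2), (1, 2)])]
p_hat = fit_edge_density(observed)  # (3 + 3) / (6 + 3) = 2/3
new_graph = sample_graph(5, p_hat)
```

A model this coarse cannot reproduce community structure, degree distributions, or node attributes, which is the gap deep generative models for graphs aim to close.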
- …