CNM: An Interpretable Complex-valued Network for Matching
This paper seeks to model human language within the mathematical framework of
quantum physics. Leveraging the well-developed mathematical formalism of quantum
physics, this framework unifies different linguistic units in a single
complex-valued vector space, e.g., words as particles in quantum states and
sentences as mixed systems. A complex-valued network is built to implement this
framework for semantic matching. With well-constrained complex-valued
components, the network admits interpretation in terms of explicit physical
meanings. The proposed complex-valued network for matching (CNM) achieves
performance comparable to strong CNN and RNN baselines on two benchmark
question answering (QA) datasets.
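The quantum-inspired representation described in the abstract can be sketched as follows: a word as a unit-norm complex vector (a pure state), a sentence as a weighted sum of the words' outer products (a density matrix for a mixed system), and matching as a trace inner product between density matrices. This is a minimal illustration of the general idea, not the paper's actual CNM architecture; all function names and the uniform weighting are assumptions.

```python
import numpy as np

def word_state(dim, seed):
    # A word as a pure quantum state: a unit-norm complex vector.
    # (Hypothetical helper for illustration.)
    rng = np.random.default_rng(seed)
    v = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    return v / np.linalg.norm(v)

def sentence_density(words, weights):
    # A sentence as a mixed system: a convex combination of pure-state
    # projectors |w><w|, i.e. a density matrix (Hermitian, trace 1).
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return sum(p * np.outer(v, v.conj()) for p, v in zip(w, words))

def match_score(rho_q, rho_a):
    # Trace inner product Tr(rho_q rho_a) between question and answer
    # density matrices; real-valued and in [0, 1] for valid states.
    return float(np.trace(rho_q @ rho_a).real)

q = sentence_density([word_state(4, 0), word_state(4, 1)], [1, 1])
a = sentence_density([word_state(4, 0), word_state(4, 2)], [1, 1])
score = match_score(q, a)
```

In this toy setting the score is higher when the two sentences share word states, which is the intuition behind using density matrices for semantic matching.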
Semantically Conditioned LSTM-based Natural Language Generation for Spoken Dialogue Systems
Natural language generation (NLG) is a critical component of spoken dialogue
systems and has a significant impact on both usability and perceived quality. Most
NLG systems in common use employ rules and heuristics and tend to generate
rigid and stylised responses without the natural variation of human language.
They are also not easily scaled to systems covering multiple domains and
languages. This paper presents a statistical language generator based on a
semantically controlled Long Short-term Memory (LSTM) structure. The LSTM
generator can learn from unaligned data by jointly optimising sentence planning
and surface realisation using a simple cross entropy training criterion, and
language variation can be easily achieved by sampling from output candidates.
In an objective evaluation across two differing test domains, the proposed
method outperformed previous methods while relying on fewer heuristics.
Human judges scored the LSTM system higher on informativeness and naturalness
and overall preferred it to the other systems.
Comment: To appear in EMNLP 201
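The semantic conditioning described above can be sketched as an LSTM cell augmented with a "reading gate" that gradually consumes a dialogue-act vector, so the generator tracks which semantic slots remain to be realised. This is a hedged sketch of the general mechanism, not the paper's exact formulation; the parameter names and single-matrix gates are assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sc_lstm_step(x, h, c, d, params):
    # One step of a semantically conditioned LSTM cell (sketch):
    # standard input/forget/output gates plus a reading gate r that
    # multiplicatively "consumes" the dialogue-act vector d.
    Wi, Wf, Wo, Wg, Wr, Wd = params
    z = np.concatenate([x, h])   # concatenated input and hidden state
    i = sigmoid(Wi @ z)          # input gate
    f = sigmoid(Wf @ z)          # forget gate
    o = sigmoid(Wo @ z)          # output gate
    g = np.tanh(Wg @ z)          # candidate cell update
    r = sigmoid(Wr @ z)          # reading gate over DA features
    d_next = r * d               # shrink the remaining DA vector
    c_next = f * c + i * g + np.tanh(Wd @ d_next)
    h_next = o * np.tanh(c_next)
    return h_next, c_next, d_next

rng = np.random.default_rng(0)
nx, nh, nd = 3, 4, 5             # input, hidden, dialogue-act sizes
params = (
    rng.normal(size=(nh, nx + nh)),  # Wi
    rng.normal(size=(nh, nx + nh)),  # Wf
    rng.normal(size=(nh, nx + nh)),  # Wo
    rng.normal(size=(nh, nx + nh)),  # Wg
    rng.normal(size=(nd, nx + nh)),  # Wr
    rng.normal(size=(nh, nd)),       # Wd
)
h, c, d = np.zeros(nh), np.zeros(nh), np.ones(nd)
h, c, d = sc_lstm_step(rng.normal(size=nx), h, c, d, params)
```

Because the reading gate's outputs lie in (0, 1), the dialogue-act vector can only shrink over time, which is how sentence planning (deciding when each slot has been expressed) is folded into the recurrent update.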