End-to-End Quantum-like Language Models with Application to Question Answering
Language Modeling (LM) is a fundamental research topic in a range of areas. Recently, inspired by quantum theory, a novel Quantum Language Model (QLM) has been proposed for Information Retrieval (IR). In this paper, we aim to broaden the theoretical and practical basis of QLM. We develop a Neural Network based Quantum-like Language Model (NNQLM) and apply it to Question Answering. Specifically, based on word embeddings, we design a new density matrix, which represents a sentence (e.g., a question or an answer) and encodes a mixture of semantic subspaces. Such a density matrix, together with a joint representation of the question and the answer, can be integrated into neural network architectures (e.g., 2-dimensional convolutional neural networks). Experiments on the TREC-QA and WIKIQA datasets have verified the effectiveness of our proposed models.
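The density-matrix idea in this abstract can be illustrated with a minimal sketch: represent each word as a normalized embedding and mix their outer products into a single matrix. The function name, the uniform word weights, and the random embeddings below are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def sentence_density_matrix(word_vecs, weights=None):
    """Sketch of a sentence density matrix rho = sum_i p_i |w_i><w_i|
    built from L2-normalized word embeddings (uniform p_i assumed
    when no weights are given)."""
    vecs = np.asarray(word_vecs, dtype=float)
    vecs = vecs / np.linalg.norm(vecs, axis=1, keepdims=True)
    n = len(vecs)
    if weights is None:
        p = np.full(n, 1.0 / n)
    else:
        p = np.asarray(weights, dtype=float)
        p = p / p.sum()
    # Mixture of rank-1 projectors onto the word directions
    rho = sum(pi * np.outer(v, v) for pi, v in zip(p, vecs))
    return rho

rng = np.random.default_rng(0)
rho = sentence_density_matrix(rng.normal(size=(5, 8)))
```

By construction `rho` is symmetric, positive semi-definite, and has unit trace, so it is a valid density matrix that a downstream network (e.g., a 2-D CNN) could consume as a sentence representation.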
CNM: An Interpretable Complex-valued Network for Matching
This paper seeks to model human language with the mathematical framework of quantum physics. With the well-designed mathematical formulations in quantum physics, this framework unifies different linguistic units in a single complex-valued vector space, e.g., words as particles in quantum states and sentences as mixed systems. A complex-valued network is built to implement this framework for semantic matching. With well-constrained complex-valued components, the network admits interpretations with explicit physical meanings. The proposed complex-valued network for matching (CNM) achieves performance comparable to strong CNN and RNN baselines on two benchmark question answering (QA) datasets.
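The "words as quantum states, sentences as mixed systems" framing can be sketched with complex amplitudes and a trace-based matching score. The function names and the trace inner product used for matching are illustrative assumptions; the paper's network learns these components rather than computing them in closed form.

```python
import numpy as np

def complex_word_state(amplitudes, phases):
    """A word as a unit-norm complex vector: z_k = a_k * exp(i * phi_k)."""
    z = np.asarray(amplitudes, float) * np.exp(1j * np.asarray(phases, float))
    return z / np.linalg.norm(z)

def mixed_state(states, probs):
    """A sentence as a mixed system: rho = sum_i p_i |s_i><s_i|."""
    return sum(p * np.outer(s, s.conj()) for p, s in zip(probs, states))

def match_score(rho_q, rho_a):
    """Trace inner product Tr(rho_q rho_a) as a simple matching signal."""
    return float(np.real(np.trace(rho_q @ rho_a)))

w1 = complex_word_state([1.0, 2.0], [0.0, np.pi / 2])
w2 = complex_word_state([2.0, 1.0], [np.pi / 4, 0.0])
rho = mixed_state([w1, w2], [0.5, 0.5])
score = match_score(rho, rho)
```

The trace inner product of two density matrices is real, non-negative, and at most 1, which is what makes it usable as a bounded similarity between a question and an answer representation.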
A Quantum Many-body Wave Function Inspired Language Modeling Approach
The recently proposed quantum language model (QLM) aims at a principled approach to modeling term dependency by applying quantum probability theory. The latest development toward a more effective QLM has adopted word embeddings as a kind of global dependency information and integrated the quantum-inspired idea into a neural network architecture. While these quantum-inspired LMs are theoretically more general and practically effective, they have two major limitations. First, they do not take into account the interaction among words with multiple meanings, which is common and important in understanding natural language text. Second, the integration of the quantum-inspired LM with the neural network served mainly to enable effective training of parameters, yet lacked a theoretical foundation accounting for such integration. To address these two issues, in this paper we propose a Quantum Many-body Wave Function (QMWF) inspired language modeling approach. The QMWF-inspired LM can adopt the tensor product to model the aforesaid interaction among words. It also enables us to reveal the inherent necessity of using a Convolutional Neural Network (CNN) in QMWF language modeling. Furthermore, our approach yields a simple algorithm to represent and match text/sentence pairs. Systematic evaluation shows the effectiveness of the proposed QMWF-LM algorithm, in comparison with state-of-the-art quantum-inspired LMs and a couple of CNN-based methods, on three typical Question Answering (QA) datasets.
Comment: 10 pages, 4 figures, CIK
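The tensor-product idea behind the many-body wave function can be sketched directly: the joint state of a sequence of words is the Kronecker product of their vectors, whose dimension grows exponentially with sentence length. The function name and the unit-normalization are illustrative assumptions; the paper's contribution is precisely that a CNN can be seen as a tractable way to work with such representations rather than materializing them.

```python
import numpy as np
from functools import reduce

def product_state(word_vecs):
    """Joint state of n words as the tensor (Kronecker) product of
    their unit-normalized vectors: psi = v1 (x) v2 (x) ... (x) vn.
    For n words of dimension d, the result has d**n entries."""
    vecs = [np.asarray(v, dtype=float) for v in word_vecs]
    vecs = [v / np.linalg.norm(v) for v in vecs]
    return reduce(np.kron, vecs)

# Three 2-dimensional word vectors yield a 2**3 = 8-dimensional joint state
psi = product_state([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
```

Because the norm of a Kronecker product is the product of the norms, `psi` stays unit-length; the exponential blow-up in dimension is what motivates low-rank or convolutional approximations in practice.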