End-to-End Quantum-like Language Models with Application to Question Answering
Language Modeling (LM) is a fundamental research topic in a range of areas. Recently, inspired by quantum theory, a novel Quantum Language Model (QLM) has been proposed for Information Retrieval (IR). In this paper, we aim to broaden the theoretical and practical basis of QLM. We develop a Neural Network based Quantum-like Language Model (NNQLM) and apply it to Question Answering. Specifically, based on word embeddings, we design a new density matrix, which represents a sentence (e.g., a question or an answer) and encodes a mixture of semantic subspaces. Such a density matrix, together with a joint representation of the question and the answer, can be integrated into neural network architectures (e.g., 2-dimensional convolutional neural networks). Experiments on the TREC-QA and WIKIQA datasets have verified the effectiveness of our proposed models.
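A common way to realize the density-matrix idea above is to mix rank-one projectors built from the sentence's word embeddings. The sketch below is a minimal illustration of that construction; the function name and the uniform mixture weights are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def sentence_density_matrix(embeddings):
    """Build a sentence density matrix as a uniform mixture of rank-one
    projectors over normalized word embeddings (illustrative sketch)."""
    dim = embeddings.shape[1]
    rho = np.zeros((dim, dim))
    for v in embeddings:
        u = v / np.linalg.norm(v)   # unit vector -> pure state
        rho += np.outer(u, u)       # rank-one projector u u^T
    return rho / len(embeddings)    # mixture: trace-one, positive semi-definite

# toy example: a 3-word sentence in a 4-dimensional embedding space
rng = np.random.default_rng(0)
rho = sentence_density_matrix(rng.normal(size=(3, 4)))
print(np.isclose(np.trace(rho), 1.0))  # density matrices have unit trace
```

The resulting matrix is symmetric and positive semi-definite with unit trace, so it can be fed to a 2-D convolutional network like any other fixed-size grid input.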
Distributional Sentence Entailment Using Density Matrices
The categorical compositional distributional model of Coecke et al. (2010)
suggests a way to combine grammatical composition of the formal, type logical
models with the corpus based, empirical word representations of distributional
semantics. This paper contributes to the project by expanding the model to also
capture entailment relations. This is achieved by extending the representations
of words from points in meaning space to density operators, which are
probability distributions on the subspaces of the space. A symmetric measure of
similarity and an asymmetric measure of entailment are defined, where lexical
entailment is measured using von Neumann entropy, the quantum variant of
Kullback-Leibler divergence. Lexical entailment, combined with the composition
map on word representations, provides a method to obtain entailment relations
on the level of sentences. Truth theoretic and corpus-based examples are
provided. Comment: 11 pages
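The entropy-based machinery mentioned above can be sketched in a few lines of numpy: the von Neumann entropy of a density matrix, and the quantum relative entropy, which plays the role of Kullback-Leibler divergence between density operators. The function names and the toy states are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def vn_entropy(rho):
    """Von Neumann entropy S(rho) = -Tr(rho log rho)."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]                       # drop zero eigenvalues (0 log 0 = 0)
    return float(-np.sum(w * np.log(w)))

def quantum_relative_entropy(rho, sigma):
    """S(rho || sigma) = Tr(rho (log rho - log sigma)),
    the quantum counterpart of Kullback-Leibler divergence."""
    def logm(m):                           # matrix log via eigendecomposition
        w, v = np.linalg.eigh(m)
        return v @ np.diag(np.log(np.clip(w, 1e-12, None))) @ v.T
    return float(np.trace(rho @ (logm(rho) - logm(sigma))))

# maximally mixed state in 2 dimensions vs. a pure state
mixed = np.eye(2) / 2
pure = np.outer([1.0, 0.0], [1.0, 0.0])
print(vn_entropy(mixed))   # log 2 ~ 0.693: maximal uncertainty
print(vn_entropy(pure))    # 0: a pure state has no uncertainty
```

Intuitively, a more general word (a broader mixture of subspaces) has higher entropy, which is what lets an asymmetric entailment measure be read off these quantities.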
A Quantum Many-body Wave Function Inspired Language Modeling Approach
The recently proposed quantum language model (QLM) aims at a principled
approach to modeling term dependency by applying quantum probability
theory. The latest development for a more effective QLM has adopted word
embeddings as a kind of global dependency information and integrated the
quantum-inspired idea in a neural network architecture. While these
quantum-inspired LMs are theoretically more general and also practically
effective, they have two major limitations. First, they have not taken into
account the interaction among words with multiple meanings, which is common and
important in understanding natural language text. Second, the integration of
the quantum-inspired LM with the neural network was mainly for effective
training of parameters, yet lacking a theoretical foundation accounting for
such integration. To address these two issues, in this paper, we propose a
Quantum Many-body Wave Function (QMWF) inspired language modeling approach. The
QMWF inspired LM can adopt the tensor product to model the aforesaid
interaction among words. It also enables us to reveal the inherent necessity of
using Convolutional Neural Network (CNN) in QMWF language modeling.
Furthermore, our approach delivers a simple algorithm to represent and match
text/sentence pairs. Systematic evaluation shows the effectiveness of the
proposed QMWF-LM algorithm, in comparison with the state of the art
quantum-inspired LMs and a couple of CNN-based methods, on three typical
Question Answering (QA) datasets. Comment: 10 pages, 4 figures, CIKM
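The tensor-product representation at the heart of the QMWF approach can be sketched directly: the joint state of a word sequence is the Kronecker product of the individual word states, whose dimension grows exponentially with sequence length (which is why compressed, CNN-style projections become necessary). The function name and toy vectors below are illustrative assumptions.

```python
import numpy as np

def joint_state(word_vectors):
    """Joint representation of a word sequence as the tensor (Kronecker)
    product of normalized word states: dimension grows as d**n, encoding
    all cross-word interactions at once (illustrative sketch)."""
    state = np.array([1.0])
    for v in word_vectors:
        v = np.asarray(v, dtype=float)
        state = np.kron(state, v / np.linalg.norm(v))
    return state

# two words in a 3-dimensional space -> a 9-dimensional joint state
psi = joint_state([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
print(psi.shape)             # (9,)
print(np.linalg.norm(psi))   # products of unit states stay unit-norm
```

Because the full product space is intractable for realistic vocabularies and lengths, projecting this state onto a small set of learned basis directions, which is where the convolutional architecture enters, keeps the model practical.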
A Survey of Quantum Theory Inspired Approaches to Information Retrieval
Since 2004, researchers have been using the mathematical framework of Quantum Theory (QT) in Information Retrieval (IR). QT offers a generalized probability and logic framework. Such a framework has been shown capable of unifying the representation, ranking and user cognitive aspects of IR, and helpful in developing more dynamic, adaptive and context-aware IR systems. Although quantum-inspired IR is still a growing area, a wide array of work on different aspects of IR has been done and has produced promising results. This paper presents a survey of the research done in this area, aiming to show the landscape of the field and draw a road-map of future directions.
Compositional Distributional Semantics with Compact Closed Categories and Frobenius Algebras
This thesis contributes to ongoing research related to the categorical
compositional model for natural language of Coecke, Sadrzadeh and Clark in
three ways: Firstly, I propose a concrete instantiation of the abstract
framework based on Frobenius algebras (joint work with Sadrzadeh). The theory
addresses shortcomings of previous proposals, extends the coverage of the
language, and is supported by experimental work that improves existing results.
The proposed framework describes a new class of compositional models that find
intuitive interpretations for a number of linguistic phenomena. Secondly, I
propose and evaluate in practice a new compositional methodology which
explicitly deals with the different levels of lexical ambiguity (joint work
with Pulman). A concrete algorithm is presented, based on the separation of
vector disambiguation from composition in an explicit prior step. Extensive
experimental work shows that the proposed methodology indeed results in more
accurate composite representations, for the framework of Coecke et al. in
particular and for other classes of compositional models in general. As a last
contribution, I formalize the explicit treatment of lexical ambiguity in the
context of the categorical framework by resorting to categorical quantum
mechanics (joint work with Coecke). In the proposed extension, the concept of a
distributional vector is replaced with that of a density matrix, which
compactly represents a probability distribution over the potential different
meanings of the specific word. Composition takes the form of quantum
measurements, leading to interesting analogies between quantum physics and
linguistics. Comment: Ph.D. Dissertation, University of Oxford