Meaning-focused and Quantum-inspired Information Retrieval
In recent years, quantum-based methods have shown promise in complementing
traditional procedures in information retrieval (IR) and natural language
processing (NLP). Inspired by our research on the identification and
application of quantum structures in cognition, more specifically our work on
the representation of concepts and their combinations, we put forward a
'quantum meaning based' framework for structured query retrieval in text
corpora and standardized testing corpora. This scheme for IR rests on two
basic notions: (i) 'entities of meaning', e.g., concepts and their
combinations, and (ii) traces of such entities of meaning, which is how
documents are regarded in this approach. The meaning content of these
'entities of meaning' is reconstructed by solving an 'inverse problem' in the
quantum formalism, consisting of reconstructing the full states of the entities
of meaning from their collapsed states identified as traces in relevant
documents. The advantages with respect to traditional approaches, such as
Latent Semantic Analysis (LSA), are discussed by means of concrete examples.
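The 'inverse problem' described in this abstract, recovering the full state of an entity of meaning from its collapsed traces in documents, can be illustrated with a toy numerical sketch. The estimator below (the principal eigenvector of the averaged trace projectors) and all names are illustrative assumptions, not the paper's actual procedure:

```python
import numpy as np

def reconstruct_full_state(trace_vectors):
    """Estimate a 'full state' from its collapsed traces: average the rank-1
    projectors of the normalized trace vectors and return the principal
    eigenvector of the resulting operator (an assumed, illustrative estimator)."""
    dim = trace_vectors.shape[1]
    avg = np.zeros((dim, dim))
    for v in trace_vectors:
        v = v / np.linalg.norm(v)
        avg += np.outer(v, v)
    avg /= len(trace_vectors)
    evals, evecs = np.linalg.eigh(avg)
    return evecs[:, -1]  # eigenvector belonging to the largest eigenvalue

# Three noisy "traces" of the same underlying meaning vector.
rng = np.random.default_rng(0)
true_state = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
traces = np.array([true_state + 0.1 * rng.normal(size=3) for _ in range(3)])
estimate = reconstruct_full_state(traces)
```

Up to an overall sign, the estimate aligns closely with the underlying state, which is the sense in which a full state is "inversely reconstructed" from its traces.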
Distributional Sentence Entailment Using Density Matrices
The categorical compositional distributional model of Coecke et al. (2010)
suggests a way to combine the grammatical composition of formal, type-logical
models with the corpus-based, empirical word representations of distributional
semantics. This paper contributes to the project by expanding the model to also
capture entailment relations. This is achieved by extending the representations
of words from points in meaning space to density operators, which are
probability distributions over the subspaces of the space. A symmetric measure
of similarity and an asymmetric measure of entailment are defined, where
lexical entailment is measured using quantum relative entropy, the von Neumann
(quantum) analogue of the Kullback-Leibler divergence. Lexical entailment, combined with the composition
map on word representations, provides a method to obtain entailment relations
on the level of sentences. Truth theoretic and corpus-based examples are
provided.
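A minimal numerical sketch of the density-operator representation and the asymmetric entailment measure described above. The function names and the toy 'dog'/'animal' context vectors are our assumptions; the paper's actual construction of density matrices from corpora and grammar is more involved:

```python
import numpy as np

def density_matrix(context_vectors):
    """Mix rank-1 projectors of normalized context vectors into a density operator."""
    dim = context_vectors.shape[1]
    rho = np.zeros((dim, dim))
    for v in context_vectors:
        v = v / np.linalg.norm(v)
        rho += np.outer(v, v)
    return rho / np.trace(rho)

def _logm_psd(m, eps=1e-12):
    """Matrix logarithm of a positive semidefinite matrix via eigendecomposition."""
    w, u = np.linalg.eigh(m)
    return u @ np.diag(np.log(np.clip(w, eps, None))) @ u.T

def quantum_relative_entropy(rho, sigma):
    """S(rho || sigma) = Tr[rho (log rho - log sigma)]: asymmetric, zero when
    rho == sigma, and effectively infinite when the support of rho is not
    contained in the support of sigma."""
    return float(np.trace(rho @ (_logm_psd(rho) - _logm_psd(sigma))))

# Toy illustration: a narrow concept ("dog") mixes fewer context directions
# than a broad one ("animal"), so S(dog || animal) stays small while
# S(animal || dog) blows up -- the asymmetry that models lexical entailment.
dog = density_matrix(np.array([[1.0, 0.0, 0.0], [0.9, 0.1, 0.0]]))
animal = density_matrix(np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]))
forward = quantum_relative_entropy(dog, animal)
backward = quantum_relative_entropy(animal, dog)
```

The asymmetry of the measure is what lets it express a directed relation (dog entails animal, but not the reverse), in contrast to the symmetric similarity measure.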
The Quantum Challenge in Concept Theory and Natural Language Processing
The mathematical formalism of quantum theory has been successfully used in
human cognition to model decision processes and to deliver representations of
human knowledge. As such, quantum-cognition-inspired tools have improved
technologies for Natural Language Processing and Information Retrieval. In this
paper, we overview the quantum cognition approach developed in our Brussels
team during the last two decades, specifically our identification of quantum
structures in human concepts and language, and the modeling of data from
psychological and corpus-text-based experiments. We discuss our
quantum-theoretic framework for concepts and their conjunctions/disjunctions in
a Fock-Hilbert space structure, adequately modeling a large amount of data
collected on concept combinations. Inspired by this modeling, we put forward
elements for a quantum contextual and meaning-based approach to information
technologies in which 'entities of meaning' are inversely reconstructed from
texts, which are considered as traces of these entities' states.
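Schematically, the Fock-space modeling of conjunctions mentioned above combines a 'logical' sector with an interference sector. The following display is a hedged reconstruction of the shape of such a membership-weight formula, not a quotation of the papers (the weights m, n, the projector M, and the interference term are written here as an illustrative sketch):

\[
\mu(A \text{ and } B) \;=\; m^2\,\mu(A)\,\mu(B)
\;+\; n^2\!\left(\frac{\mu(A)+\mu(B)}{2} + \operatorname{Re}\langle A|M|B\rangle\right),
\qquad m^2 + n^2 = 1,
\]

where \(\mu(A)\) and \(\mu(B)\) are the measured membership weights of the component concepts and the interference term accounts for the overextension and underextension effects observed in concept-combination data.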
"Potentialities or Possibilities": Towards Quantum Information Science?
The use of quantum concepts and formalisms in the information sciences is assessed through an analysis of the published literature. Five categories are identified: use of loose analogies and metaphors between concepts in quantum physics and library/information science; use of quantum concepts and formalisms in information retrieval; use of quantum concepts and formalisms in studying meaning and concepts; quantum social science, in areas adjacent to information science; and the qualitative application of quantum concepts in the information disciplines. Quantum issues have led to demonstrable progress in information retrieval and semantic modelling, with less clear-cut progress elsewhere. Whether there may be a future "quantum turn" in the information sciences is debated, the implications of such a turn are considered, and a research agenda is outlined.
From Frequency to Meaning: Vector Space Models of Semantics
Computers understand very little of the meaning of human language. This
profoundly limits our ability to give instructions to computers, the ability of
computers to explain their actions to us, and the ability of computers to
analyse and process text. Vector space models (VSMs) of semantics are beginning
to address these limits. This paper surveys the use of VSMs for semantic
processing of text. We organize the literature on VSMs according to the
structure of the matrix in a VSM. There are currently three broad classes of
VSMs, based on term-document, word-context, and pair-pattern matrices, yielding
three classes of applications. We survey a broad range of applications in these
three categories and we take a detailed look at a specific open source project
in each category. Our goal in this survey is to show the breadth of
applications of VSMs for semantics, to provide a new perspective on VSMs for
those who are already familiar with the area, and to provide pointers into the
literature for those who are less familiar with the field.
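The term-document class of VSMs surveyed above can be sketched in a few lines (the toy corpus and names are illustrative assumptions): build a matrix whose rows are terms and whose columns are documents, then compare documents by the cosine of their column vectors:

```python
import numpy as np

# Toy corpus: three short "documents".
docs = ["quantum information retrieval",
        "vector space semantics",
        "quantum semantics of language"]

# Term-document matrix: rows are vocabulary terms, columns are documents,
# entries are raw term frequencies (real systems typically reweight, e.g. tf-idf).
vocab = sorted({w for d in docs for w in d.split()})
tdm = np.array([[d.split().count(t) for d in docs] for t in vocab])

def cosine(a, b):
    """Cosine of the angle between two vectors."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Document similarity is the cosine between column vectors.
sim_01 = cosine(tdm[:, 0], tdm[:, 1])  # no shared terms
sim_02 = cosine(tdm[:, 0], tdm[:, 2])  # share the term "quantum"
```

The same matrix read row-wise gives term vectors; word-context and pair-pattern matrices, the other two classes in the survey, follow the same pattern with different rows and columns.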
Neurocognitive Informatics Manifesto.
Informatics studies all aspects of the structure of natural and artificial information systems. Theoretical and abstract approaches to information have made great advances, but human information processing is still unmatched in many areas, including information management, representation and understanding. Neurocognitive informatics is a new, emerging field that should help to improve the matching of artificial and natural systems, and inspire better computational algorithms to solve problems that are still beyond the reach of machines. In this position paper, examples of neurocognitive inspirations and promising directions in this area are given.