From Logical to Distributional Models
The paper relates two variants of semantic models for natural language,
logical functional models and compositional distributional vector space models,
by transferring the logic and reasoning from the logical to the distributional
models.
The geometrical operations of quantum logic are reformulated as algebraic
operations on vectors. A map from functional models to vector space models
makes it possible to compare the meaning of sentences word by word.
Comment: In Proceedings QPL 2013, arXiv:1412.791
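To make the vector-space side concrete, here is a minimal sketch (not from the paper) of how a compositional distributional model assigns a sentence vector by tensor contraction: a transitive verb lives in a tensor space, and the pregroup reduction of subject–verb–object collapses to two contractions. All dimensions and vectors below are invented for illustration.

```python
import numpy as np

# Toy noun space of dimension 2; sentence space also of dimension 2.
dim = 2
subj = np.array([1.0, 0.0])            # hypothetical vector for "Alice"
obj = np.array([0.0, 1.0])             # hypothetical vector for "Bob"

# A transitive verb as an order-3 tensor in n (x) s (x) n,
# here filled with random values in place of corpus statistics.
verb = np.random.rand(dim, dim, dim)

# Contract the subject against the first index and the object against
# the last, leaving a vector in the sentence space s.
sentence = np.einsum('i,isj,j->s', subj, verb, obj)
print(sentence.shape)                  # a vector in the sentence space
```

The contraction pattern mirrors the grammatical reduction: each cancellation of a pregroup type corresponds to summing over one shared index.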
Translating and Evolving: Towards a Model of Language Change in DisCoCat
The categorical compositional distributional (DisCoCat) model of meaning
developed by Coecke et al. (2010) has been successful in modeling various
aspects of meaning. However, it fails to model the fact that language can
change. We give an approach to DisCoCat that allows us to represent language
models and translations between them, enabling us to describe translations from
one language to another, or changes within the same language. We unify the
product space representation given in (Coecke et al., 2010) and the functorial
description in (Kartsaklis et al., 2013), in a way that allows us to view a
language as a catalogue of meanings. We formalize the notion of a lexicon in
DisCoCat, and define a dictionary of meanings between two lexicons. All this is
done within the framework of monoidal categories. We give examples of how to
apply our methods, and give a concrete suggestion for compositional translation
in corpora.
Comment: In Proceedings CAPNS 2018, arXiv:1811.0270
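In the spirit of the functorial picture of translation, one can sketch a toy "dictionary" between two lexicons as a single linear map applied uniformly to word vectors; linearity is what makes the translation commute with vector-space composition. The lexicons and the map below are invented placeholders, not the paper's construction.

```python
import numpy as np

# Hypothetical source-language lexicon: word -> vector.
lexicon_src = {"cat": np.array([1.0, 0.0]),
               "dog": np.array([0.0, 1.0])}

# A linear map T sending source word vectors into the target space,
# playing the role of a translation between language models.
T = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# The translated lexicon is obtained by applying T word by word.
lexicon_tgt = {word: T @ vec for word, vec in lexicon_src.items()}

# Because T is linear, translating a composite meaning (here modelled
# crudely as vector addition) agrees with composing the translations.
lhs = T @ (lexicon_src["cat"] + lexicon_src["dog"])
rhs = lexicon_tgt["cat"] + lexicon_tgt["dog"]
assert np.allclose(lhs, rhs)
```

This commuting square, translation-then-compose equals compose-then-translation, is the elementary shadow of the functoriality the paper formalizes.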
A Generalised Quantifier Theory of Natural Language in Categorical Compositional Distributional Semantics with Bialgebras
Categorical compositional distributional semantics is a model of natural
language; it combines the statistical vector space models of words with the
compositional models of grammar. We formalise in this model the generalised
quantifier theory of natural language, due to Barwise and Cooper. The
underlying setting is a compact closed category with bialgebras. We start from
a generative grammar formalisation and develop an abstract categorical
compositional semantics for it, then instantiate the abstract setting to sets
and relations and to finite dimensional vector spaces and linear maps. We prove
the equivalence of the relational instantiation to the truth theoretic
semantics of generalised quantifiers. The vector space instantiation formalises
the statistical usages of words and enables us to, for the first time, reason
about quantified phrases and sentences compositionally in distributional
semantics.
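The relational (sets-and-relations) instantiation mentioned above recovers the familiar Barwise–Cooper truth conditions: a quantified noun phrase denotes a family of predicate extensions, and the sentence is true when the verb phrase's extension belongs to that family. A small illustration over an invented universe:

```python
# Toy universe and denotations; all sets here are made up.
universe = {"rex", "fido", "felix"}
dogs = {"rex", "fido"}
barkers = {"rex", "fido", "felix"}

def every(noun):
    # "every N VP" is true iff the N-set is contained in the VP-set.
    return lambda pred: noun <= pred

def some(noun):
    # "some N VP" is true iff the N-set and VP-set overlap.
    return lambda pred: bool(noun & pred)

print(every(dogs)(barkers))             # True: every dog barks
print(some(dogs)(universe - barkers))   # False: no dog is a non-barker
```

The bialgebra structure in the paper is what lets the vector-space instantiation express the same set operations (copying and comparing elements) with linear maps.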
Towards Functorial Language-Games
In categorical compositional semantics of natural language one studies
functors from a category of grammatical derivations (such as a Lambek pregroup)
to a semantic category (such as real vector spaces). We compositionally build
game-theoretic semantics of sentences by taking the semantic category to be the
category whose morphisms are open games. This requires some modifications to
the grammar category to compensate for the failure of open games to form a
compact closed category. We illustrate the theory using simple examples of
Wittgenstein's language-games.
Comment: In Proceedings CAPNS 2018, arXiv:1811.0270
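A drastically simplified sketch of what composes in a category of open games: stripping away strategies and equilibria, an open game reduces to a lens-like pair of a forward "play" map and a backward "coplay" map, and sequential composition chains the forward passes while threading utilities backward. This is only a shadow of the actual open-games construction; all functions below are invented.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Lens:
    play: Callable      # state -> move (forward pass)
    coplay: Callable    # (state, utility) -> utility (backward pass)

def compose(g, h):
    # Run g forward and feed its move to h; propagate the utility
    # backward through h, then through g.
    return Lens(
        play=lambda x: h.play(g.play(x)),
        coplay=lambda x, u: g.coplay(x, h.coplay(g.play(x), u)),
    )

double = Lens(play=lambda x: 2 * x, coplay=lambda x, u: u + x)
incr = Lens(play=lambda x: x + 1, coplay=lambda x, u: u)
game = compose(double, incr)
print(game.play(3))        # forward: (3 * 2) + 1 = 7
print(game.coplay(3, 10))  # backward: utility 10 plus state 3 = 13
```

The failure of such structures to form a compact closed category is precisely why the paper adjusts the grammar category rather than the semantic one.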