Towards logical negation for compositional distributional semantics
The categorical compositional distributional model of meaning gives the
composition of words into phrases and sentences pride of place. However, it has
so far lacked a model of logical negation. This paper gives some steps towards
providing this operator, modelling it as a version of projection onto the
subspace orthogonal to a word. We give a small demonstration of the operator's
performance in a sentence entailment task.
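The negation operator described above, projection onto the subspace orthogonal to a word, can be sketched in a few lines of NumPy. This is an illustrative toy, not the paper's implementation: the vectors and the helper name `orthogonal_projection` are assumptions for demonstration.

```python
import numpy as np

def orthogonal_projection(word_vec):
    """Projector onto the subspace orthogonal to word_vec (hypothetical helper).

    For a unit vector w, the projector is I - w w^T: applying it removes
    the component of any vector that lies along w.
    """
    w = word_vec / np.linalg.norm(word_vec)
    return np.eye(len(w)) - np.outer(w, w)

# Toy word vectors, purely illustrative.
cat = np.array([1.0, 0.2, 0.0])
animal = np.array([0.9, 0.4, 0.1])

# "not cat" applied to "animal": the result keeps the parts of `animal`
# unrelated to `cat` and has zero overlap with `cat` itself.
not_cat = orthogonal_projection(cat) @ animal
```

After projection, `not_cat @ cat` is zero up to floating-point error, so the negated representation carries no component along the negated word while retaining the rest of the original vector.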
Grounded learning for compositional vector semantics
Categorical compositional distributional semantics is an approach to
modelling language that combines the success of vector-based models of meaning
with the compositional power of formal semantics. However, this approach was
developed without an eye to cognitive plausibility. Vector representations of
concepts and concept binding are also of interest in cognitive science, and
have been proposed as a way of representing concepts within a biologically
plausible spiking neural network. This work proposes a way for compositional
distributional semantics to be implemented within a spiking neural network
architecture, with the potential to address problems in concept binding, and
give a small implementation. We also describe a means of training word
representations using labelled images
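Concept binding in vector models of cognition is often realised with circular convolution, as in holographic reduced representations and the semantic pointer framework. Whether this work uses exactly that operation is not stated in the abstract; the sketch below only illustrates the general binding/unbinding idea, with all vectors and names invented for the example.

```python
import numpy as np

def bind(a, b):
    """Circular convolution: combines two vectors into one of the same dimension."""
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def unbind(c, a):
    """Approximate inverse of bind: circular correlation with a."""
    return np.real(np.fft.ifft(np.fft.fft(c) * np.conj(np.fft.fft(a))))

rng = np.random.default_rng(0)
d = 512  # high dimension keeps random vectors nearly orthogonal
subj = rng.normal(0.0, 1.0 / np.sqrt(d), d)
verb = rng.normal(0.0, 1.0 / np.sqrt(d), d)

# Bind the two concepts into a single fixed-width vector, then recover
# an approximation of `verb` by unbinding with `subj`.
pair = bind(subj, verb)
recovered = unbind(pair, subj)
sim = recovered @ verb / (np.linalg.norm(recovered) * np.linalg.norm(verb))
```

The recovered vector is noisy but far closer to `verb` than to any unrelated vector, which is what makes this style of binding usable as a compositional memory within fixed-dimensional (including spiking) architectures.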
- …