A probabilistic framework for analysing the compositionality of conceptual combinations
Conceptual combination performs a fundamental role in creating the broad
range of compound phrases utilised in everyday language. This article provides
a novel probabilistic framework for assessing whether the semantics of conceptual
combinations are compositional, and so can be considered as a function of
the semantics of the constituent concepts, or not. While the systematicity and
productivity of language provide a strong argument in favor of assuming compositionality,
this very assumption is still regularly questioned in both cognitive
science and philosophy. Additionally, the principle of semantic compositionality
is underspecified, which means that notions of both "strong" and "weak"
compositionality appear in the literature. Rather than adjudicating between
different grades of compositionality, the framework presented here contributes
formal methods for determining a clear dividing line between compositional and
non-compositional semantics. In addition, we suggest that the distinction between
these is contextually sensitive. Compositionality is equated with a joint probability distribution modeling how the constituent concepts in the combination
are interpreted. Marginal selectivity is introduced as a pivotal probabilistic
constraint for the application of the Bell/CH and CHSH systems of inequalities.
Non-compositionality is equated with a failure of marginal selectivity, or violation
of either system of inequalities in the presence of marginal selectivity. This
means that the conceptual combination cannot be modeled in a joint probability
distribution, the variables of which correspond to how the constituent concepts
are being interpreted. The formal analysis methods are demonstrated by applying
them to an empirical illustration of twenty-four non-lexicalised conceptual
combinations.
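To make the framework's core test concrete, the following minimal Python sketch (an illustration under assumed names, not the article's own method or code) pairs two interpretation variables per concept (A, A' and B, B') across four measurement contexts, checks marginal selectivity first, and then evaluates the CHSH bounds:

```python
# Sketch of the compositionality test: each context pairs one interpretation
# variable per concept; outcomes are +/-1 and p[(a, b)] is the empirical
# probability of the joint outcome in that context.
from itertools import product

KEYS = ("AB", "AB'", "A'B", "A'B'")

def expectation(p):
    """Correlation E[XY] for a distribution over (+/-1, +/-1) outcomes."""
    return sum(a * b * prob for (a, b), prob in p.items())

def marginal(p, index, value):
    """Marginal probability that the variable at `index` equals `value`."""
    return sum(prob for outcome, prob in p.items() if outcome[index] == value)

def marginally_selective(contexts, tol=1e-9):
    """A's marginal must not depend on whether it was paired with B or B',
    and symmetrically for the other variables."""
    ab, ab2, a2b, a2b2 = (contexts[k] for k in KEYS)
    checks = [(marginal(ab, 0, 1), marginal(ab2, 0, 1)),    # A under B vs. B'
              (marginal(a2b, 0, 1), marginal(a2b2, 0, 1)),  # A' under B vs. B'
              (marginal(ab, 1, 1), marginal(a2b, 1, 1)),    # B under A vs. A'
              (marginal(ab2, 1, 1), marginal(a2b2, 1, 1))]  # B' under A vs. A'
    return all(abs(x - y) <= tol for x, y in checks)

def compositional(contexts, tol=1e-9):
    """Compositional iff marginal selectivity holds and every CHSH bound
    (one minus sign in each of the four positions) is satisfied."""
    if not marginally_selective(contexts, tol):
        return False
    e = [expectation(contexts[k]) for k in KEYS]
    signings = [s for s in product((1, -1), repeat=4) if s.count(-1) == 1]
    return all(abs(sum(s * x for s, x in zip(ss, e))) <= 2 + tol
               for ss in signings)

# Example: identical, perfectly correlated statistics in all four contexts
# are compositional (CHSH value 2, exactly on the classical bound).
uniform = {(1, 1): 0.5, (-1, -1): 0.5}
print(compositional({k: uniform for k in KEYS}))  # True
```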
A generic operational metatheory for algebraic effects
We provide a syntactic analysis of contextual preorder and equivalence for a polymorphic programming language with effects. Our approach applies uniformly across a range of algebraic effects and incorporates, as instances: errors, input/output, global state, nondeterminism, probabilistic choice, and combinations thereof. We extend Plotkin and Power’s structural operational semantics for algebraic effects (FoSSaCS 2001) with a primitive “basic preorder” on ground-type computation trees. The basic preorder is used to derive notions of contextual preorder and equivalence on program terms. Under mild assumptions on this relation, we prove fundamental properties of contextual preorder (hence equivalence), including extensionality properties and a characterisation via applicative contexts, and we provide machinery for reasoning about polymorphism using relational parametricity.
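As an informal illustration of the setting (a sketch under assumed representations, not the paper's formalism), ground-type computations can be pictured as trees of algebraic operations, with each effect's interpretation of those trees inducing a candidate basic preorder:

```python
# Computation trees whose leaves return ground values and whose internal
# nodes are algebraic effect operations. Each effect interpretation
# collapses a tree to ground-type behaviour, inducing a basic preorder.
from dataclasses import dataclass
from fractions import Fraction

@dataclass
class Return:          # leaf: a terminated computation
    value: object

@dataclass
class Op:              # node: an effect operation applied to subtrees
    name: str          # e.g. "or" (nondeterminism), "coin" (fair choice)
    children: tuple

def may_results(tree):
    """Nondeterminism: the set of reachable return values. The induced
    basic preorder is set inclusion (may-testing)."""
    if isinstance(tree, Return):
        return {tree.value}
    return set().union(*(may_results(c) for c in tree.children))

def distribution(tree):
    """Probabilistic choice: each node splits mass uniformly over its
    children. The induced preorder compares outcome probabilities."""
    if isinstance(tree, Return):
        return {tree.value: Fraction(1)}
    dist = {}
    for child in tree.children:
        for v, p in distribution(child).items():
            dist[v] = dist.get(v, Fraction(0)) + p / len(tree.children)
    return dist

# t1 is below t2 in the may-preorder: its results are a subset of t2's.
t1 = Op("or", (Return(0), Return(1)))
t2 = Op("or", (Return(0), Op("or", (Return(1), Return(2)))))
print(may_results(t1) <= may_results(t2))                 # True
print(distribution(Op("coin", (Return(0), Return(1)))))   # mass 1/2 each
```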
Representing Concepts by Weighted Formulas
A concept is traditionally defined via the necessary and sufficient conditions
that clearly determine its extension. By contrast, cognitive views of concepts
aim to account for empirical data showing that categorisation under a concept
exhibits typicality effects and a certain degree of indeterminacy. We propose a formal
language to compactly represent concepts by leveraging weighted logical
formulas. In this way, we can model the possible synergies among the qualities that
are relevant for categorising an object under a concept. We show that our proposal
can account for a number of views of concepts such as the prototype theory and the
exemplar theory. Moreover, we show how the proposed model can overcome some
limitations of these cognitive views.
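A minimal Python sketch of the general idea (an illustration with invented qualities and weights, not the paper's formal language) shows how weights on single qualities and on their conjunctions yield graded, synergy-sensitive typicality:

```python
# A concept as a list of (formula, weight) pairs, where a formula is a
# conjunction of qualities encoded as a frozenset. Weights on conjunctions
# capture synergies among qualities.

def satisfies(obj, formula):
    """An object (a set of qualities) satisfies a conjunctive formula
    exactly when it has every quality the formula mentions."""
    return formula <= obj

def typicality(obj, weighted_formulas):
    """Degree of membership: the total weight of satisfied formulas."""
    return sum(w for f, w in weighted_formulas if satisfies(obj, f))

# Toy "pet fish" concept: each quality counts a little on its own, while
# their conjunction carries extra, synergistic weight.
pet_fish = [(frozenset({"lives_in_tank"}), 1.0),
            (frozenset({"small"}), 0.5),
            (frozenset({"lives_in_tank", "small"}), 2.0)]  # synergy term

guppy = frozenset({"lives_in_tank", "small"})
minnow = frozenset({"small"})
print(typicality(guppy, pet_fish))    # 3.5: highly typical
print(typicality(minnow, pet_fish))   # 0.5: marginal member

# Thresholding typicality yields the graded, typicality-sensitive
# categorisation that cognitive views of concepts require.
print(typicality(guppy, pet_fish) >= 2.0)   # True
```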
Modelling contextuality by probabilistic programs with hypergraph semantics
Models of a phenomenon are often developed by examining it under different
experimental conditions, or measurement contexts. The resultant probabilistic
models assume that the underlying random variables, which define a measurable
set of outcomes, can be defined independently of the measurement context. The
phenomenon is deemed contextual when this assumption fails. Contextuality is an
important issue in quantum physics. However, there has been growing speculation
that it manifests outside the quantum realm with human cognition being a
particularly prominent area of investigation. This article contributes the
foundations of a probabilistic programming language that allows convenient
exploration of contextuality in a wide range of applications relevant to
cognitive science and artificial intelligence. Specific syntax is proposed to
allow the specification of "measurement contexts". Each such context delivers a
partial model of the phenomenon based on the associated experimental condition
described by the measurement context. The probabilistic program is translated
into a hypergraph in a modular way. Recent theoretical results from the field
of quantum physics show that contextuality can be equated with the possibility
of constructing a probabilistic model on the resulting hypergraph. The use of
hypergraphs opens the door for a theoretically succinct and efficient
computational semantics sensitive to modelling both contextual and
non-contextual phenomena. Finally, this article raises awareness of
contextuality beyond quantum physics and contributes formal methods to detect
its presence by means of hypergraph semantics.
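The flavour of the hypergraph-based check can be conveyed by a small Python sketch (an illustration in the possibilistic, sheaf-theoretic style, under an assumed encoding rather than the article's semantics): a phenomenon is strongly contextual when no global assignment of outcomes restricts to a possible outcome in every measurement context:

```python
# Each measurement context lists its variables and the outcome tuples
# observed as possible under that experimental condition.
from itertools import product

def global_assignments(variables, outcomes=(0, 1)):
    for values in product(outcomes, repeat=len(variables)):
        yield dict(zip(variables, values))

def strongly_contextual(contexts):
    """contexts: list of (variables, possible_outcome_tuples) pairs."""
    variables = sorted({v for vs, _ in contexts for v in vs})
    for g in global_assignments(variables):
        if all(tuple(g[v] for v in vs) in poss for vs, poss in contexts):
            return False   # a context-independent global model exists
    return True

# The PR box: four pairwise contexts over a, a', b, b'; outcomes satisfy
# x XOR y = 0 except in the (a', b') context, where x XOR y = 1.
def xor_eq(bit):
    return {(x, y) for x in (0, 1) for y in (0, 1) if x ^ y == bit}

pr_box = [(("a", "b"), xor_eq(0)), (("a", "b'"), xor_eq(0)),
          (("a'", "b"), xor_eq(0)), (("a'", "b'"), xor_eq(1))]
print(strongly_contextual(pr_box))   # True: no global section exists
```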
Symbol Emergence in Robotics: A Survey
Humans can learn the use of language through physical interaction with their
environment and semiotic communication with other people. It is very important
to obtain a computational understanding of how humans can form a symbol system
and obtain semiotic skills through their autonomous mental development.
Recently, many studies have been conducted on the construction of robotic
systems and machine-learning methods that can learn the use of language through
embodied multimodal interaction with their environment and other systems.
To understand human social interactions and to develop a robot that can
smoothly communicate with human users in the long term, an understanding of
the dynamics of symbol systems is crucially important. The
embodied cognition and social interaction of participants gradually change a
symbol system in a constructive manner. In this paper, we introduce a field of
research called symbol emergence in robotics (SER). SER is a constructive
approach towards an emergent symbol system. The emergent symbol system is
socially self-organized through both semiotic communications and physical
interactions with autonomous cognitive developmental agents, i.e., humans and
developmental robots. Specifically, we describe some state-of-the-art research
topics concerning SER, e.g., multimodal categorization, word discovery, and
double articulation analysis, that enable a robot to obtain words and their
embodied meanings from raw sensory-motor information, including visual
information, haptic information, auditory information, and acoustic speech
signals, in a totally unsupervised manner. Finally, we suggest future
directions of research in SER.
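As a toy illustration of the multimodal categorization idea (illustrative only, not any surveyed system, with invented features), concatenated per-modality observations can be clustered without labels, letting object categories emerge unsupervised:

```python
# Unsupervised multimodal categorisation by clustering concatenated
# modality features with plain k-means.
import random

def dist2(p, q):
    return sum((a - b) ** 2 for a, b in zip(p, q))

def mean(cluster):
    return tuple(sum(xs) / len(cluster) for xs in zip(*cluster))

def kmeans(points, k, iters=50, seed=0):
    """Assign each point to its nearest centroid, then recompute
    centroids, repeating for a fixed number of iterations."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda j: dist2(p, centroids[j]))
            clusters[nearest].append(p)
        centroids = [mean(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

# Each observation concatenates assumed toy features from three modalities:
# (visual hue, haptic softness, auditory pitch). Two object categories
# emerge without any labels.
observations = [(0.9, 0.1, 0.8), (0.85, 0.15, 0.75),   # hard, rattling objects
                (0.1, 0.9, 0.2), (0.15, 0.85, 0.25)]   # soft, quiet objects
_, clusters = kmeans(observations, k=2)
print(clusters)
```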
AXEL: A framework to deal with ambiguity in three-noun compounds
This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University, 6/12/2010.
Cognitive Linguistics has been widely used to deal with the ambiguity generated by words in combination. Although this domain offers many solutions to address this challenge, not all of them can be implemented in a computational environment. The Dynamic Construal of Meaning framework is argued to have this ability because it describes an intrinsic degree of association of meanings which, in turn, can be translated into computational programs. A limitation towards a computational approach, however, has been the lack of syntactic parameters. This research argues that this limitation could be overcome with the aid of the Generative Lexicon Theory (GLT). Specifically, this dissertation formulated possible means to marry the GLT and Cognitive Linguistics in a novel rapprochement between the two.
This bond between opposing theories provided the means to design a computational template (the AXEL System) by realising syntax and semantics at software levels. An instance of the AXEL system was created using a Design Research approach. Planned iterations were built into the development to improve artefact performance; these iterations progressively improved how the system accounted for the degree of association of meanings in three-noun compounds.
This dissertation delivered three major contributions on the brink of a so-called turning point in Computational Linguistics (CL). First, the AXEL system was used to disclose hidden lexical patterns of ambiguity. These patterns are difficult, if not impossible, to identify without automatic techniques. This research claimed that such patterns can assist linguists in reviewing lexical knowledge from a software-based viewpoint.
Following linguistic awareness, the second result advocated the adoption of improved resources by decreasing the electronic storage space of Sense Enumerative Lexicons (SELs). The AXEL system generated “at the moment of use” interpretations, optimising the space needed for lexical storage.
Finally, this research introduced a subsystem of metrics to characterise the degree of association of ambiguous three-noun compounds, enabling ranking methods. Weighting methods delivered mechanisms for classifying meanings towards Word Sense Disambiguation (WSD). Overall, these results tackle difficulties in the study of Lexical Semantics via software tools.
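For a flavour of such ranking over three-noun compounds, here is a minimal Python sketch of the classic adjacency model for bracketing (an illustration with hypothetical scores standing in for corpus statistics such as PMI, not the AXEL system itself):

```python
# Decide the bracketing of a three-noun compound by comparing the
# association strength of its adjacent noun pairs.

def bracket(n1, n2, n3, score):
    """Left-branching [[n1 n2] n3] if (n1, n2) associate at least as
    strongly as (n2, n3); otherwise right-branching [n1 [n2 n3]]."""
    if score(n1, n2) >= score(n2, n3):
        return ((n1, n2), n3)
    return (n1, (n2, n3))

# Hypothetical association scores for noun pairs.
ASSOC = {("computer", "science"): 8.2, ("science", "department"): 3.1,
         ("plastic", "water"): 0.4, ("water", "bottle"): 6.7}

def score(a, b):
    return ASSOC.get((a, b), 0.0)

print(bracket("computer", "science", "department", score))
# (('computer', 'science'), 'department')  -- left-branching
print(bracket("plastic", "water", "bottle", score))
# ('plastic', ('water', 'bottle'))         -- right-branching
```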