A probabilistic exemplar based model
A central problem in case based reasoning (CBR) is how to store and retrieve cases. One approach to this problem is to use exemplar based models, where only the prototypical cases are stored. However, the development of an exemplar based model (EBM) requires the solution of several problems: (i) how can an EBM be represented? (ii) given a new case, how can a suitable exemplar be retrieved? (iii) what makes a good exemplar? (iv) how can an EBM be learned incrementally?

This thesis develops a new model, called a probabilistic exemplar based model, that addresses these research questions. The model utilizes Bayesian networks to develop a suitable representation and uses probability theory to provide its formal foundations. A probability propagation method is used to retrieve exemplars when a new case is presented and for assessing the prototypicality of an exemplar.

The model learns incrementally by revising the exemplars retained and by updating the conditional probabilities required by the Bayesian network. The problem of ignorance, encountered when only a few cases have been observed, is tackled by introducing the concept of a virtual exemplar to represent all the unseen cases.

The model is implemented in C and evaluated on three datasets. It is also contrasted with related work in CBR and machine learning (ML).
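The abstract describes retrieval by probability propagation over a Bayesian network; as a much simpler stand-in, the following sketch scores exemplars by a naive-Bayes style posterior. All names, probabilities, and the factorized likelihood are hypothetical illustrations, not the thesis's actual model.

```python
# Hypothetical sketch: retrieve the most probable exemplar for a new case.
# The thesis uses full Bayesian-network propagation; here a naive-Bayes
# factorization stands in, with invented feature names and probabilities.

def retrieve_exemplar(case, exemplars, priors, likelihoods):
    """Return the exemplar maximizing the (unnormalized) posterior P(E | case)."""
    best, best_score = None, 0.0
    for e in exemplars:
        score = priors[e]
        for f, v in case.items():
            # Smooth unseen feature values so one gap does not zero the score.
            score *= likelihoods.get((e, f, v), 1e-6)
        if score > best_score:
            best, best_score = e, score
    return best

# Toy data: two exemplars over two binary features.
priors = {"E1": 0.6, "E2": 0.4}
likelihoods = {
    ("E1", "f1", 1): 0.9, ("E1", "f2", 1): 0.8,
    ("E2", "f1", 1): 0.2, ("E2", "f2", 1): 0.3,
}
print(retrieve_exemplar({"f1": 1, "f2": 1}, ["E1", "E2"], priors, likelihoods))
# -> E1  (0.6 * 0.9 * 0.8 = 0.432 beats 0.4 * 0.2 * 0.3 = 0.024)
```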
A New Fundamental Evidence of Non-Classical Structure in the Combination of Natural Concepts
We recently performed cognitive experiments on conjunctions and negations of
two concepts with the aim of investigating the combination problem of concepts.
Our experiments confirmed the deviations (conceptual vagueness, underextension,
overextension, etc.) from the rules of classical (fuzzy) logic and probability
theory observed by several scholars in concept theory, while our data were
successfully modeled in a quantum-theoretic framework developed by ourselves.
In this paper, we isolate a new, very stable and systematic pattern of
violation of classicality that occurs in concept combinations. In addition, the
strength and regularity of this non-classical effect leads us to believe that
it occurs at a more fundamental level than the deviations observed up to now.
It is our opinion that we have identified a deep non-classical mechanism
determining not only how concepts are combined but, rather, how they are
formed. We show that this effect can be faithfully modeled in a two-sector Fock
space structure, and that it can be exactly explained by assuming that human
thought is the superposition of two processes, a 'logical reasoning', guided
by 'logic', and a 'conceptual reasoning' guided by 'emergence', and that the
latter generally prevails over the former. All these findings provide a new
fundamental support to our quantum-theoretic approach to human cognition.
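The deviations from classical probability mentioned above can be made concrete: a classical (Kolmogorovian) model constrains a conjunction's membership weight to the Fréchet bounds. A minimal check, with made-up membership weights standing in for experimental data:

```python
# Illustrative check (not the authors' code): a classical probability model
# requires  max(0, mu_A + mu_B - 1) <= mu(A and B) <= min(mu_A, mu_B).
# Membership weights outside these bounds exhibit over- or underextension.

def classical_conjunction_bounds(mu_a, mu_b):
    return max(0.0, mu_a + mu_b - 1.0), min(mu_a, mu_b)

def is_classical(mu_a, mu_b, mu_ab):
    lo, hi = classical_conjunction_bounds(mu_a, mu_b)
    return lo <= mu_ab <= hi

# Overextension: the conjunction rated above both conjuncts (numbers invented).
print(is_classical(0.5, 0.4, 0.7))  # False: 0.7 > min(0.5, 0.4)
print(is_classical(0.5, 0.4, 0.3))  # True: within [0.0, 0.4]
```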
Using Wittgenstein’s family resemblance principle to learn exemplars
The introduction of the notion of family resemblance represented a major shift in Wittgenstein’s thoughts on the meaning of words, moving away from a belief that words
were well defined, to a view that words denoted less well defined categories of meaning.
This paper presents the use of the notion of family resemblance in the area of machine learning as an example of the benefits that can accrue from adopting the kind of paradigm shift taken by Wittgenstein. The paper presents a model capable of learning exemplars using the principle of family resemblance and adopting Bayesian networks for the representation of exemplars. An empirical evaluation is presented on three data sets and shows promising results, suggesting that previous assumptions about the way we categorize need reopening.
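As a toy illustration of the family resemblance principle the paper builds on (the "games" and features below are invented here): members of a category can overlap pairwise while no single feature is common to all, so no classical definition covers the category.

```python
from itertools import combinations

# Invented toy features: three "games" that resemble each other pairwise,
# although no single feature is shared by all three.
members = {
    "chess":     {"rules", "competition"},
    "solitaire": {"rules", "luck"},
    "roulette":  {"luck", "competition"},
}

common = set.intersection(*members.values())
print(common)  # set() -- no classical defining feature exists

for a, b in combinations(members, 2):
    assert members[a] & members[b]  # yet every pair shares something
```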
Concepts and Their Dynamics: A Quantum-Theoretic Modeling of Human Thought
We analyze different aspects of our quantum modeling approach to human
concepts, and more specifically focus on the quantum effects of contextuality,
interference, entanglement and emergence, illustrating how each of them makes
its appearance in specific situations of the dynamics of human concepts and
their combinations. We point out the relation of our approach, which is based
on an ontology of a concept as an entity in a state changing under influence of
a context, with the main traditional concept theories, i.e. prototype theory,
exemplar theory and theory theory. We consider the question of why quantum
theory performs so well in its modeling of human concepts, and shed light on
this question by analyzing the role of complex amplitudes, showing how they
allow one to describe interference in the statistics of measurement outcomes, while
in the traditional theories statistics of outcomes originates in classical
probability weights, without the possibility of interference. The relevance of
complex numbers, the appearance of entanglement, and the role of Fock space in
explaining contextual emergence, all as unique features of the quantum
modeling, are explicitly revealed in this paper by analyzing human concepts and
their dynamics.
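The role of complex amplitudes mentioned above can be sketched numerically: superposed amplitudes produce an interference term that a classical mixture of probability weights cannot. A small illustration, not drawn from the paper, with invented amplitude values:

```python
import cmath

# Two amplitudes with |a|^2 + |b|^2 = 1, differing only in phase.
a = cmath.rect(0.8, 0.0)
b = cmath.rect(0.6, cmath.pi / 3)

def classical_prob(a, b):
    """Equal-weight classical mixture of outcome probabilities: phases cannot matter."""
    return 0.5 * (abs(a) ** 2 + abs(b) ** 2)

def quantum_prob(a, b):
    """Probability from the superposition (a + b)/sqrt(2): equals the
    classical mixture plus the interference term Re(conj(a) * b)."""
    return abs((a + b) / cmath.sqrt(2)) ** 2

print(classical_prob(a, b))  # 0.5 regardless of phase
print(quantum_prob(a, b))    # about 0.74: 0.5 + 0.8*0.6*cos(pi/3)
```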
Quantum Structure in Cognition, Origins, Developments, Successes and Expectations
We provide an overview of the results we have attained in the last decade on
the identification of quantum structures in cognition and, more specifically,
in the formalization and representation of natural concepts. We firstly discuss
the quantum foundational reasons that led us to investigate the mechanisms of
formation and combination of concepts in human reasoning, starting from the
empirically observed deviations from classical logical and probabilistic
structures. We then develop our quantum-theoretic perspective in Fock space
which allows successful modeling of various sets of cognitive experiments
collected by different scientists, including ourselves. In addition, we
formulate a unified explanatory hypothesis for the presence of quantum
structures in cognitive processes, and discuss our recent discovery of further
quantum aspects in concept combinations, namely, 'entanglement' and
'indistinguishability'. We finally illustrate perspectives for future research.
Heterogeneous Proxytypes Extended: Integrating Theory-like Representations and Mechanisms with Prototypes and Exemplars
The paper introduces an extension of the proposal according to which
conceptual representations in cognitive agents should be intended as heterogeneous
proxytypes. The main contribution of this paper is that it details how
to reconcile, under a heterogeneous representational perspective, different theories
of typicality about conceptual representation and reasoning. In particular, it
provides a novel theoretical hypothesis - as well as a novel categorization algorithm
called DELTA - showing how to integrate the representational and reasoning
assumptions of the theory-theory of concepts with those ascribed to the
prototype- and exemplar-based theories.
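The abstract does not spell out DELTA, so the following is only a plausible skeleton of a heterogeneous categorization cascade in its spirit: prototype matching first, exemplar comparison as fallback, and a theory-like coherence check. All thresholds, function names, and stub data are hypothetical, not the published algorithm.

```python
def categorize(stimulus, prototype_sim, exemplar_sim, theory_check,
               proto_threshold=0.8):
    """Hypothetical cascade in the spirit of heterogeneous proxytypes:
    fast prototype match first, exemplar comparison as fallback, and a
    theory-like coherence check that can veto the assignment."""
    label, sim = prototype_sim(stimulus)
    if sim < proto_threshold:
        label, sim = exemplar_sim(stimulus)
    if not theory_check(stimulus, label):
        return None  # no representation coherently supports the label
    return label

# Stub knowledge sources for a toy run.
proto = lambda s: ("bird", 0.9) if "wings" in s else ("bird", 0.1)
exemplar = lambda s: ("penguin", 0.7)
coherent = lambda s, label: True

print(categorize({"wings", "feathers"}, proto, exemplar, coherent))  # bird
print(categorize({"flippers"}, proto, exemplar, coherent))           # penguin
```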
Enhanced tracking and recognition of moving objects by reasoning about spatio-temporal continuity.
A framework for the logical and statistical analysis and annotation of dynamic scenes containing occlusion and other uncertainties is presented. This framework consists
of three elements: an object tracker module, an object recognition/classification module, and a logical consistency, ambiguity and error reasoning engine. The principle behind the object tracker and object recognition modules is to reduce error by increasing ambiguity (by merging objects in close proximity and presenting multiple
hypotheses). The reasoning engine deals with error, ambiguity and occlusion in a unified framework to produce a hypothesis that satisfies fundamental constraints
on the spatio-temporal continuity of objects. Our algorithm finds a globally consistent model of an extended video sequence that is maximally supported by a voting function based on the output of a statistical classifier. The system results
in an annotation that is significantly more accurate than what would be obtained
by frame-by-frame evaluation of the classifier output. The framework has been implemented
and applied successfully to the analysis of team sports with a single
camera.
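The idea of a track-level voting function over per-frame classifier outputs can be sketched as follows. This is an illustrative majority vote with invented labels, not the authors' actual voting function or consistency engine:

```python
from collections import Counter

def annotate_track(frame_labels):
    """Assign one label to a whole tracked object by majority vote over the
    per-frame classifier outputs, instead of trusting each frame alone."""
    return Counter(frame_labels).most_common(1)[0][0]

# A few noisy frames are outvoted by the consistent majority.
print(annotate_track(["player", "player", "ball", "player",
                      "referee", "player"]))  # -> player
```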
Hume's Legacy: A Cognitive Science Perspective
Hume is an experimental philosopher who attempts to understand why we think, feel, and act as we do. But how should we evaluate the adequacy of his proposals? This chapter examines Hume’s account from the perspective of interdisciplinary work in cognitive science.