Logic-Based Analogical Reasoning and Learning
Analogy-making is at the core of human intelligence and creativity with
applications to such diverse tasks as commonsense reasoning, learning, language
acquisition, and storytelling. This paper contributes to the foundations of
artificial general intelligence by developing an abstract algebraic framework
for logic-based analogical reasoning and learning in the setting of logic
programming. The main idea is to define analogy in terms of modularity and to
derive abstract forms of concrete programs from a `known' source domain which
can then be instantiated in an `unknown' target domain to obtain analogous
programs. To this end, we introduce algebraic operations for syntactic program
composition and concatenation and illustrate, by giving numerous examples, that
programs have nice decompositions. Moreover, we show how composition gives rise
to a qualitative notion of syntactic program similarity. We then argue that
reasoning and learning by analogy is the task of solving analogical proportions
between logic programs. Interestingly, our work suggests a close relationship
between modularity, generalization, and analogy which we believe should be
explored further in the future. In a broader sense, this paper is a first step
towards an algebraic and mainly syntactic theory of logic-based analogical
reasoning and learning in knowledge representation and reasoning systems, with
potential applications to fundamental AI-problems like commonsense reasoning
and computational learning and creativity.
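The abstraction-and-instantiation idea described above can be sketched in a few lines of Python. This is a minimal illustration, not the paper's framework: the names `Rule`, `abstract`, and `instantiate`, the string-substitution mechanics, and the numbers-to-lists mapping are all assumptions made for the example.

```python
# Hedged sketch: derive an abstract form of a 'known' source logic program,
# then instantiate it in an 'unknown' target domain to obtain an analogous
# program. All names and mechanics here are illustrative assumptions.

from typing import NamedTuple

class Rule(NamedTuple):
    head: str
    body: tuple  # body atoms; empty tuple for facts

def abstract(program, mapping):
    """Replace concrete symbols with placeholders. Keys are applied in
    insertion order, so longer symbols must precede their substrings."""
    def sub(atom):
        for concrete, var in mapping.items():
            atom = atom.replace(concrete, var)
        return atom
    return {Rule(sub(r.head), tuple(sub(a) for a in r.body)) for r in program}

def instantiate(template, binding):
    """Fill placeholders with target-domain symbols."""
    def sub(atom):
        for var, concrete in binding.items():
            atom = atom.replace(var, concrete)
        return atom
    return {Rule(sub(r.head), tuple(sub(a) for a in r.body)) for r in template}

# Source program: addition on naturals built from 0 and successor s.
source = {
    Rule("plus(0,Y,Y)", ()),
    Rule("plus(s(X),Y,s(Z))", ("plus(X,Y,Z)",)),
}

# Abstract the constructors, then rebuild in an analogous list-like domain
# (arity details are glossed over; the point is the shared recursion shape).
template = abstract(source, {"plus": "OP", "s": "F", "0": "E"})
target = instantiate(template, {"OP": "append", "F": "cons", "E": "nil"})
```

The analogy lives in the shared syntactic shape: both programs consist of one base-case fact and one structurally recursive rule.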
Ranking relations using analogies in biological and information networks
Analogical reasoning depends fundamentally on the ability to learn and
generalize about relations between objects. We develop an approach to
relational learning which, given a query set of pairs of objects, measures
how well other pairs A:B fit in with that set. Our work addresses the
following question: is the relation between objects A and B analogous to
those relations found in the query set? Such questions are
particularly relevant in information retrieval, where an investigator might
want to search for analogous pairs of objects that match the query set of
interest. There are many ways in which objects can be related, making the task
of measuring analogies very challenging. Our approach combines a similarity
measure on function spaces with Bayesian analysis to produce a ranking. It
requires data containing features of the objects of interest and a link matrix
specifying which relationships exist; no further attributes of such
relationships are necessary. We illustrate the potential of our method on text
analysis and information networks. An application on discovering functional
interactions between pairs of proteins is discussed in detail, where we show
that our approach can work in practice even if a small set of protein pairs is
provided. Comment: Published at http://dx.doi.org/10.1214/09-AOAS321 in the
Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute
of Mathematical Statistics (http://www.imstat.org).
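The ranking task described above can be illustrated with a toy stand-in. Note the hedge: the paper combines a similarity measure on function spaces with Bayesian analysis, which this sketch does not reproduce; it substitutes a plain cosine similarity on feature-difference vectors, and the difference representation of a pair's relation is itself an assumption.

```python
# Toy stand-in for ranking candidate pairs by how analogous their relation
# is to the relations in a query set. Not the paper's Bayesian procedure.

import math

def rel(features, a, b):
    """Represent the relation A:B as the feature difference (an assumption)."""
    return [x - y for x, y in zip(features[a], features[b])]

def cosine(u, v):
    nu = math.sqrt(sum(x * x for x in u))
    nv = math.sqrt(sum(x * x for x in v))
    if nu == 0 or nv == 0:
        return 0.0
    return sum(x * y for x, y in zip(u, v)) / (nu * nv)

def rank_pairs(features, query_pairs, candidates):
    """Score each candidate by its mean similarity to the query relations."""
    qreps = [rel(features, a, b) for a, b in query_pairs]
    scored = [
        (sum(cosine(rel(features, a, b), q) for q in qreps) / len(qreps), (a, b))
        for a, b in candidates
    ]
    return sorted(scored, reverse=True)

# Toy data: 2-D object features; both query pairs differ by (+1, 0).
feats = {"a": [0, 0], "b": [1, 0], "c": [2, 1], "d": [3, 1], "e": [0, 3]}
query = [("b", "a"), ("d", "c")]   # both relations are (+1, 0)
cands = [("c", "b"), ("e", "a")]   # relations (+1, +1) and (0, +3)
ranking = rank_pairs(feats, query, cands)
```

The pair whose relation best matches the query relations is ranked first; only object features and the set of linked pairs are needed, mirroring the abstract's input requirements.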
Specialization of the rostral prefrontal cortex for distinct analogy processes
Analogical reasoning is central to learning and abstract thinking. It involves using a more familiar situation (source) to make inferences about a less familiar situation (target). According to the predominant cognitive models, analogical reasoning includes 1) generation of structured mental representations and 2) mapping based on structural similarities between them. This study used functional magnetic resonance imaging to specify the role of rostral prefrontal cortex (PFC) in these distinct processes. An experimental paradigm was designed that enabled differentiation between these processes by temporally separating the presentation of the source and the target. Within rostral PFC, a lateral subregion was activated by the analogy task both during study of the source (before the source could be compared with a target) and when the target appeared. This may suggest that this subregion supports fundamental analogy processes, such as generating structured representations of stimuli, but is not specific to one particular processing stage. By contrast, a dorsomedial subregion of rostral PFC showed an interaction between task (analogy vs. control) and period (it was more activated when the target appeared). We propose that this region is involved in comparison or mapping processes. These results add to the growing evidence for functional differentiation between rostral PFC subregions.
Systematicity and surface similarity in the development of analogy
In split-page format (45 pages); includes bibliographical references.
Training neural networks to encode symbols enables combinatorial generalization
Combinatorial generalization - the ability to understand and produce novel
combinations of already familiar elements - is considered to be a core capacity
of the human mind and a major challenge to neural network models. A significant
body of research suggests that conventional neural networks cannot solve this
problem unless they are endowed with mechanisms specifically engineered for the
purpose of representing symbols. In this paper we introduce a novel way of
representing symbolic structures in connectionist terms - the vectors approach
to representing symbols (VARS), which allows training standard neural
architectures to encode symbolic knowledge explicitly at their output layers.
In two simulations, we show that neural networks not only can learn to produce
VARS representations, but in doing so they achieve combinatorial generalization
in their symbolic and non-symbolic output. This adds to other recent work that
has shown improved combinatorial generalization under specific training
conditions, and raises the question of whether specific mechanisms or training
routines are needed to support symbolic processing.
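The general idea of encoding symbolic structures at a network's output layer can be sketched as follows. This follows the abstract only loosely: the slot-concatenation scheme, the role names, and the nearest-code decoding below are assumptions for illustration, not necessarily the VARS scheme itself.

```python
# Hedged sketch of output-layer symbol codes: assign each symbol a fixed
# vector and encode a filled role structure by writing each filler's code
# into its role's slot. The exact scheme (VARS) is the paper's; this is a
# simplified stand-in showing why such codes compose combinatorially.

import random

random.seed(0)
DIM = 4
SYMBOLS = ["circle", "square", "red", "blue"]
codes = {s: [random.gauss(0, 1) for _ in range(DIM)] for s in SYMBOLS}
ROLES = ["shape", "color"]  # fixed role order determines slot order

def encode(structure):
    """Concatenate filler codes in role order into one output vector."""
    out = []
    for role in ROLES:
        out.extend(codes[structure[role]])
    return out

def decode(vector):
    """Read each slot back to the nearest symbol code."""
    structure = {}
    for i, role in enumerate(ROLES):
        slot = vector[i * DIM:(i + 1) * DIM]
        structure[role] = min(
            codes,
            key=lambda s: sum((a - b) ** 2 for a, b in zip(slot, codes[s])),
        )
    return structure

# A novel combination of familiar symbols round-trips correctly, because
# each slot is decoded independently of the others.
v = encode({"shape": "square", "color": "blue"})
assert decode(v) == {"shape": "square", "color": "blue"}
```

Because every role slot is filled and read out independently, any combination of familiar fillers yields a decodable vector, which is the sense of combinatorial generalization at issue in the abstract.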
Proportional algebras, homomorphisms, congruences, and functors
This paper introduces proportional algebras as algebras endowed with the
4-ary analogical proportion relation where the fundamental concepts of
subalgebras, homomorphisms, congruences, and functors are constructed …
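The abstract is truncated here, but a standard concrete instance of a 4-ary analogical proportion relation can illustrate the notion: the integers with the difference-based proportion a : b :: c : d iff a − b = c − d. This example is a common textbook choice, not necessarily the one used in the paper.

```python
# A concrete 4-ary analogical proportion relation on the integers:
# a : b :: c : d  holds iff  a - b == c - d  (arithmetic proportion).
# A standard example for illustration; not necessarily the paper's.

def proportion(a, b, c, d):
    return a - b == c - d

# Properties commonly required of analogical proportions, checked on a
# small range of integers:
assert proportion(2, 4, 6, 8)                                  # 2:4 :: 6:8
assert all(proportion(a, a, b, b)                              # reflexivity
           for a in range(5) for b in range(5))
assert all(proportion(c, d, a, b)                              # symmetry
           for a in range(3) for b in range(3)
           for c in range(3) for d in range(3)
           if proportion(a, b, c, d))
assert all(proportion(a, c, b, d)                              # inner exchange
           for a in range(3) for b in range(3)
           for c in range(3) for d in range(3)
           if proportion(a, b, c, d))
```

Endowing an algebra with such a relation, and asking which maps preserve it, is the kind of construction (subalgebras, homomorphisms, congruences, functors) the abstract announces.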