
    Usage-based Grammar Learning as Insight Problem Solving

    Paper presented at EAPCogSci 2015, the EuroAsianPacific Joint Conference on Cognitive Science (4th European Conference on Cognitive Science and 11th International Conference on Cognitive Science), held in Turin, 25-27 September 2015.

    We report on computational experiments in which a learning agent incrementally acquires grammar from a tutoring agent through situated embodied interactions. The learner is able to detect impasses in routine language processing, such as missing a grammatical construction to integrate a word into the rest of the sentence structure, to move to a meta-level to repair these impasses, primarily based on semantics, and then to expand or restructure its grammar using insights gained from the repairs. The paper proposes a cognitive architecture able to support this kind of insight learning and tests it on a grammar learning task.

    The research reported here was funded by an ICREA Research Fellowship to LS and a Marie Curie Integration Grant EVOLAN. The project received further funding from the European Union's Seventh Framework Programme for research, technological development and demonstration under grant agreement no. 308943, the FET-OPEN Insight project.

    Peer reviewed.
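    The detect-impasse / repair / consolidate cycle described in the abstract can be sketched as a toy loop. This is a minimal illustrative sketch only: the string-keyed "grammar", the situation dictionary, and all function names are assumptions for exposition, not the authors' actual implementation.

    ```python
    # Toy sketch of an impasse-driven learning step: parse, detect what the
    # grammar cannot handle, repair using situation semantics, consolidate.

    def parse(grammar, utterance):
        """Return the words the grammar cannot yet integrate (the impasse)."""
        return [w for w in utterance.split() if w not in grammar]

    def repair(missing, situation):
        """Meta-level repair: hypothesize meanings from the shared situation."""
        return {w: situation[w] for w in missing if w in situation}

    def learn(grammar, utterance, situation):
        """One tutor-learner interaction."""
        missing = parse(grammar, utterance)
        if not missing:
            return grammar                    # routine processing succeeded
        expanded = dict(grammar)
        expanded.update(repair(missing, situation))  # consolidate the insight
        return expanded

    grammar = {"the": "det", "ball": "obj"}
    situation = {"red": "color-red"}
    grammar = learn(grammar, "the red ball", situation)
    # "red" is now routine: a second parse of the same utterance finds no impasse
    ```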

    A vector representation of Fluid Construction Grammar using Holographic Reduced Representations

    Paper presented at EAPCogSci 2015, the EuroAsianPacific Joint Conference on Cognitive Science (4th European Conference on Cognitive Science and 11th International Conference on Cognitive Science), held in Turin, 25-27 September 2015.

    The question of how symbol systems can be instantiated in neural network-like computation is still open. Many technical challenges remain, and most proposals do not scale up to realistic examples of symbol processing, such as language understanding or language production. Here we use a top-down approach. We start from Fluid Construction Grammar (FCG), a well-worked-out framework for language processing that is compatible with recent insights from Construction Grammar, and investigate how we could build a neural compiler that automatically translates grammatical constructions and grammatical processing into neural computations. We proceed in two steps. FCG is translated from symbolic processing to numeric processing using a vector symbolic architecture, and this numeric processing is then translated into neural network computation. Our experiments are still at an early stage but already show promise.

    Research reported in this paper was funded by the Marie Curie ESSENCE ITN and carried out at the AI Lab, Vrije Universiteit Brussel, and the Institut de Biologia Evolutiva (UPF-CSIC), Barcelona, financed by the FET OPEN Insight project and the Marie Curie Integration Grant EVOLAN.

    Peer reviewed.
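    The core operations of Holographic Reduced Representations, the vector symbolic architecture named in the title, are binding by circular convolution and approximate unbinding by circular correlation. A minimal sketch of those two operators, with illustrative names (the role/filler pairing stands in for binding a grammatical slot to a word; the paper's actual encoding of FCG constructions is not shown):

    ```python
    import math
    import random

    def cconv(a, b):
        """Circular convolution: the HRR binding operator."""
        n = len(a)
        return [sum(a[k] * b[(i - k) % n] for k in range(n)) for i in range(n)]

    def ccorr(c, a):
        """Circular correlation: approximate unbinding (inverse of cconv)."""
        n = len(a)
        return [sum(a[k] * c[(i + k) % n] for k in range(n)) for i in range(n)]

    random.seed(0)
    d = 256
    rand_vec = lambda: [random.gauss(0, 1 / math.sqrt(d)) for _ in range(d)]
    role, filler = rand_vec(), rand_vec()   # e.g. a grammatical slot and a word

    trace = cconv(role, filler)             # one fixed-width vector encodes the pair
    recovered = ccorr(trace, role)          # noisy reconstruction of filler

    dot = sum(x * y for x, y in zip(recovered, filler))
    norm = lambda v: math.sqrt(sum(x * x for x in v))
    sim = dot / (norm(recovered) * norm(filler))
    # sim is well above chance (chance is ~0 for random high-dimensional vectors)
    ```

    The key property for symbol processing is that the bound trace has the same dimensionality as its parts, so role-filler structures can be superposed and probed without growing the representation.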

    Arguing by metaphors
