3 research outputs found

    Constructivist neural network models of cognitive development

    In this thesis I investigate the modelling of cognitive development with constructivist neural networks. I argue that the constructivist nature of development, that is, the building of a cognitive system through active interactions with its environment, is an essential property of human development and should be considered in models of cognitive development. I evaluate this claim on the basis of evidence from cortical development, cognitive development, and learning theory. In an empirical evaluation of this claim, I then present a constructivist neural network model of the acquisition of the English past tense and of impaired inflectional processing in German agrammatic aphasics. The model displays a realistic course of acquisition, closely modelling the U-shaped learning curve and more detailed effects such as frequency and family effects. Further, the model develops double dissociations between regular and irregular verbs. I argue that the ability of the model to account for the hu…
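The core constructivist idea above, a network that builds its own structure as learning demands it, can be sketched in a few lines. The following is a minimal illustrative example, not the thesis's model: a small radial-basis-function network that recruits a new hidden unit centered on its worst-fitting training example whenever error remains above tolerance, shown here learning XOR. All function names and parameter choices (width, tolerance) are assumptions made for the sketch.

```python
import numpy as np

def grow_rbf_network(X, y, tol=1e-6, width=0.5, max_units=10):
    """Constructively grow a Gaussian RBF network: while training error
    exceeds tol, add a unit centered on the worst-fit example and refit
    the output weights by least squares."""
    centers = []
    w, b = np.zeros(0), 0.0

    def features(Xq):
        if not centers:
            return np.zeros((len(Xq), 0))
        C = np.asarray(centers)
        d2 = ((Xq[:, None, :] - C[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * width ** 2))

    for _ in range(max_units):
        err = y - (features(X) @ w + b)
        if np.max(np.abs(err)) < tol:
            break                                    # capacity suffices: stop growing
        centers.append(X[np.argmax(np.abs(err))])    # recruit a unit at the worst point
        Phi = np.hstack([features(X), np.ones((len(X), 1))])
        sol, *_ = np.linalg.lstsq(Phi, y, rcond=None)
        w, b = sol[:-1], sol[-1]
    return centers, w, b, features

# XOR: unsolvable by a single fixed linear unit, learnable by growth.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])
centers, w, b, features = grow_rbf_network(X, y)
pred = features(X) @ w + b
```

The design choice to drive growth by the current worst-case error is one simple stand-in for the experience-driven structure-building the thesis argues for; real constructivist models use richer recruitment criteria.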

    Constructive learning: Inducing grammars and neural networks

    This dissertation focuses on two important areas of machine learning research: regular grammar inference and constructive neural network learning algorithms.

    Regular grammar inference is the process of learning a target regular grammar, or equivalently a deterministic finite state automaton (DFA), from labeled examples. We focus on the design of efficient algorithms for learning DFA where the learner is provided with a representative set of examples for the target concept and may additionally be guided by a teacher who answers membership queries. DFA learning algorithms typically map a given structurally complete set of examples to a lattice of finite state automata. Explicit enumeration of this lattice is practically infeasible. We propose a framework for implicitly representing the lattice as a version space and design a provably correct search algorithm for identifying the target DFA. Incremental or online learning algorithms are important in scenarios where not all training examples are available to the learner at the start. We develop a provably correct polynomial-time incremental algorithm for learning DFA from labeled examples and membership queries. PAC learnability of DFA under restricted classes of distributions is an open research problem. We solve this problem by proving that DFA are efficiently PAC learnable under the class of simple distributions.

    Constructive neural network learning algorithms offer an interesting approach for the incremental construction of near-minimal neural network architectures for pattern classification and inductive knowledge acquisition. Existing constructive learning algorithms were designed for two-category pattern classification and assumed that patterns have binary (or bipolar) valued attributes. We propose a framework for extending constructive learning algorithms to handle multiple output classes and real-valued attributes. Further, through carefully designed experimental studies we attempt to characterize the inductive bias of these algorithms. Owing to limited training time and inherent representational bias, these algorithms tend to construct networks with redundant elements. We develop pruning strategies for eliminating redundant neurons in MTiling-based constructive networks. Experimental results show that pruning brings about a modest to significant reduction in network size. Finally, we demonstrate the applicability of constructive learning algorithms to connectionist theory refinement.
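The common starting point for DFA-inference algorithms like those described above is the prefix-tree acceptor (PTA) built from a labeled sample; the lattice the abstract mentions is the space of automata obtained by merging PTA states. The sketch below shows only this starting construction plus membership evaluation, under assumed representations (states as integers, transitions as a dict); it is not the dissertation's version-space search algorithm.

```python
def build_pta(samples):
    """Build a prefix-tree acceptor from (string, label) pairs,
    where label True means the string is a positive example."""
    trans = {}           # (state, symbol) -> state
    accepting = set()
    rejecting = set()
    next_state = 1       # state 0 is the start state
    for s, label in samples:
        q = 0
        for ch in s:
            if (q, ch) not in trans:
                trans[(q, ch)] = next_state   # grow a fresh branch of the tree
                next_state += 1
            q = trans[(q, ch)]
        (accepting if label else rejecting).add(q)
    return trans, accepting, rejecting

def accepts(trans, accepting, s):
    """Run the automaton on s; undefined transitions reject."""
    q = 0
    for ch in s:
        if (q, ch) not in trans:
            return False
        q = trans[(q, ch)]
    return q in accepting

# A toy structurally complete sample over {a, b}.
samples = [("ab", True), ("abab", True), ("a", False), ("b", False)]
trans, acc, rej = build_pta(samples)
```

A state-merging search would then explore quotients of this PTA, using the negative states (and membership queries, where a teacher is available) to rule out merges that over-generalize.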
