5 research outputs found

    Constructivist neural network models of cognitive development

    Get PDF
    In this thesis I investigate the modelling of cognitive development with constructivist neural networks. I argue that the constructivist nature of development, that is, the building of a cognitive system through active interactions with its environment, is an essential property of human development and should be considered in models of cognitive development. I evaluate this claim on the basis of evidence from cortical development, cognitive development, and learning theory. In an empirical evaluation of this claim, I then present a constructivist neural network model of the acquisition of the English past tense and of impaired inflectional processing in German agrammatic aphasics. The model displays a realistic course of acquisition, closely modelling the U-shaped learning curve and more detailed effects such as frequency and family effects. Further, the model develops double dissociations between regular and irregular verbs. I argue that the ability of the model to account for the hu..
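    The abstract does not specify the model's architecture beyond it being constructivist, so the sketch below only illustrates the general idea of a network that builds its own structure during learning: hidden units are recruited one at a time for as long as they keep reducing the training loss. The grow_network and train_output_weights functions, the random candidate units, and the XOR toy data are illustrative assumptions, not the past-tense model described above.

```python
# Minimal sketch of a generic constructive ("growing") neural network.
# NOT the thesis's past-tense model; names, thresholds, and the random
# candidate-unit scheme are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def log_loss(H, w, y):
    p = sigmoid(H @ w)
    return -np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))

def train_output_weights(H, y, lr=0.5, epochs=500):
    """Fit logistic output weights on the current hidden representation H."""
    w = np.zeros(H.shape[1])
    for _ in range(epochs):
        grad = H.T @ (sigmoid(H @ w) - y) / len(y)   # cross-entropy gradient
        w -= lr * grad
    return w

def grow_network(X, y, max_hidden=10, tol=1e-3):
    """Recruit hidden units one at a time while they keep reducing the loss."""
    H = np.hstack([X, np.ones((len(X), 1))])         # start: inputs + bias only
    w = train_output_weights(H, y)
    best = log_loss(H, w, y)
    hidden = []                                      # input weights of recruited units
    for _ in range(max_hidden):
        v = rng.normal(size=X.shape[1])              # candidate unit with fixed random weights
        H_try = np.hstack([H, sigmoid(X @ v)[:, None]])
        w_try = train_output_weights(H_try, y)
        loss = log_loss(H_try, w_try, y)
        if best - loss < tol:                        # no real progress: stop growing
            break
        H, w, best = H_try, w_try, loss
        hidden.append(v)
    return hidden, w

# Toy check on XOR, which a network without hidden units cannot solve
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([0.0, 1.0, 1.0, 0.0])
hidden, w = grow_network(X, y)
print(len(hidden), "hidden units recruited")
```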

    Stepwise Evolutionary Training Strategies for Hardware Neural Networks

    Get PDF
    Analog and mixed-signal implementations of artificial neural networks usually lack an exact numerical model due to the unavoidable device variations introduced during manufacturing and the temporal fluctuations in the internal analog signals. Evolutionary algorithms are particularly well suited for the training of such networks since they do not require detailed knowledge of the system to be optimized. In order to make the best use of the high network speed, fast and simple training approaches are required. Within the scope of this thesis, a stepwise training approach has been devised that allows simple evolutionary algorithms to efficiently optimize the synaptic weights of a fast mixed-signal neural network chip. The training strategy is tested on a set of nine well-known classification benchmarks: the breast cancer, diabetes, heart disease, liver disorder, iris plant, wine, glass, E. coli, and yeast data sets. The obtained classification accuracies are shown to be more than competitive with those achieved by software-implemented neural networks and are comparable to the best results reported in the literature for other classification algorithms on these benchmarks. The presented training method is readily suited to a parallel implementation and is fit for use in conjunction with a specialized coprocessor architecture that speeds up evolutionary algorithms by performing the time-consuming genetic operations within configurable logic. In this way, the proposed strategy can fully benefit from the speed of the neural hardware and thus provides an efficient means for training large networks on the mixed-signal chip for demanding real-world classification tasks.
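    As a rough illustration of the training principle described above (black-box fitness evaluation plus simple genetic operators), here is a hedged sketch of a plain (mu+lambda) evolution strategy optimizing the weights of a small software classifier. In the thesis, the fitness of each candidate weight vector would be measured on the mixed-signal chip itself; the chip, the coprocessor, and the benchmark data sets are not modelled here, and every name and parameter below is an assumption for illustration, not the author's actual training setup.

```python
# Hedged sketch of evolutionary weight training: a plain (mu+lambda) evolution
# strategy with Gaussian mutation. A small software network stands in for the
# hardware; all names and parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)

def accuracy(weights, X, y):
    """Fitness: accuracy of a single-layer network with winner-take-all output."""
    W = weights.reshape(X.shape[1], -1)              # one weight column per class
    return np.mean((X @ W).argmax(axis=1) == y)

def evolve(X, y, n_classes, pop=20, parents=5, sigma=0.3, generations=200):
    dim = X.shape[1] * n_classes
    population = rng.normal(size=(pop, dim))
    for _ in range(generations):
        fitness = np.array([accuracy(ind, X, y) for ind in population])
        elite = population[np.argsort(fitness)[-parents:]]          # keep the best parents
        children = elite[rng.integers(parents, size=pop - parents)]  # clone parents at random
        children = children + rng.normal(scale=sigma, size=children.shape)  # mutate
        population = np.vstack([elite, children])
    fitness = np.array([accuracy(ind, X, y) for ind in population])
    return population[fitness.argmax()], fitness.max()

# Synthetic toy data (a stand-in, not one of the benchmarks above)
X = rng.normal(size=(120, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(int) + (X[:, 2] > 0).astype(int)  # rough classes 0, 1, 2
best, fit = evolve(np.hstack([X, np.ones((120, 1))]), y, n_classes=3)
print(f"training accuracy of best individual: {fit:.2f}")
```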

    MUpstart - A Constructive Neural Network Learning Algorithm for Multi-Category Pattern Classification

    No full text
    Constructive learning algorithms offer an approach for dynamically constructing near-minimal neural network architectures for pattern classification tasks. Several such algorithms proposed in the literature are shown to converge to zero classification errors on finite non-contradictory datasets. However, these algorithms are restricted to two-category pattern classification and (in most cases) they require the input patterns to have binary (or bipolar) valued attributes only. We present a provably correct extension of the Upstart algorithm to handle multiple output classes and real-valued pattern attributes. Results of experiments with several artificial and real-world datasets demonstrate the feasibility of this approach in practical pattern classification tasks and also suggest several interesting directions for future research. 1. Introduction: Multi-layer networks of threshold logic units (also called threshold neurons or TLUs), or multi-layer perceptrons (MLPs), offer an attractive fra..
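    For orientation, here is a much-simplified sketch of the Upstart idea that the abstract builds on: a parent threshold unit is trained, and its remaining "wrongly on" and "wrongly off" patterns become the targets of daughter units that inhibit or excite it through large weights. This toy two-class version does not reproduce MUpstart's multi-category extension, real-valued attribute handling, or convergence proof; the class and function names and the pocket-style trainer are illustrative assumptions.

```python
# Simplified Upstart-style constructive learner for a two-class problem.
# NOT the paper's MUpstart algorithm; names and details are assumptions.
import numpy as np

rng = np.random.default_rng(2)

def train_tlu(X, y, epochs=100):
    """Pocket-style perceptron: keep the weight vector with the fewest errors."""
    w = np.zeros(X.shape[1])
    best_w, best_err = w.copy(), np.inf
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            pred = 1 if X[i] @ w > 0 else 0
            w += (y[i] - pred) * X[i]                # perceptron update
        err = np.sum((X @ w > 0).astype(int) != y)
        if err < best_err:
            best_w, best_err = w.copy(), err
    return best_w

class UpstartNode:
    """A threshold unit whose residual errors are handed to daughter units."""
    def __init__(self, X, y, depth=0, max_depth=4):
        self.w = train_tlu(X, y)
        out = (X @ self.w > 0).astype(int)
        wrongly_on = (out == 1) & (y == 0)
        wrongly_off = (out == 0) & (y == 1)
        self.inhibitor = self.exciter = None
        if depth < max_depth:
            if wrongly_on.any() and not wrongly_on.all():
                # daughter trained to fire exactly on the wrongly-on patterns
                self.inhibitor = UpstartNode(X, wrongly_on.astype(int), depth + 1, max_depth)
            if wrongly_off.any() and not wrongly_off.all():
                self.exciter = UpstartNode(X, wrongly_off.astype(int), depth + 1, max_depth)

    def predict(self, X):
        act = X @ self.w
        big = np.abs(act).max() + 1.0                # daughters override the parent
        if self.inhibitor is not None:
            act = act - big * self.inhibitor.predict(X)
        if self.exciter is not None:
            act = act + big * self.exciter.predict(X)
        return (act > 0).astype(int)

# Toy usage on XOR, with a bias input appended to each pattern
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], float)
y = np.array([0, 1, 1, 0])
net = UpstartNode(X, y)
print(net.predict(X), "vs targets", y)
```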