1,137 research outputs found

    Born to learn: The inspiration, progress, and future of evolved plastic artificial neural networks

    Biological plastic neural networks are systems of extraordinary computational capabilities shaped by evolution, development, and lifetime learning. The interplay of these elements leads to the emergence of adaptive behavior and intelligence. Inspired by such intricate natural phenomena, Evolved Plastic Artificial Neural Networks (EPANNs) use simulated evolution in silico to breed plastic neural networks with a large variety of dynamics, architectures, and plasticity rules: these artificial systems are composed of inputs, outputs, and plastic components that change in response to experiences in an environment. Such systems may autonomously discover novel adaptive algorithms and lead to hypotheses on the emergence of biological adaptation. EPANNs have seen considerable progress over the last two decades. Current scientific and technological advances in artificial neural networks are now setting the conditions for radically new approaches and results. In particular, the limitations of hand-designed networks could be overcome by more flexible and innovative solutions. This paper brings together a variety of inspiring ideas that define the field of EPANNs. The main methods and results are reviewed. Finally, new opportunities and developments are presented.
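    To make the core idea of the abstract concrete, the sketch below shows one common EPANN ingredient: a generalized Hebbian plasticity rule whose coefficients are tuned by a simple evolutionary loop while the network adapts during its "lifetime". This is a minimal illustration, not the specific method of any system reviewed in the paper; the coefficient genome (A, B, C, D), the toy copy-the-input task, and the (mu + lambda)-style loop are assumptions made for the example.

    import numpy as np

    def hebbian_update(w, pre, post, coeffs, eta=0.01):
        # Generalized Hebbian rule: the weight change depends on pre- and
        # post-synaptic activity via coefficients (A, B, C, D) that the
        # evolutionary search tunes.
        A, B, C, D = coeffs
        dw = eta * (A * np.outer(post, pre) + B * pre + C * post[:, None] + D)
        return w + dw

    def lifetime_fitness(coeffs, trials=50):
        # Score one genome by letting a tiny plastic network adapt online to a
        # toy task (reproduce the first input); higher fitness = lower error.
        rng = np.random.default_rng(0)
        w = rng.normal(scale=0.1, size=(1, 2))
        error = 0.0
        for _ in range(trials):
            x = rng.integers(0, 2, size=2).astype(float)
            y = np.tanh(w @ x)
            error += float((x[0] - y[0]) ** 2)
            w = hebbian_update(w, x, y, coeffs)   # plastic change during "life"
        return -error

    # Simple (mu + lambda)-style evolutionary loop over the plasticity coefficients.
    rng = np.random.default_rng(1)
    population = [rng.normal(size=4) for _ in range(20)]
    for _ in range(30):
        parents = sorted(population, key=lifetime_fitness, reverse=True)[:5]
        population = parents + [p + rng.normal(scale=0.1, size=4)
                                for p in parents for _ in range(3)]
    best_rule = max(population, key=lifetime_fitness)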

    An Interclass Margin Maximization Learning Algorithm for Evolving Spiking Neural Network

    This paper presents a new learning algorithm developed for a three-layer spiking neural network for pattern classification problems. The learning algorithm maximizes the interclass margin and is referred to as the two-stage margin maximization spiking neural network (TMM-SNN). In the structure learning stage, the learning algorithm completely evolves the hidden layer neurons in the first epoch. Further, TMM-SNN updates the weights of the hidden neurons for multiple epochs using the newly developed normalized membrane potential learning rule such that the interclass margins (based on the response of hidden neurons) are maximized. The normalized membrane potential learning rule considers both the local information in the spike train generated by a presynaptic neuron and the existing knowledge (synaptic weights) stored in the network to update the synaptic weights. After the first stage, the number of hidden neurons and their parameters are not updated. In the output weights learning stage, TMM-SNN updates the weights of the output layer neurons for multiple epochs to maximize the interclass margins (based on the response of output neurons). The performance of TMM-SNN is evaluated using ten benchmark data sets from the UCI machine learning repository. Statistical performance comparison of TMM-SNN with other existing learning algorithms for SNNs is conducted using the nonparametric Friedman test followed by a pairwise comparison using Fisher's least significant difference method. The results clearly indicate that TMM-SNN achieves better generalization performance in comparison to other algorithms.
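    As a rough illustration of the margin idea (not the paper's normalized membrane potential rule or its two-stage procedure), the Python sketch below defines an interclass margin over output-neuron responses, here summarized as membrane potentials, and a toy update that only adjusts weights when the margin falls below a target. All names and the update rule itself are assumptions made for the example.

    import numpy as np

    def interclass_margin(potentials, true_class):
        # Margin = response of the correct-class neuron minus the strongest
        # response among the other classes (positive means correct and separated).
        correct = potentials[true_class]
        others = np.delete(potentials, true_class)
        return correct - others.max()

    def margin_driven_update(weights, features, potentials, true_class,
                             lr=0.05, target_margin=1.0):
        # Toy margin-maximizing step: if the margin for this sample is below the
        # target, strengthen the correct-class weights and weaken the closest rival.
        margin = interclass_margin(potentials, true_class)
        if margin < target_margin:
            rival = int(np.argmax(np.delete(potentials, true_class)))
            rival = rival if rival < true_class else rival + 1  # undo index shift
            weights[true_class] += lr * features
            weights[rival] -= lr * features
        return weights

    # Usage: 3 output neurons, 5-dimensional spike-count features for one sample.
    rng = np.random.default_rng(0)
    W = rng.normal(scale=0.1, size=(3, 5))
    x = rng.poisson(lam=2.0, size=5).astype(float)   # e.g. spike counts per input
    potentials = W @ x                               # stand-in for membrane potentials
    W = margin_driven_update(W, x, potentials, true_class=1)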

    Memory and information processing in neuromorphic systems

    A striking difference between brain-inspired neuromorphic processors and current von Neumann processor architectures is the way in which memory and processing are organized. As Information and Communication Technologies continue to address the need for increased computational power by increasing the number of cores within a digital processor, neuromorphic engineers and scientists can complement this approach by building processor architectures where memory is distributed with the processing. In this paper we present a survey of brain-inspired processor architectures that support models of cortical networks and deep neural networks. These architectures range from serial clocked implementations of multi-neuron systems to massively parallel asynchronous ones, and from purely digital systems to mixed analog/digital systems which implement more biologically realistic models of neurons and synapses together with a suite of adaptation and learning mechanisms analogous to the ones found in biological nervous systems. We describe the advantages of the different approaches being pursued and present the challenges that need to be addressed for building artificial neural processing systems that can display the richness of behaviors seen in biological systems.
    Comment: Submitted to Proceedings of the IEEE; review of recently proposed neuromorphic computing platforms and systems.
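    The organizational contrast described above, memory distributed with the processing rather than held in a separate store, can be sketched in a few lines. In the hedged toy model below, each neuron "core" keeps its synaptic weights locally and applies a crude local, Hebbian-like update in place, so no shared central memory is ever read or written; the class and the update rule are illustrative assumptions, not a description of any surveyed platform.

    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class NeuronCore:
        # Toy "neuromorphic" core: synaptic weights live next to the neuron state,
        # so integration and plasticity are purely local operations.
        weights: np.ndarray
        threshold: float = 1.0
        potential: float = 0.0

        def step(self, spikes: np.ndarray) -> bool:
            self.potential += float(self.weights @ spikes)   # local memory read
            fired = self.potential >= self.threshold
            if fired:
                self.potential = 0.0
                # Crude Hebbian-like update: potentiate synapses whose inputs
                # spiked when the neuron fired (local memory write, in place).
                self.weights = self.weights + 0.01 * spikes
            return fired

    # A network is just a collection of cores, each owning its own memory.
    rng = np.random.default_rng(0)
    cores = [NeuronCore(weights=rng.normal(scale=0.3, size=8)) for _ in range(4)]
    input_spikes = (rng.random(8) < 0.3).astype(float)
    outputs = [core.step(input_spikes) for core in cores]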

    Can biological quantum networks solve NP-hard problems?

    There is a widespread view that the human brain is so complex that it cannot be efficiently simulated by universal Turing machines. Over the last few decades the question has therefore been raised whether we need to consider quantum effects to explain the imagined cognitive power of a conscious mind. This paper presents a personal view of several fields of philosophy and computational neurobiology in an attempt to suggest a realistic picture of how the brain might work as a basis for perception, consciousness and cognition. The purpose is to be able to identify and evaluate instances where quantum effects might play a significant role in cognitive processes. Not surprisingly, the conclusion is that quantum-enhanced cognition and intelligence are very unlikely to be found in biological brains. Quantum effects may certainly influence the functionality of various components and signalling pathways at the molecular level in the brain network, like ion ports, synapses, sensors, and enzymes. This might influence the functionality of some nodes and perhaps even the overall intelligence of the brain network, but it would hardly give the brain any dramatically enhanced functionality. So, the conclusion is that biological quantum networks can only approximately solve small instances of NP-hard problems. On the other hand, artificial intelligence and machine learning implemented in complex dynamical systems based on genuine quantum networks can certainly be expected to show enhanced performance and quantum advantage compared with classical networks. Nevertheless, even quantum networks can only be expected to efficiently solve NP-hard problems approximately. In the end, it is a question of precision: Nature is approximate.
    Comment: 38 pages.