    Statistical physics of neural systems with non-additive dendritic coupling

    How neurons process their inputs crucially determines the dynamics of biological and artificial neural networks. In such neural and neural-like systems, synaptic input is typically considered to be merely transmitted linearly or sublinearly by the dendritic compartments. Yet single-neuron experiments report pronounced supralinear dendritic summation of sufficiently synchronous and spatially nearby inputs. Here, we provide a statistical physics approach to study the impact of such non-additive dendritic processing on single-neuron responses and on the performance of associative memory tasks in artificial neural networks. First, we compute the effect of random input to a neuron incorporating nonlinear dendrites. This approach is independent of the details of the neuronal dynamics. Second, we use those results to study the impact of dendritic nonlinearities on the network dynamics in a paradigmatic model of associative memory, both numerically and analytically. We find that dendritic nonlinearities maintain network convergence and increase the robustness of memory performance against noise. Interestingly, an intermediate number of dendritic branches is optimal for memory functionality.
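    To make the non-additive summation concrete, here is a minimal Python sketch (not the paper's model; the branch threshold, gain, and input sizes are illustrative assumptions) of a neuron whose synaptic input is pooled within dendritic branches, each applying a supralinear transfer function, before the soma sums the branch outputs:

```python
import numpy as np

def branch_nonlinearity(x, theta=1.0, gain=3.0):
    """Supralinear dendritic transfer: input below theta passes linearly,
    synchronous input exceeding theta is amplified (a 'dendritic spike').
    theta and gain are assumed, illustrative values."""
    return np.where(x < theta, x, theta + gain * (x - theta))

def dendritic_neuron(inputs, n_branches=8):
    """Distribute synaptic inputs over branches, apply the branch
    nonlinearity per branch, and sum the branch outputs at the soma."""
    branches = np.array_split(inputs, n_branches)
    return sum(branch_nonlinearity(b.sum()) for b in branches)

# The same total input drives the neuron much harder when it arrives
# clustered on one branch than when it is spread across all branches.
clustered = np.zeros(64); clustered[:8] = 0.5   # co-located, synchronous
dispersed = np.full(64, 0.5 * 8 / 64)           # same total, spread out
print(dendritic_neuron(clustered), dendritic_neuron(dispersed))
```

    With these assumed values the clustered input crosses the branch threshold and is amplified (output 10.0), while the identical total input spread across branches sums linearly (output 4.0).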

    The Performance of Associative Memory Models with Biologically Inspired Connectivity

    This thesis is concerned with one important question in artificial neural networks: how the biologically inspired connectivity of a network affects its associative memory performance. In recent years, research on the mammalian cerebral cortex, which bears the main responsibility for associative memory function in the brain, suggests that this cortical network is far from fully connected, as is commonly assumed in traditional associative memory models. It is instead a sparse network with distinctive “small world network” characteristics: short Mean Path Length, high Clustering Coefficient, and high Global and Local Efficiency. Most of the networks in this thesis are therefore sparsely connected. There is, however, no conclusive evidence of how these different connectivity characteristics affect the associative memory performance of a network. This thesis addresses this question using networks with different types of connectivity inspired by biological evidence. The findings of this programme are unexpected and important. The performance of a non-spiking associative memory model is found to correlate linearly with the Clustering Coefficient of the network, regardless of the detailed connectivity pattern, so performance can be predicted from this single measure. This is particularly important because the Clustering Coefficient is a static measure of one aspect of connectivity, whilst associative memory performance reflects the result of a complex dynamic process. This research also reveals that improvements in the performance of a network do not necessarily require an increase in the network’s wiring cost; it is therefore possible to construct networks with high associative memory performance but relatively low wiring cost. In particular, Gaussian distributed connectivity is found to achieve the best performance with the lowest wiring cost among all examined connectivity models. The results further suggest that a modular network with an appropriate configuration of Gaussian distributed connectivity, both internal to each module and across modules, can perform nearly as well as the Gaussian distributed non-modular network. Finally, a comparison between non-spiking and spiking associative memory models suggests that, in terms of associative memory performance, the implications of connectivity transcend the details of the actual neural model, that is, whether the neurons are spiking or non-spiking.
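    As a hedged illustration of the static measure involved (this is not the thesis code; networkx, the node count, density, and the Gaussian width sigma are all assumptions), one can compare the Clustering Coefficient of a uniformly random network with that of a spatially local, Gaussian distributed one at the same connection density:

```python
import networkx as nx
import numpy as np

rng = np.random.default_rng(1)
n, k, sigma = 200, 10, 8.0        # neurons, mean degree, locality scale

# Uniform random connectivity (Erdos-Renyi graph with matching density).
random_net = nx.gnm_random_graph(n, n * k // 2, seed=1)

# Gaussian connectivity: wiring probability decays with distance on a
# ring of neurons, so connections (and hence wiring cost) stay local.
gaussian_net = nx.Graph()
gaussian_net.add_nodes_from(range(n))
while gaussian_net.number_of_edges() < n * k // 2:
    i = int(rng.integers(n))
    j = int(i + rng.normal(0, sigma)) % n
    if i != j:
        gaussian_net.add_edge(i, j)

print("random   CC:", nx.average_clustering(random_net))
print("gaussian CC:", nx.average_clustering(gaussian_net))
```

    At identical density, the spatially local network typically shows a markedly higher Clustering Coefficient, which is the quantity the thesis finds to predict associative memory performance.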

    Design space exploration of associative memories using spiking neurons with respect to neuromorphic hardware implementations

    Stöckel A. Design space exploration of associative memories using spiking neurons with respect to neuromorphic hardware implementations. Bielefeld: Universität Bielefeld; 2016. Artificial neural networks are well-established models for key functions of biological brains, such as low-level sensory processing and memory. In particular, networks of artificial spiking neurons emulate the time dynamics, high parallelism and asynchronicity of their biological counterparts. Large-scale hardware simulators for such networks – neuromorphic computers – are being developed as part of the Human Brain Project, with the ultimate goal of gaining insight into the neural foundations of cognitive processes. In this thesis, we focus on one key cognitive function of biological brains: associative memory. We implement the well-understood Willshaw model with artificial spiking neural networks, thoroughly explore the design space of the implementation, provide fast design-space exploration software, and evaluate our implementation both in software simulation and on neuromorphic hardware. We thereby provide an approach to manually or automatically infer viable parameters for an associative memory on different hardware and software platforms. The performance of the associative memory was found to vary significantly between individual neuromorphic hardware platforms and numerical simulations. The network is thus a suitable benchmark for neuromorphic systems.
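    For readers unfamiliar with the Willshaw model, the following minimal non-spiking sketch shows the binary storage rule and threshold recall that the thesis maps onto spiking neurons (pattern sizes and the pattern count are assumptions, not the thesis's parameters):

```python
import numpy as np

# Willshaw associative memory: binary weights, sparse binary patterns.
n, m, c = 256, 256, 8            # input size, output size, active bits
rng = np.random.default_rng(0)

def random_pattern(size, active):
    p = np.zeros(size, dtype=bool)
    p[rng.choice(size, active, replace=False)] = True
    return p

# Storage: W is the element-wise OR of the outer products of all
# stored (input, output) pattern pairs.
pairs = [(random_pattern(n, c), random_pattern(m, c)) for _ in range(50)]
W = np.zeros((n, m), dtype=bool)
for x, y in pairs:
    W |= np.outer(x, y)

def recall(cue):
    """Sum the weight rows selected by the cue; an output unit is active
    if its drive reaches the number of active cue bits. In the spiking
    implementation this comparison becomes the neuron's firing threshold."""
    drive = cue.astype(int) @ W.astype(int)
    return drive >= cue.sum()

x0, y0 = pairs[0]
print("correctly recalled bits:", int((recall(x0) == y0).sum()), "of", m)
```

    At this low memory load, recall of a stored pair is essentially perfect; pushing the pattern count higher produces the false-positive errors that make the model a useful benchmark.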

    Experimental demonstration of associative memory with memristive neural networks

    When someone mentions the name of a known person, we immediately recall her face and possibly many other traits. This is because we possess the so-called associative memory - the ability to correlate different memories to the same fact or event. Associative memory is such a fundamental and encompassing human ability (and not just a human one) that the network of neurons in our brain must perform it quite easily. The question is then whether electronic neural networks - electronic schemes that act somewhat similarly to human brains - can be built to perform this type of function. Although the field of neural networks has developed for many years, a key element, namely the synapse between adjacent neurons, has lacked a satisfactory electronic representation. The reason is that a passive circuit element able to reproduce synapse behaviour needs to remember its past dynamical history, store a continuous set of states, and be "plastic" according to the pre-synaptic and post-synaptic neuronal activity. Here we show that all of this can be accomplished by a memory-resistor (memristor for short). In particular, using simple and inexpensive off-the-shelf components, we have built a memristor emulator which realizes all the required synaptic properties. Most importantly, we have demonstrated experimentally the formation of associative memory in a simple neural network consisting of three electronic neurons connected by two memristor-emulator synapses. This experimental demonstration opens up new possibilities for understanding neural processes using memory devices, an important step towards reproducing complex learning, adaptive and spontaneous behaviour with electronic neural networks.
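    As a rough, purely illustrative simulation of the idea (not the authors' circuit; the signal names, learning rate, and thresholds below are invented), two memristive synapses with a coincidence-driven conductance update reproduce the Pavlovian association that the three-neuron experiment demonstrates:

```python
import numpy as np

# Two synaptic conductances feeding one output neuron: (food, bell).
g = np.array([1.0, 0.1])      # food synapse strong, bell synapse weak
eta, theta = 0.3, 0.5         # assumed learning rate, firing threshold

def step(pre, g):
    """One time step: the output fires if the weighted input crosses
    theta; coincident pre/post activity potentiates the memristive
    synapse, with conductance bounded as in a physical device."""
    post = float(pre @ g >= theta)
    g = np.clip(g + eta * post * pre, 0.0, 1.0)
    return post, g

# Phase 1: the bell alone does not trigger the output (weak synapse).
print(step(np.array([0.0, 1.0]), g)[0])          # -> 0.0

# Phase 2: pair food with bell; the output fires (driven by the food
# input) and the bell synapse is strengthened on each coincidence.
for _ in range(3):
    _, g = step(np.array([1.0, 1.0]), g)

# Phase 3: after training, the bell alone evokes the response.
print(step(np.array([0.0, 1.0]), g)[0])          # -> 1.0
```

    The memristor's role is precisely the bounded, history-dependent conductance g: it remembers past activity, stores a continuum of states, and changes plastically with pre- and post-synaptic coincidence.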

    Sparse neural networks with large learning diversity

    Coded recurrent neural networks with three levels of sparsity are introduced. The first level is related to the size of the messages, which is much smaller than the number of available neurons. The second is provided by a particular coding rule that acts as a local constraint on the neural activity. The third is a characteristic of the low final connection density of the network after the learning phase. Though the proposed network is very simple, being based on binary neurons and binary connections, it is able to learn a large number of messages and recall them, even in the presence of strong erasures. The performance of the network is assessed both as a classifier and as an associative memory.
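    A minimal sketch of the coding idea under assumed details (cluster counts, sizes, and the message count are invented): each message activates exactly one neuron per cluster, learning stores the message as a binary clique, and erased clusters are recovered by a winner-take-all vote over the stored connections:

```python
import numpy as np
from itertools import combinations

# Assumed shape: C clusters of L binary neurons; a message activates
# exactly one neuron per cluster (the local coding constraint).
C, L = 4, 16
rng = np.random.default_rng(0)
W = np.zeros((C * L, C * L), dtype=bool)   # binary connections

def store(message):
    """Store a message as a clique: connect its C active units pairwise."""
    units = [c * L + int(message[c]) for c in range(C)]
    for a, b in combinations(units, 2):
        W[a, b] = W[b, a] = True

messages = [rng.integers(L, size=C) for _ in range(30)]
for msg in messages:
    store(msg)

def recall(partial):
    """Recover erased clusters (None): in each erased cluster, pick the
    neuron with the most stored connections to the known units."""
    known = [c * L + int(v) for c, v in enumerate(partial) if v is not None]
    return [int(v) if v is not None
            else int(np.argmax(W[known, c * L:(c + 1) * L].sum(axis=0)))
            for c, v in enumerate(partial)]

stored = [int(v) for v in messages[0]]
print(recall(stored[:2] + [None, None]), "vs", stored)
```

    With only 30 stored messages the cliques rarely overlap, so two known clusters usually suffice to recover the rest; the low edge density left after learning is the third sparsity level mentioned above.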