32 research outputs found

    Attractor Metadynamics in Adapting Neural Networks

    Slow adaptation processes, like synaptic and intrinsic plasticity, abound in the brain and shape the landscape for the neural dynamics occurring on substantially faster timescales. At any given time the network is characterized by a set of internal parameters, which adapt continuously, albeit slowly. This set of parameters defines the number and the locations of the respective adiabatic attractors. The slow evolution of network parameters hence induces an evolving attractor landscape, a process we term attractor metadynamics. We study the metadynamics of the attractor landscape for several continuous-time autonomous model networks. We find both first- and second-order changes in the location of adiabatic attractors and argue that the study of the continuously evolving attractor landscape constitutes a powerful tool for understanding the overall development of the neural dynamics.
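
    As a minimal illustration of the fast/slow separation this abstract describes, the following Python sketch (our own toy model, not the paper's; the equations, parameters, and names are assumptions) integrates a single rate neuron whose threshold adapts slowly and tracks the resulting drift of its adiabatic attractor:

    import numpy as np

    def fast_rhs(x, theta, w=2.0):
        # fast neural dynamics: x' = -x + tanh(w*x - theta)  (toy model)
        return -x + np.tanh(w * x - theta)

    dt, eps = 0.01, 0.001    # eps << 1 separates the slow from the fast timescale
    x, theta = 0.5, -1.0
    attractors = []

    for step in range(50_000):
        x += dt * fast_rhs(x, theta)    # fast relaxation toward the current attractor
        theta += dt * eps * x           # slow adaptation drags the threshold along
        if step % 500 == 0:
            # locate the adiabatic attractor for the *current* theta by relaxing
            # a probe copy of the fast dynamics to convergence
            xa = x
            for _ in range(2000):
                xa += dt * fast_rhs(xa, theta)
            attractors.append(xa)

    # 'attractors' traces the slowly moving fixed point: the attractor metadynamics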

    reservoirpy: A Simple and Flexible Reservoir Computing Tool in Python

    This paper presents reservoirpy, a Python library for the design and training of Reservoir Computing (RC) models, with a particular focus on Echo State Networks (ESNs). The library contains basic building blocks for a large variety of recurrent neural networks defined within the field of RC, along with both offline and online learning rules. Advanced features of the library enable composition of RC building blocks into complex "deep" models, delayed connections between these blocks to convey feedback signals, and user-defined recurrent operators or neuronal connection topologies. The tool is based solely on standard Python scientific packages such as numpy and scipy, and it improves time efficiency through parallelism with the joblib package, making it accessible to a large academic or industrial audience even on a low computational budget. Source code, tutorials, and examples from the RC literature can be found at https://github.com/reservoirpy/reservoirpy, while documentation is available at https://reservoirpy.readthedocs.io/en/latest/?badge=latest.
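
    A hedged usage sketch based on the building blocks named above (the nodes, the ">>" composition operator, and the fit/run methods follow the library's documentation; the data and hyperparameter values are arbitrary examples):

    import numpy as np
    from reservoirpy.nodes import Reservoir, Ridge

    # toy one-step-ahead prediction task on a sine wave
    X = np.sin(np.linspace(0, 20 * np.pi, 2000)).reshape(-1, 1)
    x_train, y_train = X[:-1], X[1:]

    reservoir = Reservoir(units=100, lr=0.3, sr=0.9)   # leak rate and spectral radius
    readout = Ridge(ridge=1e-6)                        # offline ridge-regression readout
    esn = reservoir >> readout                         # ">>" composes nodes into a model

    esn.fit(x_train, y_train, warmup=100)              # offline training of the readout
    y_pred = esn.run(x_train)                          # run the trained model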

    Modeling neural plasticity in echo state networks for time series prediction

    In this paper, we investigate the influence of neural plasticity on the learning performance of echo state networks (ESNs) and of supervised learning algorithms for training readout connections, on two time series prediction problems: the sunspot time series and the Mackey-Glass chaotic system. We implement two plasticity rules expected to improve prediction performance, namely the anti-Oja learning rule and the Bienenstock-Cooper-Munro (BCM) learning rule, each combined with both offline and online learning of the readout connections. Our experimental results demonstrate that neural plasticity enhances learning more significantly with offline than with online readout training.
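
    For concreteness, here is a hedged numpy sketch of the anti-Oja update applied to recurrent reservoir weights, i.e. Oja's rule with the sign of the learning rate flipped (shapes, constants, and the application schedule are our assumptions, not the paper's code):

    import numpy as np

    def anti_oja_step(W, x_pre, x_post, eta=1e-4):
        # Oja:      dW[i, j] = +eta * y_i * (x_j - y_i * W[i, j])
        # anti-Oja: dW[i, j] = -eta * y_i * (x_j - y_i * W[i, j])
        return W - eta * (np.outer(x_post, x_pre) - (x_post ** 2)[:, None] * W)

    rng = np.random.default_rng(0)
    n = 100
    W = rng.normal(scale=1.0 / np.sqrt(n), size=(n, n))
    x = np.zeros(n)

    for t in range(1000):
        u = np.sin(0.1 * t)                  # toy scalar input
        x_new = np.tanh(W @ x + u)           # reservoir state update (toy, no input weights)
        W = anti_oja_step(W, x_pre=x, x_post=x_new)
        x = x_new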

    Online reservoir adaptation by intrinsic plasticity for backpropagation-decorrelation and echo state learning

    Steil JJ. Online reservoir adaptation by intrinsic plasticity for backpropagation-decorrelation and echo state learning. Neural Networks. 2007;20(3):353-364.
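
    The intrinsic-plasticity rule referenced here adapts the gain and bias of each reservoir neuron online so that its output distribution approaches an exponential with mean mu; a hedged numpy sketch of the Triesch-style update used in this line of work follows (variable names and constants are ours):

    import numpy as np

    def ip_step(a, b, x, y, eta=1e-3, mu=0.2):
        # for a fermi neuron y = 1 / (1 + exp(-(a*x + b))), drive the output
        # distribution toward an exponential with mean mu (Triesch's rule)
        db = eta * (1.0 - (2.0 + 1.0 / mu) * y + (y ** 2) / mu)
        da = eta / a + db * x
        return a + da, b + db

    rng = np.random.default_rng(1)
    a, b = np.ones(50), np.zeros(50)            # per-neuron gain and bias
    for _ in range(10_000):
        x = rng.normal(size=50)                 # stand-in for each neuron's net input
        y = 1.0 / (1.0 + np.exp(-(a * x + b)))  # fermi activation
        a, b = ip_step(a, b, x, y)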

    Unveiling the role of plasticity rules in reservoir computing

    Reservoir Computing (RC) is an appealing approach in Machine Learning that combines the high computational capabilities of Recurrent Neural Networks with a fast and easy training method. Moreover, successful implementation of neuro-inspired plasticity rules in RC artificial networks has boosted the performance of the original models. In this manuscript, we analyze the role that plasticity rules play in the changes that lead to better RC performance. To this end, we implement synaptic and non-synaptic plasticity rules in a paradigmatic example of an RC model: the Echo State Network. Testing on nonlinear time series prediction tasks, we show evidence that the improved performance in all plastic models is linked to a decrease of the pairwise correlations in the reservoir, as well as a significant increase in individual neurons' ability to separate similar inputs in their activity space. We provide new insights into this observed improvement through the study of different stages of plastic learning. From the perspective of the reservoir dynamics, optimal performance is found to occur close to the so-called edge of instability. Our results also show that different forms of plasticity (namely synaptic and non-synaptic rules) can be combined to further improve performance on prediction tasks, obtaining better results than those achieved with single-plasticity models.
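
    The two diagnostics this abstract links to improved performance can be computed directly; below is a hedged illustrative sketch (our own code, not the paper's) of the mean pairwise correlation of reservoir states and the spectral radius of the recurrent weights, the latter indicating proximity to the edge of instability:

    import numpy as np

    def mean_pairwise_correlation(states):
        # states: (timesteps, neurons); average absolute off-diagonal correlation
        C = np.corrcoef(states.T)
        off_diag = C[~np.eye(C.shape[0], dtype=bool)]
        return np.abs(off_diag).mean()

    def spectral_radius(W):
        return np.abs(np.linalg.eigvals(W)).max()

    rng = np.random.default_rng(2)
    n = 100
    W = rng.normal(scale=1.0 / np.sqrt(n), size=(n, n))
    x, states = np.zeros(n), []
    for t in range(2000):
        x = np.tanh(W @ x + np.sin(0.1 * t))
        states.append(x.copy())

    print(mean_pairwise_correlation(np.array(states)), spectral_radius(W))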

    Functional identification of biological neural networks using reservoir adaptation for point processes

    The complexity of biological neural networks does not allow one to relate their biophysical properties directly to the dynamics of their electrical activity. We present a reservoir computing approach for functionally identifying a biological neural network, i.e. for building an artificial system that is functionally equivalent to the reference biological network. Employing feed-forward and recurrent networks with fading memory, i.e. reservoirs, we propose a point-process-based learning algorithm to train the internal parameters of the reservoir and the connectivity between the reservoir and the memoryless readout neurons. Specifically, the model is an Echo State Network (ESN) with leaky integrator neurons, whose individual leakage time constants are also adapted. The proposed ESN algorithm learns a predictive model of stimulus-response relations in in vitro and simulated networks, i.e. it models their response dynamics. Receiver Operating Characteristic (ROC) curve analysis indicates that these ESNs can imitate the response signal of a reference biological network. Reservoir adaptation improved the performance of an ESN over readout-only training methods in many cases; this also held for adaptive feed-forward reservoirs, which had no recurrent dynamics. We demonstrate the predictive power of these ESNs on various tasks with cultured and simulated biological neural networks.
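
    As a hedged sketch of the state update of a leaky-integrator ESN with an individual leak rate per neuron (the quantity this work also adapts), in our own notation; the point-process readout and the adaptation of the leaks themselves are omitted:

    import numpy as np

    rng = np.random.default_rng(3)
    n, n_in = 100, 1
    W = rng.normal(scale=1.0 / np.sqrt(n), size=(n, n))
    W_in = rng.normal(size=(n, n_in))
    leak = rng.uniform(0.1, 1.0, size=n)    # per-neuron leak rate a_i in (0, 1]

    def step(x, u):
        # x <- (1 - a) * x + a * tanh(W x + W_in u), element-wise per neuron
        return (1.0 - leak) * x + leak * np.tanh(W @ x + W_in @ u)

    x = np.zeros(n)
    for t in range(1000):
        x = step(x, np.array([np.sin(0.05 * t)]))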