
    Reservoir Computing: Efficient Time Series Processing with ReservoirPy

    National audience. From weather to language, extracting information from data streams is a central challenge in Artificial Intelligence. Reservoir Computing (RC) is particularly well suited to capturing such temporal dynamics. It is a machine learning paradigm for sequential data in which an artificial neural network is only partially trained. A major advantage of these recurrent neural networks is their reduced computational cost and their ability to learn both online and offline. Their applications are highly varied, from the prediction or generation of chaotic series to the discrimination of audio sequences, such as birdsong recognition. We will cover the theoretical aspects of RC: how this "reservoir of computations" works through random projections into high-dimensional spaces, making it akin to temporal Support Vector Machines (SVMs). We will also present ReservoirPy: a Python library that is both simple and efficient, built on the Python scientific stack (NumPy, SciPy, Matplotlib). ReservoirPy focuses on Echo State Networks (ESNs), the best-known instance of RC. It allows the design of architectures from the literature, from the most classical to the most complex. ReservoirPy ships with several learning rules (online and offline), a distributed implementation of ESN training, the ability to create hierarchical networks with complex feedback loops, and tools to assist hyperparameter optimization. Documentation, tutorials, examples and datasets are available on the ReservoirPy GitHub page: github.com/reservoirp
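The "random projections into high-dimensional spaces" mentioned above are the core of the ESN idea: input and recurrent weights are drawn at random and left untrained. A minimal NumPy sketch of a leaky ESN state update (this is not ReservoirPy's API; the sizes, the 0.3 leak rate, and the 0.9 spectral-radius rescaling are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(42)

n_inputs, n_units = 1, 100
# Random, untrained weights: the "reservoir" projects inputs into high dimension.
Win = rng.uniform(-0.5, 0.5, (n_units, n_inputs))
W = rng.uniform(-0.5, 0.5, (n_units, n_units))
# Rescale W so its spectral radius is 0.9, a common echo-state-property heuristic.
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

leak = 0.3  # leak rate of the leaky-integrator neurons

def update(state, u):
    """One leaky ESN state update: x[t+1] = (1-a) x[t] + a tanh(Win u + W x[t])."""
    return (1 - leak) * state + leak * np.tanh(Win @ u + W @ state)

# Run the reservoir over a short sine-wave input.
inputs = np.sin(np.linspace(0, 4 * np.pi, 50)).reshape(-1, 1)
state = np.zeros(n_units)
states = []
for u in inputs:
    state = update(state, u)
    states.append(state)
states = np.array(states)  # shape (50, 100): high-dimensional nonlinear features
```

Only a linear readout from `states` to the targets is then trained, which is what keeps the computational cost low.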

    reservoirpy: A Simple and Flexible Reservoir Computing Tool in Python

    This paper presents reservoirpy, a Python library for the design and training of Reservoir Computing (RC) models, with a particular focus on Echo State Networks (ESNs). The library contains basic building blocks for a large variety of recurrent neural networks defined within the field of RC, along with both offline and online learning rules. Advanced features of the library enable compositions of RC building blocks to create complex "deep" models, delayed connections between these blocks to convey feedback signals, and empower users to create their own recurrent operators or neuronal connection topologies. This tool is based solely on standard Python scientific packages such as numpy and scipy. It improves RC time efficiency through parallelism with the joblib package, making it accessible to a large academic or industrial audience even with a low computational budget. Source code, tutorials and examples from the RC literature can be found at https://github.com/reservoirpy/reservoirpy while documentation can be found at https://reservoirpy.readthedocs.io/en/latest/?badge=lates
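The offline learning rule at the heart of most RC models is a ridge (Tikhonov) regression from reservoir states to targets. A minimal NumPy-only sketch of that step, where the state matrix `X` is a random stand-in (in an actual ESN it would come from running the reservoir over the input series), and the 1e-6 regularization is an illustrative choice:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in reservoir states X (T timesteps, N units) and targets Y (T, 1).
T, N = 200, 50
X = np.tanh(rng.normal(size=(T, N)))
Y = X @ rng.normal(size=(N, 1)) + 0.01 * rng.normal(size=(T, 1))

# Offline readout training = ridge regression:
#   Wout = (X^T X + ridge * I)^-1 X^T Y
ridge = 1e-6
Wout = np.linalg.solve(X.T @ X + ridge * np.eye(N), X.T @ Y)

pred = X @ Wout                     # readout predictions
mse = np.mean((pred - Y) ** 2)      # training error
```

Because this reduces training to one linear solve, the expensive part, collecting states, parallelizes naturally across independent input sequences, which is what joblib-based parallelism exploits.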

    ReservoirPy: Efficient Training of Recurrent Neural Networks for Timeseries Processing

    International audience. ReservoirPy is a simple, user-friendly library based on Python scientific modules. It provides a flexible interface to implement efficient Reservoir Computing (RC) [2] architectures with a particular focus on Echo State Networks (ESNs) [1]. Advanced features of ReservoirPy improve computation time on a simple laptop compared to a basic Python implementation. Its features include offline and online training, a parallel implementation, sparse matrix computation, fast spectral initialization, and advanced learning rules (e.g. Intrinsic Plasticity). It also makes it possible to easily create complex architectures with multiple reservoirs (e.g. deep reservoirs), readouts, and complex feedback loops. Moreover, graphical tools are included to easily explore hyperparameters with the help of the hyperopt library. It includes several tutorials exploring exotic architectures and examples reproducing results from scientific papers.

    Canary Song Decoder: Transduction and Implicit Segmentation with ESNs and LSTMs

    International audience. Domestic canaries produce complex vocal patterns embedded in various levels of abstraction. Studying such temporal organization is of particular relevance to understanding how animal brains represent and process vocal inputs such as language. However, this requires a large amount of annotated data. We propose a fast and easy-to-train transducer model based on RNN architectures to automate parts of the annotation process. This is similar to a speech recognition task. We demonstrate that RNN architectures can be efficiently applied to spectral features (MFCC) to annotate songs at the time-frame level and at the phrase level. We achieved around 95% accuracy at frame level on particularly complex canary songs, and ESNs achieved around 5% word error rate (WER) at phrase level. Moreover, we are able to build this model using only around 13 to 20 minutes of annotated songs. Training takes only 35 seconds using 2 hours and 40 minutes of data for the ESN, allowing experiments to be run quickly without the need for powerful hardware.

    Which Hype for my New Task? Hints and Random Search for Reservoir Computing Hyperparameters

    International audience. In learning systems, hyperparameters are parameters that are not learned but need to be set a priori. In Reservoir Computing, several parameters need to be set a priori depending on the task. Newcomers to Reservoir Computing often lack intuition about which hyperparameters to tune and how to tune them. For instance, beginners often explore the reservoir sparsity, but in practice this parameter has little influence on performance for ESNs. Most importantly, many authors keep doing suboptimal hyperparameter searches: using grid search to explore more than two hyperparameters, while restraining the spectral radius to be below unity. In this short paper, we give suggestions and intuitions, and present a general method to find robust hyperparameters while understanding their influence on performance. We also provide a graphical interface (included in ReservoirPy) to make this hyperparameter search more intuitive. Finally, we discuss some potential refinements of the proposed method.
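The random-search strategy the paper advocates (instead of grid search, and without capping the spectral radius at 1) can be sketched in plain Python. The toy loss below is a hypothetical stand-in for an actual train-and-validate run, and the sampling ranges are illustrative assumptions, not the paper's prescriptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def evaluate(sr, lr):
    """Hypothetical validation loss; in practice, train an ESN with
    spectral radius sr and leak rate lr, and return its validation error."""
    return (sr - 1.2) ** 2 + (lr - 0.3) ** 2  # toy loss, optimum at (1.2, 0.3)

# Random search: sample the spectral radius log-uniformly (note it may
# exceed 1) and the leak rate uniformly, rather than walking a coarse grid.
n_trials = 200
best = (None, np.inf)
for _ in range(n_trials):
    sr = 10 ** rng.uniform(-1, 1)   # spectral radius in [0.1, 10)
    lr = rng.uniform(0.0, 1.0)      # leak rate in [0, 1)
    score = evaluate(sr, lr)
    if score < best[1]:
        best = ((sr, lr), score)

(best_sr, best_lr), best_score = best
```

Random search covers a d-dimensional space far more evenly than a grid with the same budget, and log-uniform sampling reflects that hyperparameters like the spectral radius matter on a multiplicative scale.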

    Create Efficient and Complex Reservoir Computing Architectures with ReservoirPy

    International audience. Reservoir Computing (RC) is a family of recurrent neural networks (RNNs) in which learning is restricted to the output weights. RC models are often seen as temporal Support Vector Machines (SVMs) for the way they project inputs onto dynamic, non-linear, high-dimensional representations. This paradigm, mainly represented by Echo State Networks (ESNs), has been successfully applied to a wide variety of tasks, from time series forecasting to sequence generation. It offers a de facto fast, simple, yet efficient way to train RNNs. We present in this paper a library that facilitates the creation of RC architectures, from the simplest to the most complex, based on the Python scientific stack (NumPy, SciPy). This library offers memory- and time-efficient implementations for both online and offline training paradigms, such as FORCE learning and parallel ridge regression. The flexibility of the API allows ESNs to be quickly designed from re-usable and customizable components. It enables building models such as DeepESNs as well as other advanced architectures with complex connectivity between multiple reservoirs and feedback loops. Extensive documentation and tutorials for both newcomers and experts are provided through the GitHub and ReadTheDocs websites. The paper introduces the main concepts supporting the library, illustrated with code examples covering popular RC techniques from the literature. We argue that such a flexible, dedicated library will ease the creation of more advanced architectures while guaranteeing their correct implementation and reproducibility across the RC community.
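FORCE learning, the online paradigm mentioned above, adapts the readout at every timestep via recursive least squares (RLS). A self-contained NumPy sketch of the RLS readout update, using random surrogate states rather than a real reservoir (this is not the library's implementation; sizes and the 1e-6 initialization constant are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

N = 50                       # reservoir units (states below are surrogates)
Wout = np.zeros((1, N))      # readout weights, trained online
P = np.eye(N) / 1e-6         # running estimate of the inverse state correlation

def rls_step(Wout, P, x, y_target):
    """One recursive-least-squares update, the core of FORCE learning:
    correct Wout at every timestep from the instantaneous error."""
    x = x.reshape(-1, 1)
    Px = P @ x
    k = Px / (1.0 + x.T @ Px)        # gain vector
    err = Wout @ x - y_target        # error before the update
    Wout = Wout - err * k.T          # error-driven correction of the readout
    P = P - k @ Px.T                 # update the inverse correlation estimate
    return Wout, P

# Drive the update with surrogate states and a linear target to recover.
w_true = rng.normal(size=(1, N))
for _ in range(300):
    x = np.tanh(rng.normal(size=N))
    y = w_true @ x.reshape(-1, 1)
    Wout, P = rls_step(Wout, P, x, y)

final_err = np.linalg.norm(Wout - w_true)
```

Unlike the batch ridge solve, this update needs no stored history, which is what makes online ("en ligne") training of running systems possible.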

    What does the Canary Say? Low-Dimensional GAN Applied to Birdsong

    The generation of speech, and more generally of complex animal vocalizations, by artificial systems is a difficult problem. Generative Adversarial Networks (GANs) have shown very good abilities at generating images, and more recently sounds. While current GANs have high-dimensional latent spaces, complex vocalizations could in principle be generated through a low-dimensional latent space, easing the visualization and evaluation of latent representations. In this study, we test the ability of a previously developed GAN, called WaveGAN, to reproduce canary syllables while drastically reducing the latent space dimension. We trained WaveGAN on a large dataset of canary syllables (16,000 renditions of 16 different syllable types) and varied the latent space dimension from 1 to 6. The sounds produced by the generator are evaluated using an RNN-based classifier. This quantitative evaluation is paired with a qualitative evaluation of the GAN productions across training epochs and latent dimensions. Altogether, our results show that a 3-dimensional latent space is enough to produce all syllable types in the repertoire with a quality often indistinguishable from real canary vocalizations. Importantly, we show that the 3-dimensional GAN generalizes by interpolating between the various syllable types. We rely on UMAP [1] to qualitatively show the similarities between training and generated data, and between the generated syllables and the interpolations produced. We discuss how our study may provide tools to train simple models of vocal production and/or learning. Indeed, while the RNN-based classifier provides a biologically realistic representation of the auditory network processing vocalizations, the small-dimensional GAN may be used for the production of complex vocal repertoires.

