The spectral radius remains a valid indicator of the echo state property for large reservoirs
In the field of Reservoir Computing, scaling the spectral radius of the weight matrix of a random recurrent neural network to below unity is a commonly used method to ensure the Echo State Property. Recently it has been shown that this condition is too weak. To overcome this problem, other, more involved, sufficient conditions for the Echo State Property have been proposed. In this paper we provide a large-scale experimental verification of the Echo State Property for large recurrent neural networks with zero input and zero bias. Our main conclusion is that the spectral radius method remains a valid indicator of the Echo State Property: the probability that the Echo State Property does not hold drops for larger networks with spectral radius below unity, which are the ones of practical interest.
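The scaling method the abstract refers to can be sketched in a few lines: rescale a random reservoir matrix to a target spectral radius below unity, then empirically probe the Echo State Property with zero input and zero bias by checking that two trajectories started from different initial states converge. This is a minimal illustration with assumed sizes and seeds, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500                      # reservoir size (illustrative)
target_rho = 0.9             # spectral radius below unity

# Random reservoir matrix, rescaled to the target spectral radius.
W = rng.standard_normal((n, n)) / np.sqrt(n)
W *= target_rho / max(abs(np.linalg.eigvals(W)))

# Zero-input, zero-bias updates from two different initial states.
x_a = rng.standard_normal(n)
x_b = rng.standard_normal(n)
for _ in range(200):
    x_a = np.tanh(W @ x_a)
    x_b = np.tanh(W @ x_b)

# When the Echo State Property holds, the two trajectories coincide.
print(np.linalg.norm(x_a - x_b))
```

As the abstract notes, spectral radius below unity does not guarantee convergence in every instance, but for large networks the probability of failure becomes negligible.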
System Identification of multi-rotor UAVs using echo state networks
Controller design for aircraft with unusual configurations presents unique challenges, particularly in extracting valid mathematical models of the behaviour of multi-rotor UAVs (MRUAVs). System Identification is a collection of techniques for extracting an accurate mathematical model of a dynamic system from experimental input-output data. This can entail parameter identification only (known as grey-box modelling) or, more generally, full parameter/structural identification of the nonlinear mapping (known as black-box modelling). In this paper we propose a new method for black-box identification of the nonlinear dynamic model of a small MRUAV using Echo State Networks (ESNs), a novel approach to training Recurrent Neural Networks (RNNs).
Musical instrument mapping design with Echo State Networks
Echo State Networks (ESNs), a form of recurrent neural network developed in the field of Reservoir Computing, show significant potential for use as a tool in the design of mappings for digital musical instruments. They have, however, seldom been used in this area, so this paper explores their possible applications. This project contributes a new open source library, which was developed to allow ESNs to run in the Pure Data dataflow environment. Several use cases were explored, focusing on addressing current issues in mapping research. ESNs were found to work successfully in scenarios of pattern classification, multiparametric control, explorative mapping, and the design of nonlinearities and uncontrol. 'Un-trained' behaviours are proposed as augmentations to the conventional reservoir system that allow the player to introduce potentially interesting nonlinearities and uncontrol into the reservoir. Interactive-evolution-style controls are proposed as strategies to help design these behaviours, which are otherwise dependent on arbitrary values and coarse global controls. A study on sound classification showed that ESNs could reliably differentiate between two drum sounds, and also generalise to other similar inputs. Following evaluation of the use cases, heuristics are proposed to aid the use of ESNs in computer music scenarios.
The Asymptotic Performance of Linear Echo State Neural Networks
In this article, a study of the mean-square error (MSE) performance of linear
echo-state neural networks is performed, both for training and testing tasks.
Considering the realistic setting of noise present at the network nodes, we
derive deterministic equivalents for the aforementioned MSE in the limit where
the number of input data and network size both grow large. Specializing
then the network connectivity matrix to specific random settings, we further
obtain simple formulas that provide new insights into the performance of such
networks.
Reservoir Computing Approach to Robust Computation using Unreliable Nanoscale Networks
As we approach the physical limits of CMOS technology, advances in materials
science and nanotechnology are making available a variety of unconventional
computing substrates that can potentially replace top-down-designed
silicon-based computing devices. Inherent stochasticity in the fabrication
process and nanometer scale of these substrates inevitably lead to design
variations, defects, faults, and noise in the resulting devices. A key
challenge is how to harness such devices to perform robust computation. We
propose reservoir computing as a solution. In reservoir computing, computation
takes place by translating the dynamics of an excited medium, called a
reservoir, into a desired output. This approach eliminates the need for
external control and redundancy, and the programming is done using a
closed-form regression problem on the output, which also allows concurrent
programming using a single device. Using a theoretical model, we show that both
regular and irregular reservoirs are intrinsically robust to structural noise
as they perform computation.
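The "programming by a closed-form regression problem on the output" that the abstract describes is the standard reservoir-computing readout: the reservoir itself stays fixed, and only a linear map from states to outputs is solved for in closed form. A minimal sketch, with illustrative sizes, scalings, and a delayed-input task that are assumptions rather than the paper's model:

```python
import numpy as np

rng = np.random.default_rng(1)
n, T, delay = 100, 1000, 3
W = rng.standard_normal((n, n)) / np.sqrt(n)
W *= 0.9 / max(abs(np.linalg.eigvals(W)))   # fixed reservoir, spectral radius 0.9
w_in = 0.2 * rng.standard_normal(n)         # fixed random input weights

u = rng.uniform(-1, 1, T)                   # random input signal
X = np.zeros((T, n))
x = np.zeros(n)
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])        # excite the reservoir
    X[t] = x

# Task: reproduce the input delayed by `delay` steps.
X_fit, y = X[delay:], u[:T - delay]

# Closed-form ridge readout: w = (X^T X + lam I)^{-1} X^T y
lam = 1e-6
w_out = np.linalg.solve(X_fit.T @ X_fit + lam * np.eye(n), X_fit.T @ y)
mse = np.mean((X_fit @ w_out - y) ** 2)
print(mse)                                  # training MSE of the readout
```

Because only the readout is trained, several readouts can be fitted on the same collected states, which is what allows the concurrent programming of a single device that the abstract mentions.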
Short-term Memory of Deep RNN
The extension of deep learning towards temporal data processing is gaining an
increasing research interest. In this paper we investigate the properties of
state dynamics developed in successive levels of deep recurrent neural networks
(RNNs) in terms of short-term memory abilities. Our results reveal interesting
insights that shed light on the nature of layering as a factor of RNN design.
Noticeably, higher layers in a hierarchically organized RNN architecture turn out to be inherently biased towards longer memory spans even prior to training of the recurrent connections. Moreover, in the context of the Reservoir Computing framework, our analysis also points out the benefit of a layered recurrent organization as an efficient approach to improve the memory skills of reservoir models.
Comment: This is a pre-print (pre-review) version of the paper accepted for presentation at the 26th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN), Bruges (Belgium), 25-27 April 201
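A common way to quantify the short-term memory abilities the abstract refers to is Jaeger-style memory capacity: for each delay, fit a linear readout that reconstructs the delayed input and sum the squared correlations. The paper compares such abilities across layers of a deep RNN; this single-layer sketch, with assumed sizes and scalings, only illustrates the measure itself.

```python
import numpy as np

rng = np.random.default_rng(2)
n, T, washout = 100, 2000, 100
W = rng.standard_normal((n, n)) / np.sqrt(n)
W *= 0.9 / max(abs(np.linalg.eigvals(W)))   # spectral radius 0.9
w_in = 0.1 * rng.standard_normal(n)

u = rng.uniform(-1, 1, T)                   # i.i.d. input signal
X = np.zeros((T, n))
x = np.zeros(n)
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])
    X[t] = x
X = X[washout:]                             # drop the initial transient

mc = 0.0
for k in range(1, 31):                      # delays 1..30
    y = u[washout - k: T - k]               # input delayed by k steps
    w = np.linalg.lstsq(X, y, rcond=None)[0]
    r = np.corrcoef(X @ w, y)[0, 1]
    mc += r ** 2                            # squared correlation per delay
print(mc)                                   # total short-term memory capacity
```

Repeating this measurement on the states of successive layers is, in spirit, how layer-wise memory spans can be compared.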
- …