Empirical Analysis of the Necessary and Sufficient Conditions of the Echo State Property
The Echo State Network (ESN) is a specific type of recurrent network that has
gained popularity in recent years. The model has a recurrent part, called the
reservoir, that is fixed during the learning process. The reservoir is used to
project the input into a higher-dimensional space. A fundamental property that
impacts the model accuracy is the Echo State Property (ESP). There are two main
theoretical results related to the ESP. First, a sufficient condition for the
ESP to hold, which involves the singular values of the reservoir matrix.
Second, a necessary condition for the ESP: the ESP is violated depending on the
spectral radius of the reservoir matrix. There is a theoretical gap between
these necessary and sufficient conditions. This article presents an empirical
analysis of the accuracy and the projections of reservoirs that lie in this
theoretical gap, and it gives some insights about the generation of the
reservoir matrix. From previous works, it is already known that the optimal
accuracy is obtained near the edge of stability of the dynamics. According to
our empirical results, this border appears to be closer to the sufficient
condition than to the necessary condition of the ESP.
Comment: 23 pages, 14 figures, accepted paper for the IEEE IJCNN, 201
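For reference, the two conditions discussed above are commonly stated as follows: the ESP is guaranteed when the largest singular value of the reservoir matrix W satisfies sigma_max(W) < 1 (sufficient), while rho(W) < 1 for the spectral radius is necessary, since the ESP is violated for zero input when rho(W) >= 1. The sketch below is a minimal NumPy illustration, with illustrative names and parameters, that classifies a random reservoir with respect to these conditions; reservoirs in the theoretical gap satisfy rho(W) < 1 <= sigma_max(W).

```python
# Minimal sketch (NumPy): classify a random reservoir matrix W with respect to
# the classical ESP conditions. Assumes the standard formulation in which
# sigma_max(W) < 1 is sufficient for the ESP and rho(W) < 1 is necessary.
import numpy as np

rng = np.random.default_rng(0)

def esp_region(W):
    sigma_max = np.linalg.svd(W, compute_uv=False)[0]   # largest singular value
    rho = np.max(np.abs(np.linalg.eigvals(W)))          # spectral radius
    if sigma_max < 1.0:
        return sigma_max, rho, "sufficient condition satisfied (ESP guaranteed)"
    if rho >= 1.0:
        return sigma_max, rho, "necessary condition violated (ESP cannot hold)"
    return sigma_max, rho, "theoretical gap: rho(W) < 1 <= sigma_max(W)"

# Scale a dense random reservoir to a target spectral radius, a common heuristic.
W = rng.standard_normal((100, 100))
W *= 0.95 / np.max(np.abs(np.linalg.eigvals(W)))

sigma_max, rho, verdict = esp_region(W)
print(f"sigma_max = {sigma_max:.3f}, rho = {rho:.3f} -> {verdict}")
```

For a dense random matrix rescaled to spectral radius below one, the largest singular value typically still exceeds one, so such reservoirs tend to fall exactly in the gap studied here.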
Input-to-State Representation in linear reservoirs dynamics
Reservoir computing is a popular approach to designing recurrent neural
networks, owing to its training simplicity and approximation performance. The
recurrent part of these networks is not trained (e.g., via gradient descent),
making them appealing for analytical studies by a large community of
researchers with backgrounds spanning from dynamical systems to neuroscience.
However, even in the simple linear case, the working principle of these
networks is not fully understood and their design is usually driven by
heuristics. A novel analysis of the dynamics of such networks is proposed,
which allows the investigator to express the state evolution using the
controllability matrix. Such a matrix encodes salient characteristics of the
network dynamics; in particular, its rank represents an input-independent
measure of the memory capacity of the network. Using the proposed approach, it
is possible to compare different reservoir architectures and explain why a
cyclic topology achieves favourable results, as verified by practitioners.
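To make the controllability-matrix construction concrete, the following minimal sketch assumes a linear reservoir x(t+1) = W x(t) + w_in u(t) with scalar input, so the controllability matrix is K = [w_in, W w_in, ..., W^{N-1} w_in]. It compares the numerical rank of K for a cyclic (ring) topology and a dense random reservoir; all names and parameter choices are illustrative, not taken from the paper.

```python
# Minimal sketch (NumPy): controllability matrix of a linear reservoir
# x(t+1) = W x(t) + w_in u(t) with scalar input. Its rank is used here as an
# input-independent proxy for memory capacity; a cyclic (ring) topology is
# compared against a dense random reservoir with the same spectral radius.
import numpy as np

def controllability_matrix(W, w_in):
    N = W.shape[0]
    cols, v = [], w_in.copy()
    for _ in range(N):            # K = [w_in, W w_in, ..., W^{N-1} w_in]
        cols.append(v)
        v = W @ v
    return np.column_stack(cols)

rng = np.random.default_rng(0)
N = 50
w_in = rng.standard_normal(N)

# Cyclic (ring) topology: each unit feeds the next with a fixed weight.
W_cycle = 0.9 * np.roll(np.eye(N), 1, axis=0)

# Dense random reservoir rescaled to the same spectral radius.
W_rand = rng.standard_normal((N, N))
W_rand *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_rand)))

for name, W in [("cyclic", W_cycle), ("random", W_rand)]:
    K = controllability_matrix(W, w_in)
    print(name, "rank of controllability matrix:", np.linalg.matrix_rank(K))
```

In this toy setup the cyclic reservoir generically attains full numerical rank, while the dense random one tends to lose rank numerically, which is consistent with the favourable behaviour of cyclic topologies noted above.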
Learn to Synchronize, Synchronize to Learn
In recent years, the machine learning community has seen a continuously growing
interest in research aimed at investigating dynamical aspects of both training
procedures and trained recurrent models. Of particular interest among recurrent
neural networks is the Reservoir Computing (RC) paradigm, characterized by
conceptual simplicity and a fast training scheme. Yet, the
guiding principles under which RC operates are only partially understood. In
this work, we study the properties behind learning dynamical systems and
propose a new guiding principle based on Generalized Synchronization (GS) that
guarantees the ability to learn a generic task with RC architectures. We show that the
well-known Echo State Property (ESP) implies, and is implied by, GS, so that
theoretical results derived from the ESP still hold when GS holds. However, by
using GS one can profitably study the RC learning procedure by linking the
reservoir dynamics with the readout training. Notably, this allows us to shed
light on the interplay between the input encoding performed by the reservoir
and the output produced by the readout optimized for the task at hand. In
addition, we show that, unlike the ESP, satisfaction of GS can be measured by
means of the Mutual False Nearest Neighbors (MFNN) index, which makes the
theoretical derivations directly usable by practitioners.
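The MFNN index itself is too involved for a short snippet, but the idea it probes, namely that reservoir states driven by the same input converge regardless of initial conditions (the echo-state / synchronization picture), can be checked empirically. The sketch below is an illustrative proxy with assumed parameters, not the procedure used in the paper.

```python
# Minimal sketch (NumPy): an empirical proxy for GS / the echo state property.
# Two reservoir copies with different initial states are driven by the same
# input; convergence of their trajectories indicates that the reservoir state
# is a function of the input history alone. This is an illustrative check, not
# the Mutual False Nearest Neighbors index used in the paper.
import numpy as np

rng = np.random.default_rng(0)
N, T = 100, 500

W = rng.standard_normal((N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))    # spectral radius 0.9
w_in = rng.standard_normal(N)
u = rng.standard_normal(T)                          # common input sequence

def run(x0):
    x, states = x0, []
    for t in range(T):
        x = np.tanh(W @ x + w_in * u[t])            # standard (non-leaky) ESN update
        states.append(x)
    return np.array(states)

xa = run(rng.standard_normal(N))
xb = run(rng.standard_normal(N))
dist = np.linalg.norm(xa - xb, axis=1)
print("initial distance:", dist[0], "final distance:", dist[-1])
# A rapid decay of dist toward zero suggests the echo state property / GS holds.
```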