Edge of stability echo state networks
Echo State Networks (ESNs) are time-series processing models working under
the Echo State Property (ESP) principle. The ESP is a notion of stability that
imposes an asymptotic fading of the memory of the input. On the other hand, the
resulting inherent architectural bias of ESNs may lead to an excessive loss of
information, which in turn harms the performance in certain tasks with long
short-term memory requirements. With the goal of bringing together the fading
memory property and the ability to retain as much memory as possible, in this
paper we introduce a new ESN architecture, called the Edge of Stability Echo
State Network (ES²N). The introduced ES²N model is based on defining the
reservoir layer as a convex combination of a nonlinear reservoir (as in the
standard ESN), and a linear reservoir that implements an orthogonal
transformation. We provide a thorough mathematical analysis of the introduced
model, proving that the whole eigenspectrum of the Jacobian of the ES²N map
can be contained in an annular neighbourhood of a complex circle of
controllable radius, and exploit this property to demonstrate that the
ES²N's forward dynamics evolves close to the edge-of-chaos regime by design.
Remarkably, our experimental analysis shows that the newly introduced reservoir
model is able to reach the theoretical maximum short-term memory capacity. At
the same time, in comparison to the standard ESN, ES²N is shown to offer an
excellent trade-off between memory and nonlinearity, as well as a significant
improvement of performance in autoregressive nonlinear modeling.
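A minimal NumPy sketch of this convex-combination reservoir (a sketch only: the reservoir size, spectral-radius rescaling, and mixing coefficient beta below are illustrative assumptions, not the paper's exact formulation):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 100, 1        # reservoir size and input dimension (illustrative)
beta = 0.5           # convex-combination coefficient in [0, 1] (assumed)

# Linear branch: a random orthogonal matrix (Q factor of a Gaussian matrix).
O, _ = np.linalg.qr(rng.standard_normal((n, n)))

# Nonlinear branch: a standard random reservoir plus input weights.
W = rng.uniform(-1, 1, (n, n))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))   # rescale spectral radius
V = rng.uniform(-1, 1, (n, m))

def step(h, u):
    """One update: convex combination of the orthogonal (linear) map
    and the usual tanh reservoir."""
    return (1 - beta) * O @ h + beta * np.tanh(W @ h + V @ u)

h = np.zeros(n)
for u in rng.standard_normal((200, m)):     # drive with random input
    h = step(h, u)
```

With beta = 1 this reduces to a standard ESN, while beta near 0 approaches a purely orthogonal, norm-preserving linear map; the interpolation between the two is what places the dynamics near the edge of stability.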
Neuroevolution on the Edge of Chaos
Echo state networks represent a special type of recurrent neural networks.
Recent papers have stated that echo state networks maximize their computational
performance at the transition between order and chaos, the so-called edge of
chaos. This work confirms this statement in a comprehensive set of experiments.
Furthermore, echo state networks are compared to networks evolved via
neuroevolution. The evolved networks outperform the echo state networks;
however, the evolution consumes significant computational resources. It is
demonstrated that echo state networks with local connections combine the best
of both worlds, the simplicity of random echo state networks and the
performance of evolved networks. Finally, it is shown that evolution tends to
stay close to the ordered side of the edge of chaos.
Comment: To appear in Proceedings of the Genetic and Evolutionary Computation
Conference 2017 (GECCO '17).
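A rough sketch of what such a locally connected reservoir could look like (the ring topology and one-sided neighbourhood width k are assumptions made for illustration; the paper's exact connectivity scheme may differ):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 100, 3   # reservoir size and one-sided neighbourhood width (assumed)

# Ring topology: neuron i receives input only from neurons within distance k.
W = np.zeros((n, n))
for i in range(n):
    for j in range(i - k, i + k + 1):
        W[i, j % n] = rng.uniform(-1, 1)

W *= 0.95 / max(abs(np.linalg.eigvals(W)))  # tune toward the edge of chaos
```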
A characterization of the Edge of Criticality in Binary Echo State Networks
Echo State Networks (ESNs) are simplified recurrent neural network models
composed of a reservoir and a linear, trainable readout layer. The reservoir is
tunable by some hyper-parameters that control the network behaviour. ESNs are
known to be effective in solving tasks when configured on a region in
(hyper-)parameter space called the Edge of Criticality (EoC), where the
system is maximally sensitive to perturbations, which strongly affect its behaviour.
In this paper, we propose binary ESNs, which are architecturally equivalent to
standard ESNs but use binary activation functions and binary recurrent
weights. For these networks, we derive a closed-form expression for the EoC in
the autonomous case and perform simulations in order to assess their behavior
in the case of noisy neurons and in the presence of a signal. We propose a
theoretical explanation for the fact that the variance of the input plays a
major role in characterizing the EoC.
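A minimal sketch of a binary reservoir in this spirit (the weight scaling, noise model, and scalar input coupling are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200   # number of binary neurons (illustrative)

# Binary recurrent weights in {-1, +1}, scaled for a well-behaved drive.
W = rng.choice([-1.0, 1.0], size=(n, n)) / np.sqrt(n)

def step(x, u=0.0, noise=0.0):
    """One update of a binary reservoir: threshold the recurrent drive
    plus a scalar input and optional neuron noise."""
    pre = W @ x + u + noise * rng.standard_normal(n)
    return np.where(pre >= 0, 1.0, -1.0)   # binary activation

x = rng.choice([-1.0, 1.0], size=n)        # random binary initial state
for u in 0.5 * rng.standard_normal(100):   # drive with a random signal
    x = step(x, u, noise=0.1)
```

The input scale (0.5 here) is exactly the kind of knob whose variance, per the abstract, plays a major role in locating the EoC.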
Echo State Networks with Self-Normalizing Activations on the Hyper-Sphere
Among the various architectures of Recurrent Neural Networks, Echo State
Networks (ESNs) emerged due to their simplified and inexpensive training
procedure. These networks are known to be sensitive to the setting of
hyper-parameters, which critically affect their behaviour. Results show that
their performance is usually maximized in a narrow region of hyper-parameter
space called edge of chaos. Finding such a region requires searching in
hyper-parameter space in a sensible way: hyper-parameter configurations
marginally outside such a region might yield networks exhibiting fully
developed chaos, hence producing unreliable computations. The performance gain
due to optimizing hyper-parameters can be studied by considering the
memory--nonlinearity trade-off, i.e., the fact that increasing the nonlinear
behavior of the network degrades its ability to remember past inputs, and
vice-versa. In this paper, we propose a model of ESNs that eliminates critical
dependence on hyper-parameters, resulting in networks that provably cannot
enter a chaotic regime and that, at the same time, exhibit nonlinear behaviour
in phase space characterised by a large memory of past inputs, comparable to
that of linear networks. Our contribution is supported by experiments
corroborating our theoretical findings, showing that the proposed model
displays dynamics that are rich enough to approximate many common nonlinear
systems used for benchmarking.
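A minimal sketch of the underlying idea, assuming a plain tanh reservoir whose state is renormalized onto the unit hyper-sphere after each update so that its norm can never grow (the paper's actual self-normalizing activation may differ in detail):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 100, 1   # reservoir and input sizes (illustrative)

W = rng.standard_normal((n, n)) / np.sqrt(n)
V = rng.standard_normal((n, m))

def step(x, u):
    """Update the reservoir, then project the state back onto the
    unit hyper-sphere: the state norm is fixed, so trajectories
    cannot blow up into fully developed chaos."""
    x = np.tanh(W @ x + V @ u)
    return x / np.linalg.norm(x)

x = rng.standard_normal(n)
x /= np.linalg.norm(x)
for u in rng.standard_normal((200, m)):
    x = step(x, u)
```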
Echo State Condition at the Critical Point
Recurrent networks with transfer functions that are Lipschitz continuous with
constant K=1 may be echo state networks if certain limitations on the
recurrent connectivity are applied. It has been shown that it is sufficient if
the largest singular value of the recurrent connectivity is smaller than 1. The
main achievement of this paper is a proof of the conditions under which the
network is an echo state network even if the largest singular value is one. It turns out
that in this critical case the exact shape of the transfer function plays a
decisive role in determining whether the network still fulfills the echo state
condition. In addition, several examples with one neuron networks are outlined
to illustrate the effects of critical connectivity. Moreover, within the
manuscript a mathematical definition for a critical echo state network is
suggested.
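The standard sufficient (sub-critical) condition mentioned above is easy to check numerically; a minimal sketch, assuming a 1-Lipschitz transfer function such as tanh (the matrix size and scaling are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((100, 100)) / 25.0   # candidate recurrent weights

# Sufficient echo state condition for 1-Lipschitz activations (e.g. tanh):
# the largest singular value of the recurrent matrix is strictly below 1.
sigma_max = np.linalg.svd(W, compute_uv=False)[0]
print(f"largest singular value: {sigma_max:.3f}")
if sigma_max < 1:
    print("sufficient echo state condition holds")
else:
    print("critical case: the exact shape of the transfer function decides")
```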