Echo State Networks with Self-Normalizing Activations on the Hyper-Sphere
Among the various architectures of Recurrent Neural Networks, Echo State
Networks (ESNs) emerged due to their simplified and inexpensive training
procedure. These networks are known to be sensitive to the setting of
hyper-parameters, which critically affect their behaviour. Results show that
their performance is usually maximized in a narrow region of hyper-parameter
space called the edge of chaos. Finding such a region requires searching in
hyper-parameter space in a sensible way: hyper-parameter configurations
marginally outside such a region might yield networks exhibiting fully
developed chaos, hence producing unreliable computations. The performance gain
due to optimizing hyper-parameters can be studied by considering the
memory--nonlinearity trade-off, i.e., the fact that increasing the nonlinear
behavior of the network degrades its ability to remember past inputs, and
vice versa. In this paper, we propose a model of ESNs that eliminates critical
dependence on hyper-parameters, resulting in networks that provably cannot
enter a chaotic regime and, at the same time, exhibit nonlinear behaviour in
phase space characterised by a large memory of past inputs, comparable to that
of linear networks. Our contribution is supported by experiments
corroborating our theoretical findings, showing that the proposed model
displays dynamics rich enough to approximate many common nonlinear
systems used for benchmarking.
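The "simplified and inexpensive training" mentioned above refers to the standard ESN recipe: the recurrent reservoir is random and fixed, and only a linear readout is trained. The following is a minimal toy sketch of that standard recipe (not the paper's self-normalizing hyper-spherical variant); the task, sizes, and scaling factors are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_res, n_in = 100, 1

# Random, fixed input and reservoir weights. The reservoir's spectral
# radius is scaled below 1 -- the usual hyper-parameter whose tuning the
# paper aims to make unnecessary.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

def run_reservoir(inputs):
    """Drive the fixed reservoir and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u))
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
u = np.sin(0.1 * np.arange(500))
X = run_reservoir(u[:-1])
y = u[1:]

# Discard a washout period so the states have forgotten the zero init.
X, y = X[100:], y[100:]

# Only the linear readout is trained, via ridge regression -- this is
# the inexpensive training step that distinguishes ESNs from fully
# trained recurrent networks.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
pred = X @ W_out
print("training RMSE:", np.sqrt(np.mean((pred - y) ** 2)))
```

Because only `W_out` is learned, training reduces to one linear solve; the sensitivity the abstract discusses comes from fixed quantities such as the spectral radius of `W`.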
Complex Neuro-Cognitive Systems
Cognitive functions such as perception, thinking and acting are based on the working of the brain, one of the most complex systems we know. The traditional scientific methodology, however, has proved insufficient to understand the relation between brain and cognition. The aim of this paper is to review an alternative methodology – nonlinear dynamical analysis – and to demonstrate its benefit for cognitive neuroscience in cases when the usual reductionist method fails.
Forecasting high waters at Venice Lagoon using chaotic time series analysis and nonlinear neural networks
Time series analysis using nonlinear dynamical systems theory and multilayer neural network models has been applied to the time sequence of water level data recorded every hour at 'Punta della Salute' in the Venice Lagoon during the years 1980-1994. The first method is based on the reconstruction of the state space attractor using time delay embedding vectors and on the characterisation of invariant properties which define its dynamics. The results suggest the existence of a low dimensional chaotic attractor with a Lyapunov dimension, DL, of around 6.6 and a predictability between 8 and 13 hours ahead. Furthermore, once the attractor has been reconstructed it is possible to make predictions by mapping local-neighbourhood to local-neighbourhood in the reconstructed phase space. To compare the prediction results with another nonlinear method, two nonlinear autoregressive models (NAR) based on multilayer feedforward neural networks have been developed. From the study, it can be observed that nonlinear forecasting produces adequate results for the 'normal' dynamic behaviour of the water level of Venice Lagoon, outperforming linear algorithms; however, both methods fail to forecast the 'high water' phenomenon more than 2-3 hours ahead.
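The first method in this abstract, time delay embedding followed by local-neighbourhood prediction, can be sketched in a few lines. This is a toy illustration on a synthetic quasi-periodic signal standing in for the hourly tide-gauge record; the embedding dimension, delay, and nearest-neighbour predictor are illustrative assumptions, not the paper's actual settings or data.

```python
import numpy as np

def delay_embed(series, dim, tau):
    """Map a scalar series to dim-dimensional delay vectors
    (s[i], s[i+tau], ..., s[i+(dim-1)*tau])."""
    n = len(series) - (dim - 1) * tau
    return np.array([series[i : i + dim * tau : tau] for i in range(n)])

# Synthetic stand-in for the water-level series: two incommensurate sines.
t = np.arange(2000)
s = np.sin(0.3 * t) + 0.5 * np.sin(0.13 * t)

dim, tau = 4, 2
train = s[:-1]                      # hold out the last sample as the target
emb = delay_embed(train, dim, tau)

# Each embedded state's "successor" is the next scalar value of the series.
hist = emb[:-1]
succ = np.array([train[i + (dim - 1) * tau + 1] for i in range(len(hist))])

# Local-neighbourhood prediction: find the nearest past state in the
# reconstructed phase space and follow its successor forward.
query = emb[-1]
nearest = np.argmin(np.linalg.norm(hist - query, axis=1))
pred = succ[nearest]

print("predicted:", pred, " actual:", s[-1])
```

A practical version would average over several neighbours (or fit a local linear map) rather than using a single nearest neighbour, and would choose `dim` and `tau` from the data, e.g. via false nearest neighbours and mutual information.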
- …