Integer Echo State Networks: Hyperdimensional Reservoir Computing
We propose an approximation of Echo State Networks (ESN) that can be
efficiently implemented on digital hardware based on the mathematics of
hyperdimensional computing. The reservoir of the proposed Integer Echo State
Network (intESN) is a vector containing only n-bit integers (where n < 8 is
normally sufficient for satisfactory performance). The recurrent matrix
multiplication is replaced with an efficient cyclic shift operation. The intESN
architecture is verified on typical reservoir computing tasks: memorizing a
sequence of inputs, classifying time series, and learning dynamic processes.
Such an architecture results in dramatic improvements in memory footprint and
computational efficiency, with minimal performance loss. Comment: 10 pages, 10 figures, 1 table
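A minimal sketch of the reservoir update described in this abstract, assuming a bipolar random encoding of discrete input symbols and a saturating (clipping) integer nonlinearity; the dimensionality N, threshold kappa, and all names are illustrative rather than taken from the authors' code:

```python
import numpy as np

N = 1000      # reservoir dimensionality (illustrative)
kappa = 7     # clipping threshold; reservoir values stay in [-kappa, kappa]
rng = np.random.default_rng(0)

# Item memory: each discrete input symbol gets a random bipolar (+1/-1) code vector.
item_memory = {s: rng.choice([-1, 1], size=N) for s in range(10)}

def update(state, symbol):
    """One intESN-style step: a cyclic shift replaces the recurrent matrix
    multiplication, the encoded input is added, and the result is clipped
    so the reservoir remains a vector of small integers."""
    recurrent = np.roll(state, 1)                  # cyclic shift of the integer reservoir
    new_state = recurrent + item_memory[symbol]    # add bipolar input encoding
    return np.clip(new_state, -kappa, kappa).astype(np.int8)

state = np.zeros(N, dtype=np.int8)
for s in [3, 1, 4, 1, 5]:
    state = update(state, s)
```

With this update, the full reservoir state fits in N small integers and the per-step cost is a shift, an add, and a clip, which is where the memory and compute savings claimed above come from.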
Bayesian Recurrent Neural Network Models for Forecasting and Quantifying Uncertainty in Spatial-Temporal Data
Recurrent neural networks (RNNs) are nonlinear dynamical models commonly used
in the machine learning and dynamical systems literature to represent complex
dynamical or sequential relationships between variables. More recently, as deep
learning models have become more common, RNNs have been used to forecast
increasingly complicated systems. Dynamical spatio-temporal processes represent
a class of complex systems that can potentially benefit from these types of
models. Although the RNN literature is expansive and highly developed,
uncertainty quantification is often ignored. Even when considered, the
uncertainty is generally quantified without the use of a rigorous framework,
such as a fully Bayesian setting. Here we attempt to quantify uncertainty in a
more formal framework while maintaining the forecast accuracy that makes these
models appealing, by presenting a Bayesian RNN model for nonlinear
spatio-temporal forecasting. Additionally, we make simple modifications to the
basic RNN to help accommodate the unique nature of nonlinear spatio-temporal
data. The proposed model is applied to a Lorenz simulation and two real-world
nonlinear spatio-temporal forecasting applications.
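One way to read the uncertainty-quantification idea is that the RNN weights are treated as random variables and a forecast ensemble is produced by sampling weights and running the recurrence forward. The sketch below fakes the posterior with Gaussian perturbations of point estimates purely for illustration (the paper instead infers the weights in a fully Bayesian setting); all dimensions and names are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)
n_h, n_x = 16, 3                                  # hidden and input dimensions (arbitrary)
W_h = rng.normal(scale=0.3, size=(n_h, n_h))      # stand-in for posterior-mean recurrent weights
W_x = rng.normal(scale=0.3, size=(n_h, n_x))      # input weights
W_o = rng.normal(scale=0.3, size=(n_x, n_h))      # readout weights

def rollout(x0, steps, noise_scale=0.05):
    """Run the RNN forward from x0 with one sampled weight perturbation."""
    dWh = rng.normal(scale=noise_scale, size=W_h.shape)   # crude stand-in for a posterior draw
    h, x, path = np.zeros(n_h), x0, []
    for _ in range(steps):
        h = np.tanh((W_h + dWh) @ h + W_x @ x)
        x = W_o @ h
        path.append(x)
    return np.array(path)

# Ensemble of forecasts; the spread across members quantifies forecast uncertainty.
ensemble = np.stack([rollout(np.ones(n_x), steps=20) for _ in range(100)])
forecast_mean, forecast_std = ensemble.mean(axis=0), ensemble.std(axis=0)
```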
Relative entropy minimizing noisy non-linear neural network to approximate stochastic processes
A method is provided for designing and training noise-driven recurrent neural
networks as models of stochastic processes. The method unifies and generalizes
two known separate modeling approaches, Echo State Networks (ESN) and Linear
Inverse Modeling (LIM), under the common principle of relative entropy
minimization. The power of the new method is demonstrated on a stochastic
approximation of the El Niño phenomenon studied in climate research.
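As a reference point for the Linear Inverse Modeling side of this unification, the sketch below fits the linear special case: a noise-driven model x_{t+1} = G x_t + noise whose propagator and noise covariance are estimated from lag-covariance statistics. The toy data, variable names, and dimensions are illustrative, and the paper's nonlinear, relative-entropy-based training is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(2)
T, d = 5000, 4

# Toy stationary data from a stable AR(1) process (stand-in for climate anomaly indices).
A_true = 0.9 * np.eye(d) + 0.02 * rng.normal(size=(d, d))
X = np.zeros((T, d))
for t in range(1, T):
    X[t] = A_true @ X[t - 1] + rng.normal(size=d)

C0 = (X[:-1].T @ X[:-1]) / (T - 1)     # zero-lag covariance
C1 = (X[1:].T @ X[:-1]) / (T - 1)      # one-step lag covariance
G = C1 @ np.linalg.inv(C0)             # estimated one-step propagator
Q = C0 - G @ C0 @ G.T                  # implied covariance of the driving noise

def simulate(x0, steps):
    """Generate a stochastic trajectory from the fitted noise-driven linear model."""
    noise_chol = np.linalg.cholesky(Q + 1e-9 * np.eye(d))
    x, path = x0, []
    for _ in range(steps):
        x = G @ x + noise_chol @ rng.normal(size=d)
        path.append(x)
    return np.array(path)
```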
Structured Sequence Modeling with Graph Convolutional Recurrent Networks
This paper introduces Graph Convolutional Recurrent Network (GCRN), a deep
learning model able to predict structured sequences of data. More precisely, GCRN is
a generalization of classical recurrent neural networks (RNN) to data
structured by an arbitrary graph. Such structured sequences can represent
series of frames in videos, spatio-temporal measurements on a network of
sensors, or random walks on a vocabulary graph for natural language modeling.
The proposed model combines convolutional neural networks (CNN) on graphs to
identify spatial structures and RNN to find dynamic patterns. We study two
possible architectures of GCRN, and apply the models to two practical problems:
predicting moving MNIST data, and modeling natural language with the Penn
Treebank dataset. Experiments show that simultaneously exploiting graph spatial
and dynamic information about the data can improve both precision and learning
speed.
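The core idea, a recurrent cell whose dense matrix multiplications are replaced by graph convolutions so each node's state is updated from its neighbours, can be sketched as below. This uses a simple one-hop normalized-adjacency filter inside GRU-style gates, whereas the paper uses Chebyshev-polynomial graph filters and LSTM variants; the graph, dimensions, and names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
n_nodes, f_in, f_h = 5, 2, 8

# Toy graph: a 5-cycle with self-loops, symmetrically normalized.
A = np.array([[0, 1, 0, 0, 1],
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [1, 0, 0, 1, 0]], dtype=float) + np.eye(n_nodes)
D_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
A_hat = D_inv_sqrt @ A @ D_inv_sqrt

def gconv(X, W):
    """One-hop graph convolution: aggregate neighbour features, then mix channels."""
    return A_hat @ X @ W

# GRU-style gates whose linear transforms are graph convolutions.
Wz, Uz = rng.normal(scale=0.1, size=(f_in, f_h)), rng.normal(scale=0.1, size=(f_h, f_h))
Wr, Ur = rng.normal(scale=0.1, size=(f_in, f_h)), rng.normal(scale=0.1, size=(f_h, f_h))
Wh, Uh = rng.normal(scale=0.1, size=(f_in, f_h)), rng.normal(scale=0.1, size=(f_h, f_h))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gcrn_step(X_t, H_prev):
    """One recurrent step over all nodes at once (X_t: [n_nodes, f_in])."""
    z = sigmoid(gconv(X_t, Wz) + gconv(H_prev, Uz))            # update gate
    r = sigmoid(gconv(X_t, Wr) + gconv(H_prev, Ur))            # reset gate
    h_tilde = np.tanh(gconv(X_t, Wh) + gconv(r * H_prev, Uh))  # candidate state
    return (1 - z) * H_prev + z * h_tilde

H = np.zeros((n_nodes, f_h))
for t in range(10):
    H = gcrn_step(rng.normal(size=(n_nodes, f_in)), H)
```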