Toward Abstraction from Multi-modal Data: Empirical Studies on Multiple Time-scale Recurrent Models
Abstraction tasks are challenging for multi-modal sequences, as they require a deeper semantic understanding of the data and novel text generation from it. Although recurrent neural networks (RNNs) can be used to model the context of time sequences, the long-term dependencies of multi-modal data often cause the gradients of back-propagation through time to vanish. Recently, inspired by the Multiple Time-scale Recurrent Neural Network (MTRNN), an extension of the Gated Recurrent Unit (GRU) called the Multiple Time-scale Gated Recurrent Unit (MTGRU) was proposed to learn long-term dependencies in natural language processing. In particular, it can also accomplish the abstraction task for paragraphs, provided that the time constants are well defined. In this paper, we compare MTRNN and MTGRU in terms of their learning performance as well as their abstraction representations at a higher level (with slower neural activation). This was done in two studies: one on a smaller dataset (two-dimensional time sequences from non-linear functions) and one on a relatively large dataset (43-dimensional time sequences from iCub manipulation tasks with multi-modal data). We conclude that gated recurrent mechanisms may be necessary for learning long-term dependencies in high-dimensional multi-modal datasets (e.g. learning of robot manipulation), even when natural language commands are not involved. For smaller learning tasks with simple time sequences, however, generic recurrent models such as MTRNN were sufficient to accomplish the abstraction task.
Comment: Accepted by IJCNN 201
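
As a rough illustration of the mechanism this abstract describes, the sketch below wraps a standard GRU update in a leaky integrator controlled by a time constant tau, following the common MTGRU formulation in which larger tau yields slower neural activation (tau = 1 recovers a plain GRU). The class name, initialization, and the two-layer toy usage are illustrative assumptions, not the authors' code.

    # Minimal sketch (assumed formulation, not the paper's implementation):
    # a GRU cell whose hidden-state update is slowed by a time constant tau.
    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    class MTGRUCell:
        """GRU cell with a leaky-integrator time constant tau >= 1."""

        def __init__(self, input_size, hidden_size, tau=1.0, seed=0):
            rng = np.random.default_rng(seed)
            scale = 1.0 / np.sqrt(hidden_size)
            # Stacked weights for update (z), reset (r), and candidate gates.
            self.W = rng.uniform(-scale, scale, (3, hidden_size, input_size))
            self.U = rng.uniform(-scale, scale, (3, hidden_size, hidden_size))
            self.b = np.zeros((3, hidden_size))
            self.tau = tau

        def step(self, x, h):
            z = sigmoid(self.W[0] @ x + self.U[0] @ h + self.b[0])  # update gate
            r = sigmoid(self.W[1] @ x + self.U[1] @ h + self.b[1])  # reset gate
            h_cand = np.tanh(self.W[2] @ x + self.U[2] @ (r * h) + self.b[2])
            h_gru = (1.0 - z) * h + z * h_cand  # standard GRU update
            # Leaky integration: only a 1/tau fraction of the update is applied,
            # so a larger tau gives a slower-changing (more abstract) layer.
            return (1.0 - 1.0 / self.tau) * h + (1.0 / self.tau) * h_gru

    # Toy usage: a fast lower layer (tau=1) feeding a slow higher layer (tau=8),
    # driven by a 2-D non-linear sequence as in the paper's smaller study.
    fast = MTGRUCell(input_size=2, hidden_size=16, tau=1.0)
    slow = MTGRUCell(input_size=16, hidden_size=16, tau=8.0)
    h_fast, h_slow = np.zeros(16), np.zeros(16)
    for t in np.linspace(0.0, 1.0, 50):
        x = np.array([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
        h_fast = fast.step(x, h_fast)
        h_slow = slow.step(h_fast, h_slow)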
Evolino for recurrent support vector machines
Traditional Support Vector Machines (SVMs) need pre-wired finite time windows to predict and classify time series. They lack the internal state necessary to deal with sequences involving arbitrarily long-term dependencies. Here we introduce a new class of recurrent, truly sequential SVM-like devices with internal adaptive states, trained by a novel method called EVOlution of systems with KErnel-based outputs (Evoke), an instance of the recent Evolino class of methods. Evoke evolves recurrent neural networks to detect and represent temporal dependencies while using quadratic programming/support vector regression to produce precise outputs. Evoke is the first SVM-based mechanism that learns to classify a context-sensitive language. It also outperforms recent state-of-the-art gradient-based recurrent neural networks (RNNs) on various time series prediction tasks.
Comment: 10 pages, 2 figures
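
The sketch below illustrates the general Evolino/Evoke recipe under stated assumptions: evolve the recurrent weights of a network while a support vector regressor maps its hidden states to outputs, scoring each candidate by readout error. A plain tanh RNN and simple truncated-selection mutation stand in here for the LSTM networks and Enforced SubPopulations evolution of the actual method, and the toy task and hyperparameters are invented for illustration.

    # Illustrative sketch, not the authors' implementation: evolution of
    # recurrent weights with an SVR readout producing the outputs.
    import numpy as np
    from sklearn.svm import SVR

    def rnn_hidden_states(Wx, Wh, inputs):
        """Run a plain tanh RNN and return the hidden state at every step."""
        h = np.zeros(Wh.shape[0])
        states = []
        for x in inputs:
            h = np.tanh(Wx @ x + Wh @ h)
            states.append(h.copy())
        return np.array(states)

    def fitness(Wx, Wh, inputs, targets):
        """Fit an SVR readout on the hidden states; fitness is negative MSE.
        (A sketch-level shortcut: scored on the training data itself.)"""
        H = rnn_hidden_states(Wx, Wh, inputs)
        readout = SVR(kernel="rbf", C=10.0).fit(H, targets)
        pred = readout.predict(H)
        return -np.mean((pred - targets) ** 2)

    # Toy next-step prediction task on a noisy sine wave (assumed example).
    rng = np.random.default_rng(0)
    t = np.linspace(0, 4 * np.pi, 200)
    series = np.sin(t) + 0.05 * rng.standard_normal(t.size)
    inputs, targets = series[:-1].reshape(-1, 1), series[1:]

    n_in, n_hid, pop_size, generations = 1, 10, 20, 30
    population = [(0.5 * rng.standard_normal((n_hid, n_in)),
                   0.5 * rng.standard_normal((n_hid, n_hid)))
                  for _ in range(pop_size)]

    for gen in range(generations):
        scored = sorted(population,
                        key=lambda w: fitness(w[0], w[1], inputs, targets),
                        reverse=True)
        elite = scored[: pop_size // 4]        # keep the best quarter
        population = list(elite)
        while len(population) < pop_size:      # mutate elites to refill
            Wx, Wh = elite[rng.integers(len(elite))]
            population.append((Wx + 0.1 * rng.standard_normal(Wx.shape),
                               Wh + 0.1 * rng.standard_normal(Wh.shape)))

Only the recurrent weights are evolved; the kernel-based readout is re-fit for each candidate, which is what lets the evolutionary search focus purely on representing temporal structure.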