Structured Sequence Modeling with Graph Convolutional Recurrent Networks
This paper introduces Graph Convolutional Recurrent Network (GCRN), a deep
learning model able to predict structured sequences of data. More precisely, GCRN
is a generalization of classical recurrent neural networks (RNN) to data
structured by an arbitrary graph. Such structured sequences can represent
series of frames in videos, spatio-temporal measurements on a network of
sensors, or random walks on a vocabulary graph for natural language modeling.
The proposed model combines convolutional neural networks (CNN) on graphs to
identify spatial structures and RNN to find dynamic patterns. We study two
possible architectures of GCRN, and apply the models to two practical problems:
predicting moving MNIST data, and modeling natural language with the Penn
Treebank dataset. Experiments show that simultaneously exploiting the graph's
spatial structure and the data's dynamics can improve both precision and
learning speed.
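To make the idea concrete, here is a minimal NumPy sketch of a GRU-style recurrent cell in which every dense matrix product is replaced by a one-hop graph convolution, so spatial structure (via the graph) and temporal dynamics (via the recurrence) are captured jointly. This is an illustrative toy, not the authors' exact GCRN architecture; all function and variable names are our own, and the real model uses richer spectral graph filters.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def normalized_adjacency(A):
    # Symmetrically normalized adjacency with self-loops:
    # A_hat = D^{-1/2} (A + I) D^{-1/2}
    A_tilde = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_tilde.sum(axis=1))
    return A_tilde * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def graph_conv(X, A_hat, W):
    # One-hop graph convolution: mix each node's features with its
    # neighbors' (via A_hat), then apply a learned linear map W.
    return A_hat @ X @ W

def gcrn_step(X_t, H_prev, A_hat, params):
    # One recurrent step: a GRU cell whose input/recurrent matrix
    # products are replaced by graph convolutions (illustrative only).
    Wz, Uz, Wr, Ur, Wh, Uh = params
    z = sigmoid(graph_conv(X_t, A_hat, Wz) + graph_conv(H_prev, A_hat, Uz))
    r = sigmoid(graph_conv(X_t, A_hat, Wr) + graph_conv(H_prev, A_hat, Ur))
    h_new = np.tanh(graph_conv(X_t, A_hat, Wh) + graph_conv(r * H_prev, A_hat, Uh))
    return (1.0 - z) * H_prev + z * h_new

# Demo: 5 sensor nodes on a ring graph, 3 input features, 4 hidden units.
rng = np.random.default_rng(0)
N, F, H = 5, 3, 4
A = np.zeros((N, N))
for i in range(N):
    A[i, (i + 1) % N] = A[(i + 1) % N, i] = 1.0
A_hat = normalized_adjacency(A)
shapes = [(F, H), (H, H), (F, H), (H, H), (F, H), (H, H)]
params = tuple(0.1 * rng.standard_normal(s) for s in shapes)
X_t = rng.standard_normal((N, F))
H_prev = np.zeros((N, H))
H_next = gcrn_step(X_t, H_prev, A_hat, params)  # hidden state per node
```

Stacking such steps over time yields a predictor for sequences of graph signals, e.g. readings on a sensor network.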
Convolutional Neural Networks Via Node-Varying Graph Filters
Convolutional neural networks (CNNs) are being applied to an increasing
number of problems and fields due to their superior performance in
classification and regression tasks. Since two of the key operations that CNNs
implement are convolution and pooling, these networks are implicitly designed
to act on data described by regular structures such as images.
Motivated by the recent interest in processing signals defined in irregular
domains, we advocate a CNN architecture that operates on signals supported on
graphs. The proposed design replaces the classical convolution not with a
node-invariant graph filter (GF), which is the natural generalization of
convolution to graph domains, but with a node-varying GF. This filter extracts
different local features without increasing the output dimension of each layer
and, as a result, bypasses the need for a pooling stage while involving only
local operations. A second contribution is to replace the node-varying GF with
a hybrid node-varying GF, which is a new type of GF introduced in this paper.
While the alternative architecture can still be run locally without requiring a
pooling stage, the number of trainable parameters is smaller and can be
rendered independent of the data dimension. Tests are run on a synthetic source
localization problem and on the 20NEWS dataset.
Comment: Submitted to DSW 2018 (IEEE Data Science Workshop)
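The contrast between the two filter types can be sketched in a few lines of NumPy: a node-invariant filter applies the same K taps at every node, while a node-varying filter gives each node its own taps, extracting different local features without changing the output dimension. This is a toy illustration under our own naming; the paper's hybrid node-varying GF adds further structure not shown here.

```python
import numpy as np

def node_invariant_filter(S, x, h):
    # Classical graph filter y = sum_k h[k] * S^k x: the same K taps h
    # are shared by every node (the graph analogue of convolution).
    y = np.zeros_like(x)
    Skx = x.copy()
    for hk in h:
        y = y + hk * Skx
        Skx = S @ Skx
    return y

def node_varying_filter(S, x, H):
    # Node-varying filter y = sum_k diag(H[:, k]) S^k x: node n applies
    # its own taps H[n, :], so different local features are extracted at
    # different nodes while the output stays node-sized (no pooling).
    y = np.zeros_like(x)
    Skx = x.copy()
    for k in range(H.shape[1]):
        y = y + H[:, k] * Skx
        Skx = S @ Skx
    return y

# Demo on a 6-node ring; the shift operator S is the adjacency matrix.
rng = np.random.default_rng(1)
N, K = 6, 3
S = np.zeros((N, N))
for i in range(N):
    S[i, (i + 1) % N] = S[(i + 1) % N, i] = 1.0
x = rng.standard_normal(N)
h = rng.standard_normal(K)          # shared taps (node-invariant)
H = rng.standard_normal((N, K))     # per-node taps (node-varying)

y_inv = node_invariant_filter(S, x, h)
y_var = node_varying_filter(S, x, H)
```

Note that when every row of H equals h, the node-varying filter reduces exactly to the node-invariant one, which is why it is a strict generalization.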
- …