On the Curse of Memory in Recurrent Neural Networks: Approximation and Optimization Analysis
We study the approximation properties and optimization dynamics of recurrent
neural networks (RNNs) when applied to learn input-output relationships in
temporal data. We consider the simple but representative setting of using
continuous-time linear RNNs to learn from data generated by linear
relationships. Mathematically, the latter can be understood as a sequence of
linear functionals. We prove a universal approximation theorem for such linear
functionals and characterize the approximation rate and its relation to memory.
Moreover, we perform a fine-grained dynamical analysis of training linear RNNs,
which further reveals the intricate interactions between memory and learning. A
unifying theme uncovered is the non-trivial effect of memory, a notion that can
be made precise in our framework, on both approximation and optimization: when
the target has long-term memory, a large number of neurons is required to
approximate it, and the training process suffers from slowdowns. In particular,
both effects become exponentially more pronounced with increasing memory, a
phenomenon we call the "curse of memory". These analyses represent a basic step
towards a concrete mathematical understanding of new phenomena that may arise
when learning temporal relationships with recurrent architectures.
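
To make the setting concrete, the following is a minimal sketch, assuming a
continuous-time linear RNN of the form h'(t) = Wh(t) + Ux(t), y(t) = c^T h(t)
and a target linear functional given by convolution with an exponentially
decaying memory kernel. The constants (lam, dt, T) and the input signal are
illustrative choices, not quantities from the paper; with a single exponential
kernel, one hidden unit suffices, whereas targets with longer memory require
many more, which is the effect the abstract describes.

import numpy as np

# Decay rate of the target's memory kernel, step size, and time horizon;
# all values are illustrative, not taken from the paper.
lam, dt, T = 2.0, 1e-3, 5.0
t = np.arange(0.0, T, dt)
x = np.sin(3.0 * t)  # an arbitrary input signal

# Target linear functional with exponentially decaying memory:
#   y(t) = integral_0^t exp(-lam * (t - s)) x(s) ds,
# approximated here by a Riemann sum (a discrete convolution).
rho = np.exp(-lam * t)
target = dt * np.convolve(x, rho)[: len(t)]

# One-hidden-unit continuous-time linear RNN, forward-Euler discretized:
#   h'(t) = -lam * h(t) + x(t),  y(t) = h(t),
# i.e. W = -lam, U = 1, c = 1 reproduces this target in continuous time.
h = 0.0
rnn = np.empty_like(t)
for k in range(len(t)):
    rnn[k] = h
    h += dt * (-lam * h + x[k])

# Both curves approximate the same integral, so the gap is O(dt).
print("max |target - rnn| =", np.max(np.abs(target - rnn)))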