We improve on the results of Siegelmann & Sontag (1995) by providing a novel and
parsimonious constructive mapping between Turing Machines and Recurrent
Artificial Neural Networks (R-ANNs), based on recent developments in Nonlinear
Dynamical Automata (NDAs). The architecture of the resulting R-ANNs is simple
and elegant, stemming from their transparent relation to the underlying NDAs.
These characteristics hold promise for developments in machine learning methods
and in symbolic computation with continuous-time dynamical systems. A framework is
provided to program the R-ANNs directly from Turing Machine descriptions, in
the absence of network training. At the same time, the network can in principle
be trained to perform algorithmic tasks, with exciting possibilities for the
integration of approaches akin to Google DeepMind's Neural Turing Machines.

Comment: 11 pages, 3 figures