Differential Geometry Methods for Constructing Manifold-Targeted Recurrent Neural Networks

Abstract

Neural computations can be framed as dynamical processes, whereby the structure of the dynamics within a neural network is a direct reflection of the computations the network performs. A key step in generating mechanistic interpretations within this computation-through-dynamics framework is to establish the link among network connectivity, dynamics, and computation. This link is only partly understood. Recent work has focused on producing algorithms for engineering artificial recurrent neural networks (RNNs) with dynamics targeted to a specific goal manifold. Some of these algorithms require only that a set of vectors tangent to the target manifold be computed, and thus provide a general method that can be applied to a diverse set of problems. Nevertheless, computing such vectors for an arbitrary manifold in a high-dimensional state space remains highly challenging, which in practice limits the applicability of this approach. Here we demonstrate how topology and differential geometry can be leveraged to simplify this task by first computing tangent vectors on a low-dimensional topological manifold and then embedding them in state space. The simplicity of this procedure greatly facilitates the creation of manifold-targeted RNNs, as well as the process of designing task-solving, on-manifold dynamics. This new method should enable the application of network-engineering-based approaches to a wide set of problems in neuroscience and machine learning. Our description of how fundamental concepts from differential geometry can be mapped onto different aspects of neural dynamics further demonstrates how the language of differential geometry can enrich the conceptual framework for describing neural dynamics and computation.
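To make the recipe in the abstract concrete, the following minimal sketch illustrates the general idea of computing tangent vectors intrinsically on a low-dimensional manifold and pushing them into a high-dimensional state space through the Jacobian of an embedding map. The choice of a 1-D ring manifold, the random linear embedding, the intrinsic vector field, and all names and dimensions below are illustrative assumptions for this sketch, not the paper's actual implementation.

```python
import numpy as np

# Sketch: tangent vectors on a 1-D ring manifold (S^1), pushed forward
# into an N-dimensional state space via the Jacobian of an embedding.
# The embedding map and all parameters are hypothetical choices.

N = 100                                  # state-space (network) dimensionality
rng = np.random.default_rng(0)
# Smooth embedding of the circle: x(theta) = cos(theta) * A + sin(theta) * B
A = rng.standard_normal(N)
B = rng.standard_normal(N)

def embed(theta):
    """Map intrinsic coordinates theta on S^1 to points in R^N."""
    return np.outer(np.cos(theta), A) + np.outer(np.sin(theta), B)

def pushforward(theta, dtheta):
    """Push intrinsic tangent vectors dtheta/dt through the Jacobian
    dx/dtheta to obtain tangent vectors in state space."""
    jac = np.outer(-np.sin(theta), A) + np.outer(np.cos(theta), B)
    return dtheta[:, None] * jac

# Sample points on the ring and prescribe on-manifold dynamics,
# e.g., a flow with a stable fixed point at theta = 0 (illustrative).
thetas = np.linspace(0.0, 2.0 * np.pi, 256, endpoint=False)
dtheta = -np.sin(thetas)                 # intrinsic vector field on S^1

X = embed(thetas)                        # points on the embedded manifold
V = pushforward(thetas, dtheta)          # target tangent vectors in R^N

# Pairs (X, V) are the input a manifold-targeting algorithm would consume:
# fit recurrent weights so the network flow at each point in X matches V.
print(X.shape, V.shape)                  # (256, 100) (256, 100)
```

Because the tangent vectors are first computed in the manifold's own low-dimensional coordinates, designing the on-manifold dynamics reduces to specifying a simple intrinsic vector field; the Jacobian of the embedding then carries both the points and the dynamics into the high-dimensional state space where the RNN lives.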