7 research outputs found

    Integer Echo State Networks: Hyperdimensional Reservoir Computing

    We propose an approximation of Echo State Networks (ESNs) that can be efficiently implemented on digital hardware based on the mathematics of hyperdimensional computing. The reservoir of the proposed Integer Echo State Network (intESN) is a vector containing only n-bit integers (where n < 8 is normally sufficient for satisfactory performance). The recurrent matrix multiplication is replaced with an efficient cyclic shift operation. The intESN architecture is verified on typical reservoir-computing tasks: memorizing a sequence of inputs, classifying time series, and learning dynamic processes. This architecture yields dramatic improvements in memory footprint and computational efficiency with minimal performance loss. Comment: 10 pages, 10 figures, 1 table
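    The abstract's core mechanism (integer reservoir state, cyclic shift instead of a recurrent weight matrix, n-bit clipping) can be sketched as follows. This is an illustrative reconstruction, not the paper's reference implementation; the function name, the bipolar input encoding, and the clipping convention are assumptions.

    ```python
    import numpy as np

    def intesn_update(state, input_vec, n_bits=4):
        """One intESN-style reservoir update (illustrative sketch).

        The recurrent matrix multiplication of a classical ESN is replaced
        by a cyclic shift of the integer state vector; the input arrives as
        a bipolar (+1/-1) hypervector and the result is clipped so the
        state fits in n-bit signed integers.
        """
        limit = 2 ** (n_bits - 1) - 1       # e.g. 7 for 4-bit signed integers
        shifted = np.roll(state, 1)         # cyclic shift plays the role of recurrence
        return np.clip(shifted + input_vec, -limit, limit)

    rng = np.random.default_rng(0)
    dim = 1000
    state = np.zeros(dim, dtype=np.int8)
    for _ in range(20):
        x = rng.choice([-1, 1], size=dim).astype(np.int8)  # random bipolar input encoding
        state = intesn_update(state, x)
    ```

    Because the state is a single vector of small integers and the only operations are a shift, an addition, and a clip, the memory and compute savings over a dense recurrent matrix multiply follow directly.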

    Perceptron theory can predict the accuracy of neural networks

    Multilayer neural networks set the current state of the art for many technical classification problems. But these networks are still, essentially, black boxes in terms of analyzing them and predicting their performance. Here, we develop a statistical theory for the one-layer perceptron and show that it can predict the performance of a surprisingly large variety of neural networks with different architectures. A general theory of classification with perceptrons is developed by generalizing an existing theory for analyzing reservoir computing models and connectionist models for symbolic reasoning known as vector symbolic architectures. Our statistical theory offers three formulas leveraging the signal statistics with increasing detail. The formulas are analytically intractable, but can be evaluated numerically. The description level that captures maximum detail requires stochastic sampling methods. Depending on the network model, the simpler formulas already yield high prediction accuracy. The quality of the theory's predictions is assessed in three experimental settings: a memorization task for echo state networks (ESNs) from the reservoir computing literature, a collection of classification datasets for shallow randomly connected networks, and the ImageNet dataset for deep convolutional neural networks. We find that the second description level of the perceptron theory can predict the performance of types of ESNs that could not be described previously. Furthermore, the theory can predict deep multilayer neural networks by being applied to their output layer. While other methods for predicting neural-network performance commonly require training an estimator model, the proposed theory requires only the first two moments of the distribution of the postsynaptic sums in the output neurons. Moreover, the perceptron theory compares favorably to other methods that do not rely on training an estimator model.
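    The key claim is that accuracy can be predicted from only the first two moments of the postsynaptic sums. For the two-class case this idea reduces to a Gaussian approximation: the correct class wins when the (assumed Gaussian, independent) difference of the two postsynaptic sums is positive. A minimal sketch, with all names and the independence assumption my own rather than the paper's:

    ```python
    from math import erf, sqrt

    def predicted_accuracy(mu_correct, mu_other, var_correct, var_other):
        """Gaussian moment-based accuracy estimate for a two-class perceptron.

        Only the means and variances of the two postsynaptic sums enter:
        the difference of two independent Gaussians is Gaussian, so the
        win probability is a standard-normal CDF evaluation.
        """
        mu_diff = mu_correct - mu_other
        sigma_diff = sqrt(var_correct + var_other)
        # P(diff > 0) via the standard normal CDF, expressed with erf
        return 0.5 * (1.0 + erf(mu_diff / (sigma_diff * sqrt(2.0))))

    acc = predicted_accuracy(1.0, 0.0, 1.0, 1.0)  # ≈ 0.76
    ```

    With equal moments for both classes the estimate is exactly 0.5, as expected; the multi-class and higher-detail levels described in the abstract refine this basic picture.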

    Activity Mapping the Leech Nervous System

    Neural circuits represent and process information using large numbers of component neurons. In this thesis, I describe the current theories of how information is processed by nervous systems, biophysical models of the basic mechanisms of population coding, and experimental techniques and computational algorithms to collect and synthesize the data necessary to understand neural computation in simple nervous systems. The leech nervous system is an ideal system for study because it is readily accessible and consists of a core neural structure, the ganglion, built of only a few hundred neurons. Even though this system is simple compared to mammalian nervous systems, it is still remarkably complex. To understand this complexity and make sense of the patterns of neurons, large-scale recordings of neural activity are required. We have developed a new voltage-sensitive dye to record from a significant fraction of the nervous system simultaneously during behavioral states. This type of large-scale data presents entirely new challenges to overcome, and we developed computational tools to visualize and stitch together these large-scale recordings. By imaging a ganglion in several animals and recording neural responses during several different behaviors, we have developed a system for scalable and rapid identification of dozens of individual neurons. With these tools and techniques, we have mapped out the activity of a significant fraction of the leech's nervous system and have identified dozens of novel cells. Many of these cells are part of two canonical networks in the leech nervous system: the swim and preparatory networks. We further show that the preparatory network is mediated by the S cell, and we use computationally guided electrophysiology to target and verify that the S cell drives the activity of other cells in the preparatory network.
These tools enable us to study the nervous system at scale for the first time, and we have mapped out the roles of a significant fraction of the neurons during the production of behaviors. These are just the first steps necessary to build a complete picture of how leech neurons produce behavior and make decisions.

    Neuromorphic Visual Scene Understanding with Resonator Networks (in brief)

    Inferring the position of objects and their rigid transformations is still an open problem in visual scene understanding. Here we propose a neuromorphic framework that poses scene understanding as a factorization problem and uses a resonator network to extract object identities and their transformations. The framework uses vector binding operations to produce generative image models in which binding acts as the equivariant operation for geometric transformations. A scene can therefore be described as a sum of vector products, which in turn can be efficiently factorized by a resonator network to infer objects and their poses. We also describe a hierarchical resonator network that enables the definition of a partitioned architecture in which vector binding is equivariant for horizontal and vertical translation within one partition, and for rotation and scaling within the other partition. We demonstrate our approach using synthetic scenes composed of simple 2D shapes undergoing rigid geometric transformations and color changes.
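    The factorization step the abstract relies on can be sketched for the simplest case: a vector formed by binding (elementwise multiplication of bipolar hypervectors) one entry from each of two codebooks, which a resonator network recovers by alternately unbinding one factor estimate and cleaning up the other through its codebook. The codebook sizes, dimension, and initialization below are illustrative assumptions; the paper's hierarchical, partitioned variant adds further structure not shown here.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    d, k = 2000, 10                        # hypervector dimension, entries per codebook
    A = rng.choice([-1, 1], size=(k, d))   # codebook for factor 1 (e.g. object identity)
    B = rng.choice([-1, 1], size=(k, d))   # codebook for factor 2 (e.g. pose)

    # Compose a "scene" vector by binding one entry of each codebook
    s = A[3] * B[7]

    def bipolar_sign(v):
        """Sign with zero ties broken to +1, keeping estimates bipolar."""
        out = np.sign(v)
        out[out == 0] = 1
        return out

    # Initialize each factor estimate as the superposition of its codebook
    a_hat = bipolar_sign(A.sum(axis=0))
    b_hat = bipolar_sign(B.sum(axis=0))

    # Resonator iterations: unbind the other factor, then clean up
    # through the codebook (project onto it and back)
    for _ in range(20):
        a_hat = bipolar_sign(A.T @ (A @ (s * b_hat)))
        b_hat = bipolar_sign(B.T @ (B @ (s * a_hat)))

    i_a = int(np.argmax(A @ a_hat))
    i_b = int(np.argmax(B @ b_hat))
    ```

    With only k*k = 100 candidate combinations against a 2000-dimensional code, the iteration typically locks onto the correct pair (indices 3 and 7 here) within a few steps; the search over the combinatorial factor space happens in superposition rather than by enumeration.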

    High-Dimensional Computing as a Nanoscalable Paradigm
