
    Dynamical Systems in Spiking Neuromorphic Hardware

    Dynamical systems are universal computers. They can perceive stimuli, remember, learn from feedback, plan sequences of actions, and coordinate complex behavioural responses. The Neural Engineering Framework (NEF) provides a general recipe to formulate models of such systems as coupled sets of nonlinear differential equations and compile them onto recurrently connected spiking neural networks, akin to a programming language for spiking models of computation. The Nengo software ecosystem supports the NEF and compiles such models onto neuromorphic hardware. In this thesis, we analyze the theory driving the success of the NEF and expose several core principles underpinning its correctness, scalability, completeness, robustness, and extensibility. We also derive novel theoretical extensions to the framework that enable it to far more effectively leverage a wide variety of dynamics in digital hardware, and to exploit the device-level physics in analog hardware. At the same time, we propose a novel set of spiking algorithms that recruit an optimal nonlinear encoding of time, which we call the Delay Network (DN). Backpropagation across stacked layers of DNs dramatically outperforms stacked Long Short-Term Memory (LSTM) networks, a state-of-the-art deep recurrent architecture, in both accuracy and training time on a continuous-time memory task and on a chaotic time-series prediction benchmark. The basic component of this network is shown to function on state-of-the-art spiking neuromorphic hardware, including Braindrop and Loihi. This implementation approaches the energy efficiency of the human brain in the former case, and the precision of conventional computation in the latter.
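
    The Delay Network's core is a small linear time-invariant (LTI) system whose state optimally compresses a sliding window of its input. As a concrete illustration, the sketch below constructs the (A, B) matrices of this LTI system following the published Legendre/Padé construction; the function name, parameter values, and the Euler simulation are illustrative choices, not code from the thesis.

        import numpy as np

        def delay_network(order, theta):
            """State-space matrices (A, B) whose dynamics dx/dt = A x + B u
            approximate a pure delay: x encodes u over the window [t - theta, t]."""
            Q = np.arange(order)
            R = (2 * Q + 1)[:, None] / theta             # per-row scaling (2i + 1) / theta
            j, i = np.meshgrid(Q, Q)                     # i = row index, j = column index
            A = np.where(i < j, -1.0, (-1.0) ** (i - j + 1)) * R
            B = ((-1.0) ** Q)[:, None] * R
            return A, B

        # Euler integration of the DN state for an example sinusoidal input.
        A, B = delay_network(order=6, theta=1.0)
        dt = 1e-3
        x = np.zeros(6)
        for step in range(4000):
            u = np.sin(2 * np.pi * step * dt)
            x = x + dt * (A @ x + B[:, 0] * u)

    The delayed signal u(t - theta') is then read out as a fixed linear combination of this state, with weights given by shifted Legendre polynomials evaluated at theta' / theta.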

    Algorithms for massively parallel, event-based hardware


    Collective phenomena in networks of spiking neurons with synaptic delays

    A prominent feature of the dynamics of large neuronal networks is the synchrony-driven collective oscillations generated by the interplay between synaptic coupling and synaptic delays. This thesis investigates the emergence of delay-induced oscillations in networks of heterogeneous spiking neurons. Building on recent theoretical advances in exact mean-field reductions for neuronal networks, this work explores the dynamics and bifurcations of an exact firing rate model with various forms of synaptic delays. In parallel, the results obtained using the novel firing rate model are compared with extensive numerical simulations of large networks of spiking neurons, which confirm the existence of numerous synchrony-based oscillatory states. Some of these states are novel and display complex forms of partial synchronization and collective chaos. Given the well-known inability of traditional firing rate models to describe synchrony-based oscillations, previous studies largely overlooked many of the oscillatory states found here. This thesis therefore provides a unique exploration of the oscillatory scenarios arising in neuronal networks due to the presence of delays, and may substantially extend the mathematical tools available for modeling the plethora of oscillations detected in electrical recordings of brain activity.
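
    The exact firing rate models in question descend from recent mean-field reductions of heterogeneous quadratic integrate-and-fire (QIF) networks, which track the population rate r and mean membrane potential v exactly. As a hedged illustration of how a synaptic delay enters such a model, here is a minimal Euler integration with a fixed delay D; the equations follow the standard QIF reduction, while all parameter values are arbitrary toy choices rather than those studied in the thesis.

        import numpy as np

        dt, T = 1e-4, 50.0
        D = 1.0                          # fixed synaptic delay (units of the membrane time constant)
        delta, eta, J = 0.3, 1.0, -5.0   # heterogeneity width, mean drive, inhibitory coupling

        steps, lag = int(T / dt), int(D / dt)
        r_hist = np.zeros(steps)         # stores past rates so r(t - D) can be read back
        r, v = 0.1, -1.0
        for k in range(steps):
            r_hist[k] = r
            r_del = r_hist[k - lag] if k >= lag else r_hist[0]   # constant initial history
            dr = delta / np.pi + 2.0 * r * v
            dv = v * v + eta + J * r_del - (np.pi * r) ** 2
            r, v = r + dt * dr, v + dt * dv

    Sweeping the coupling J and delay D in such a model is one way to map out the boundaries of the delay-induced oscillatory regimes described above.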

    Configurable analog hardware for neuromorphic Bayesian inference and least-squares solutions

    Sparse approximation is a Bayesian inference problem with a wide range of signal processing applications, such as the Compressed Sensing recovery used in medical imaging. Previous sparse coding implementations relied on digital algorithms whose power consumption and performance scale poorly with problem size, rendering them unsuitable for portable applications and a bottleneck in high-speed applications. A novel analog architecture implementing the Locally Competitive Algorithm (LCA) was designed and programmed onto a Field Programmable Analog Array (FPAA), using floating-gate transistors to set the analog parameters. A network of 6 coefficients was demonstrated to converge to values similar to those of a digital sparse approximation algorithm, but with better power and performance scaling. A rate-encoded spiking algorithm was then developed and shown to converge to similar values as the LCA. A second novel architecture, implementing the spiking version of the LCA with integrate-and-fire neurons, was designed and programmed on an FPAA. A network of 18 neurons converged to values similar to those of a digital sparse approximation algorithm, with even better performance and power efficiency than the non-spiking network. Novel algorithms were created to increase floating-gate programming speed by more than two orders of magnitude and to reduce programming error from device mismatch. A new FPAA chip was designed and tested, allowing for rapid interfacing and additional improvements in accuracy. Finally, a neuromorphic chip was designed, containing 400 integrate-and-fire neurons and capable of converging on a sparse approximation solution in 10 microseconds, over 1000 times faster than the best digital solution.
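
    The LCA itself is a set of coupled neuron-like differential equations; the analog chips above implement these dynamics directly in silicon. As a point of reference, here is a minimal digital (NumPy) sketch of the non-spiking LCA, with a made-up random dictionary and illustrative parameters in place of the hardware's.

        import numpy as np

        rng = np.random.default_rng(0)
        Phi = rng.standard_normal((64, 128))
        Phi /= np.linalg.norm(Phi, axis=0)            # unit-norm dictionary atoms
        s = Phi[:, [3, 70]] @ np.array([1.5, -0.8])   # signal built from 2 active atoms

        lam, tau, dt = 0.1, 0.01, 1e-3
        G = Phi.T @ Phi - np.eye(Phi.shape[1])        # lateral inhibition weights
        b = Phi.T @ s                                 # feedforward drive
        u = np.zeros(Phi.shape[1])                    # internal "membrane" states
        for _ in range(500):
            a = np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)   # soft threshold
            u += (dt / tau) * (b - u - G @ a)
        # 'a' settles toward a sparse code: coefficients 3 and 70 dominate.

    Each unit charges toward its feedforward drive while inhibiting units with overlapping dictionary atoms; the analog implementations replace this iterated loop with continuous-time circuit dynamics.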

    Harnessing Neural Dynamics as a Computational Resource

    Researchers study nervous systems at levels of scale spanning several orders of magnitude, both in terms of time and space. While some parts of the brain are well understood at specific levels of description, there are few overarching theories that systematically bridge low-level mechanism and high-level function. The Neural Engineering Framework (NEF) is an attempt at providing such a theory. The NEF enables researchers to systematically map dynamical systems (corresponding to some hypothesised brain function) onto biologically constrained spiking neural networks. In this thesis, we present several extensions to the NEF that broaden both the range of neural resources that can be harnessed for spatiotemporal computation and the range of available biological constraints. Specifically, we suggest a method for harnessing the dynamics inherent in passive dendritic trees for computation, allowing us to construct single-layer spiking neural networks that, for some functions, achieve substantially lower errors than larger multi-layer networks. Furthermore, we suggest “temporal tuning” as a unifying approach to harnessing temporal resources for computation through time. This allows modellers to directly constrain networks to temporal tuning observed in nature, in ways not previously well supported by the NEF. We then explore specific examples of neurally plausible dynamics using these techniques. In particular, we propose a new “information erasure” technique for constructing LTI systems generating temporal bases. Such LTI systems can be used to establish an optimal basis for spatiotemporal computation. We demonstrate how this captures “time cells” that have been observed throughout the brain. We further demonstrate the viability of our extensions by constructing an adaptive filter model of the cerebellum that successfully reproduces key features of eyeblink conditioning observed in neurobiological experiments. Outside the cognitive sciences, our work can help exploit resources available on existing neuromorphic computers and inform future neuromorphic hardware design. In machine learning, our spatiotemporal NEF populations map cleanly onto the Legendre Memory Unit (LMU), a promising artificial neural network architecture for stream-to-stream processing that outperforms competing approaches. We find that one of our LTI systems derived through “information erasure” may serve as a computationally less expensive alternative to the LTI system commonly used in the LMU.
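
    At the heart of the NEF is a simple decoding principle: a population's tuning curves are treated as a basis, and linear decoders are found by regularized least squares so that a weighted sum of activities approximates a desired function of the represented variable. Below is a minimal sketch of that step, using rectified-linear tuning curves as stand-ins for spiking neuron models; all names and parameter values are illustrative, not the thesis's.

        import numpy as np

        rng = np.random.default_rng(0)
        n, m = 50, 200                                  # neurons, evaluation points
        x = np.linspace(-1, 1, m)                       # represented variable
        gains = rng.uniform(0.5, 2.0, n)
        biases = rng.uniform(-1.0, 1.0, n)
        encoders = rng.choice([-1.0, 1.0], n)
        # Tuning curves a_i(x) as rectified-linear responses.
        A = np.maximum(gains[:, None] * (encoders[:, None] * x) + biases[:, None], 0)

        f = x ** 2                                      # target function f(x)
        sigma = 0.1 * A.max()                           # regularization scale
        d = np.linalg.solve(A @ A.T + sigma**2 * m * np.eye(n), A @ f)
        f_hat = d @ A                                   # decoded estimate of f(x)

    Recurrently connecting such populations through decoders of the desired dynamics is what lets the NEF, and the extensions above, compile dynamical systems onto spiking networks.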

    Complex and Adaptive Dynamical Systems: A Primer

    A thorough introduction is given, at an introductory level, to the field of quantitative complex system science, with special emphasis on emergence in dynamical systems based on network topologies. Subjects treated include graph theory and small-world networks, a generic introduction to the concepts of dynamical system theory, random Boolean networks, cellular automata and self-organized criticality, the statistical modeling of Darwinian evolution, synchronization phenomena, and an introduction to the theory of cognitive systems. It includes chapters on Graph Theory and Small-World Networks; Chaos, Bifurcations and Diffusion; Complexity and Information Theory; Random Boolean Networks; Cellular Automata and Self-Organized Criticality; Darwinian Evolution, Hypercycles and Game Theory; Synchronization Phenomena; and Elements of Cognitive System Theory. Comment: unformatted version of the textbook; published by Springer, Complexity Series (2008; second edition 2010).
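
    As a taste of one of the topics listed, the sketch below steps a Kauffman-style random Boolean network: N nodes, each updated through a random Boolean table of K random inputs. The parameters are arbitrary toy values; K = 2 is the classic choice near the order-chaos boundary.

        import numpy as np

        rng = np.random.default_rng(1)
        N, K = 16, 2
        inputs = rng.integers(0, N, size=(N, K))        # K input nodes per node
        tables = rng.integers(0, 2, size=(N, 2 ** K))   # random Boolean update rules

        state = rng.integers(0, 2, size=N)
        for t in range(20):
            # Encode each node's K input bits as an index into its rule table.
            idx = (state[inputs] * (2 ** np.arange(K))).sum(axis=1)
            state = tables[np.arange(N), idx]
            print("".join(map(str, state)))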

    18th IEEE Workshop on Nonlinear Dynamics of Electronic Systems: Proceedings

    Proceedings of the 18th IEEE Workshop on Nonlinear Dynamics of Electronic Systems, which took place in Dresden, Germany, 26–28 May 2010. Contents: Welcome Address (p. I); Table of Contents (p. III); Symposium Committees (p. IV); Special Thanks (p. V); Conference Program, including page numbers of papers (p. VI); Invited Talks (p. 1); Regular Papers (p. 14): Wednesday, 26 May 2010 (p. 15), Thursday, 27 May 2010 (p. 110), Friday, 28 May 2010 (p. 210); Author Index (p. XII).

    From thought to action

    Thesis (Ph.D.), Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2006. Includes bibliographical references. Systems engineering is rapidly assuming a prominent role in neuroscience that could unify scientific theories, experimental evidence, and medical development. In this three-part work, I study the neural representation of targets before reaching movements and the generation of prosthetic control signals through stochastic modeling and estimation. In the first part, I show that temporal and history dependence contributes to the representation of targets in the ensemble spiking activity of neurons in primate dorsal premotor cortex (PMd). Point process modeling of target representation suggests that local and possibly also distant neural interactions influence the spiking patterns observed in PMd. In the second part, I draw on results from surveillance theory to reconstruct reaching movements from neural activity related to the desired target and the path to that target. This approach combines movement planning and execution to surpass estimation with either target or path related neural activity alone. In the third part, I describe the principled design of brain-driven neural prosthetic devices as a filtering problem on interacting discrete and continuous random processes. This framework subsumes four canonical Bayesian approaches and supports emerging applications to neural prosthetic devices. Results of a simulated reaching task predict that the method outperforms previous approaches in the control of arm position and velocity based on trajectory and endpoint mean squared error. These results form the starting point for a systems engineering approach to the design and interpretation of neuroscience experiments that can guide the development of technology for human-computer interaction and medical treatment. By Lakshminarayan Srinivasan. Ph.D.
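
    The filtering framework described in the third part generalizes several canonical Bayesian decoders; the most familiar of these is the Kalman filter, sketched below for decoding a one-dimensional arm position and velocity from noisy firing rates. The state and observation models here are toy stand-ins, not the models developed in the thesis.

        import numpy as np

        rng = np.random.default_rng(0)
        A = np.array([[1.0, 0.05], [0.0, 1.0]])   # position integrates velocity each step
        W = np.diag([1e-4, 1e-2])                 # state (process) noise covariance
        C = rng.standard_normal((20, 2))          # 20 neurons' linear tuning to the state
        Q = np.eye(20) * 0.5                      # observation noise covariance

        x_true = np.zeros(2)
        x_hat, P = np.zeros(2), np.eye(2)
        for t in range(200):
            x_true = A @ x_true + rng.multivariate_normal([0, 0], W)
            y = C @ x_true + rng.multivariate_normal(np.zeros(20), Q)
            # Predict step: propagate the state estimate and its uncertainty.
            x_hat = A @ x_hat
            P = A @ P @ A.T + W
            # Update step: correct with the observed firing rates.
            K = P @ C.T @ np.linalg.inv(C @ P @ C.T + Q)
            x_hat = x_hat + K @ (y - C @ x_hat)
            P = (np.eye(2) - K @ C) @ P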