Simulation and implementation of novel deep learning hardware architectures for resource constrained devices
Corey Lammie designed mixed-signal memristive–complementary metal–oxide–semiconductor (CMOS) and field-programmable gate array (FPGA) hardware architectures, which were used to reduce the power and resource requirements of Deep Learning (DL) systems during both inference and training. Disruptive design methodologies, such as those explored in this thesis, can be used to facilitate the design of next-generation DL systems.
Distributed Bayesian Computation and Self-Organized Learning in Sheets of Spiking Neurons with Local Lateral Inhibition
During the last decade, Bayesian probability theory has emerged as a framework in cognitive science and neuroscience for describing perception, reasoning and learning of mammals. However, our understanding of how probabilistic computations could be organized in the brain, and how the observed connectivity structure of cortical microcircuits supports these calculations, is rudimentary at best. In this study, we investigate statistical inference and self-organized learning in a spatially extended spiking network model that accommodates both local competitive and large-scale associative aspects of neural information processing under a unified Bayesian account. Specifically, we show how the spiking dynamics of a recurrent network with lateral excitation and local inhibition, in response to distributed spiking input, can be understood as sampling from a variational posterior distribution of a well-defined implicit probabilistic model. This interpretation further permits a rigorous analytical treatment of experience-dependent plasticity on the network level. Using machine learning theory, we derive update rules for neuron and synapse parameters which equate with Hebbian synaptic and homeostatic intrinsic plasticity rules in a neural implementation. In computer simulations, we demonstrate that the interplay of these plasticity rules leads to the emergence of probabilistic local experts that form distributed assemblies of similarly tuned cells communicating through lateral excitatory connections. The resulting sparse distributed spike code of a well-adapted network carries compressed information on salient input features combined with prior experience on correlations among them. Our theory predicts that the emergence of such efficient representations benefits from network architectures in which the range of local inhibition matches the spatial extent of pyramidal cells that share common afferent input.
Decoding Neural Signals with Computational Models: A Systematic Review of Invasive BMI
There are significant milestones in modern human civilization at which mankind stepped into a different level of life, with a new spectrum of possibilities and comfort. From fire-lighting technology and wheeled wagons to writing, electricity and the Internet, each one changed our lives dramatically. In this paper, we take a deep look into the invasive Brain Machine Interface (BMI), an ambitious and cutting-edge technology which has the potential to be another important milestone in human civilization. Not only beneficial for patients with severe medical conditions, invasive BMI technology can significantly impact different technologies and almost every aspect of human life. We review the biological and engineering concepts that underpin the implementation of BMI applications, covering the essential techniques that are necessary for making invasive BMI applications a reality. We do so through an analysis of (i) possible applications of invasive BMI technology, (ii) the methods and devices for detecting and decoding brain signals, and (iii) possible options for delivering stimulation signals into the human brain. Finally, we discuss the challenges and opportunities of invasive BMI for further development in the area.
Comment: 51 pages, 14 figures, review article