19 research outputs found

    A combined experimental and computational approach to investigate emergent network dynamics based on large-scale neuronal recordings

    Development of an integrated computational-experimental approach for the study of neuronal networks using electrophysiological recordings

    Plasticity and Adaptation in Neuromorphic Biohybrid Systems

    Neuromorphic systems take inspiration from the principles of biological information processing to form hardware platforms that enable the large-scale implementation of neural networks. Recent years have seen both advances in the theoretical aspects of spiking neural networks for their use in classification and control tasks and progress in electrophysiological methods that is pushing the frontiers of intelligent neural interfacing and signal processing technologies. At the forefront of these new technologies, artificial and biological neural networks are tightly coupled, offering a novel "biohybrid" experimental framework for engineers and neurophysiologists. Indeed, biohybrid systems can constitute a new class of neuroprostheses, opening important perspectives in the treatment of neurological disorders. Moreover, the use of biologically plausible learning rules allows the formation of an overall fault-tolerant system of co-developing subsystems. To identify opportunities and challenges in neuromorphic biohybrid systems, we discuss the field from the perspectives of neurobiology, computational neuroscience, and neuromorphic engineering. © 2020 The Author(s).

    Neuromorphic Electronic Circuits for Building Autonomous Cognitive Systems

    Chicca E, Stefanini F, Bartolozzi C, Indiveri G. Neuromorphic Electronic Circuits for Building Autonomous Cognitive Systems. Proceedings of the IEEE. Vol 102. Piscataway, NJ: IEEE; 2014: 1367-1388.
    Several analog and digital brain-inspired electronic systems have recently been proposed as dedicated solutions for fast simulations of spiking neural networks. While these architectures are useful for exploring the computational properties of large-scale models of the nervous system, the challenge of building low-power, compact physical artifacts that can behave intelligently in the real world and exhibit cognitive abilities remains open. In this paper, we propose a set of neuromorphic engineering solutions to address this challenge. In particular, we review neuromorphic circuits for emulating neural and synaptic dynamics in real time and discuss the role of biophysically realistic temporal dynamics in hardware neural processing architectures; we review the challenges of realizing spike-based plasticity mechanisms in real physical systems and present examples of analog electronic circuits that implement them; and we describe the computational properties of recurrent neural networks and show how neuromorphic winner-take-all circuits can implement working-memory and decision-making mechanisms. We validate the proposed neuromorphic approach with experimental results obtained from our own circuits and systems, and argue that the circuits and networks presented in this work represent a useful set of components for efficiently and elegantly implementing neuromorphic cognition.
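
    To make the winner-take-all motif mentioned in the abstract concrete, the following NumPy sketch simulates a small rate-based soft-WTA network: excitatory units with self-excitation share a single inhibitory unit, so the unit receiving the strongest input suppresses its competitors, a simple stand-in for the decision-making behaviour of the circuits reviewed in the paper. All weights and time constants below are hypothetical illustration values, not parameters of the analog circuits themselves.

        import numpy as np

        def soft_wta(inputs, steps=400, dt=0.05, tau=1.0, w_self=1.6, w_inh=1.2):
            """Rate-based soft winner-take-all: N excitatory units with
            self-excitation and one shared inhibitory unit fed by all of them."""
            rates = np.zeros(len(inputs))   # excitatory firing rates
            inh = 0.0                       # shared inhibitory rate
            for _ in range(steps):
                drive = inputs + w_self * rates - w_inh * inh
                rates += dt / tau * (-rates + np.maximum(drive, 0.0))  # rectified-linear dynamics
                inh += dt / tau * (-inh + rates.sum())                 # inhibition tracks total activity
            return rates

        # The unit with the largest input "wins"; the others are pushed to zero.
        print(soft_wta(np.array([0.9, 1.0, 0.8])))

    In the neuromorphic systems discussed in the paper, this competition is realized by analog circuits operating in real time rather than by iterating difference equations.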

    Exploring the potential of brain-inspired computing

    The gap between brains and computers, in both cognitive capability and power efficiency, is remarkably large. Brains process information massively in parallel and their constituents are intrinsically self-organizing, whereas in digital computers the execution of instructions is deterministic and largely serial. Recent progress in the development of dedicated hardware systems implementing physical models of neurons and synapses makes it possible to emulate spiking neural networks efficiently. In this work, we verify the design and explore the potential for brain-inspired computing of such an analog neuromorphic system, called Spikey. We demonstrate the versatility of this highly configurable substrate by implementing a rich repertoire of network models, including models for signal propagation and enhancement, general-purpose classifiers, cortical models and decorrelating feedback systems. Network emulations on Spikey are highly accelerated and consume less than 1 nJ per synaptic transmission; the Spikey system hence outperforms modern desktop computers in terms of fast and efficient network simulation, closing the gap to brains. During this thesis, the stability, performance and user-friendliness of the Spikey system were improved by integrating it into the neuroscientific tool chain and making it available to the community. The implementation of networks suitable for solving everyday tasks, like object or speech recognition, qualifies this technology as an alternative to conventional computers. Considering their compactness, computational capability and power efficiency, neuromorphic systems may qualify as a valuable complement to classical computation.
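
    As a point of reference for what such hardware emulates, the sketch below integrates a single leaky integrate-and-fire neuron driven by Poisson input spikes in plain NumPy; each arriving spike is one of the synaptic transmission events whose hardware cost the abstract quotes at under 1 nJ. The model parameters are generic textbook values, not the calibrated parameters of the Spikey chip.

        import numpy as np

        rng = np.random.default_rng(0)

        dt = 0.1                 # time step (ms)
        t_sim = 200.0            # simulated time (ms)
        tau_m = 10.0             # membrane time constant (ms)
        v_rest, v_thresh, v_reset = -65.0, -50.0, -65.0   # potentials (mV)
        w_syn = 1.5              # voltage jump per input spike (mV)
        rate_in = 1200.0         # Poisson input rate (Hz)

        v = v_rest
        out_spikes = []
        for step in range(int(t_sim / dt)):
            n_in = rng.poisson(rate_in * dt / 1000.0)      # input spikes in this step
            v += dt / tau_m * (v_rest - v) + w_syn * n_in  # leak plus synaptic kicks
            if v >= v_thresh:                              # threshold crossing -> output spike
                out_spikes.append(step * dt)
                v = v_reset

        print(f"{len(out_spikes)} output spikes in {t_sim:.0f} ms of simulated time")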

    Toward Reflective Spiking Neural Networks Exploiting Memristive Devices

    The design of modern convolutional artificial neural networks (ANNs) composed of formal neurons copies the architecture of the visual cortex. Signals proceed through a hierarchy in which receptive fields become increasingly complex and coding becomes sparse. Nowadays, ANNs outperform humans in controlled pattern recognition tasks, yet they remain far behind in cognition. This is partly due to limited knowledge about the higher echelons of the brain hierarchy, where neurons actively generate predictions about what will happen next, i.e., information processing jumps from reflex to reflection. In this study, we forecast that spiking neural networks (SNNs) can achieve the next qualitative leap. Reflective SNNs may take advantage of their intrinsic dynamics and mimic complex, not reflex-based, brain actions. They also enable a significant reduction in energy consumption. However, training SNNs is a challenging problem that strongly limits their deployment. We then briefly overview new insights provided by the concept of a high-dimensional brain, which has been put forward to explain the potential power of single neurons in higher brain stations and deep SNN layers. Finally, we discuss the prospect of implementing neural networks in memristive systems. Such systems can densely pack on a chip 2D or 3D arrays of plastic synaptic contacts that directly process analog information. Memristive devices are thus good candidates for implementing in-memory and in-sensor computing, and memristive SNNs can diverge from the development of ANNs and build their own niche of cognitive, or reflective, computations.
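
    The in-memory computing idea in the last paragraph can be illustrated with a few lines of NumPy: a crossbar stores a weight matrix as device conductances, and applying input voltages to the rows yields the weighted sums directly as column currents (Ohm's law plus Kirchhoff's current law). The conductance range and the 5% conductance spread below are arbitrary illustration values, not measurements of any particular memristive device.

        import numpy as np

        rng = np.random.default_rng(1)

        n_in, n_out = 4, 3
        G = rng.uniform(1e-6, 1e-4, size=(n_in, n_out))   # device conductances (S) = stored weights
        v_in = np.array([0.2, 0.0, 0.1, 0.3])             # input voltages on the row wires (V)

        # Each column wire sums the currents of its devices: I_j = sum_i V_i * G_ij,
        # so the vector-matrix product is computed where the weights are stored.
        i_ideal = v_in @ G

        # Crude model of device non-ideality: multiplicative conductance spread
        G_real = G * rng.normal(1.0, 0.05, size=G.shape)
        i_noisy = v_in @ G_real

        print("ideal column currents (A): ", i_ideal)
        print("with 5% conductance spread:", i_noisy)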

    26th Annual Computational Neuroscience Meeting (CNS*2017): Part 3 - Meeting Abstracts - Antwerp, Belgium. 15–20 July 2017

    This work was produced as part of the activities of the FAPESP Research, Innovation and Dissemination Center for Neuromathematics (grant 2013/07699-0, S. Paulo Research Foundation). NLK is supported by a FAPESP postdoctoral fellowship (grant 2016/03855-5). ACR is partially supported by a CNPq fellowship (grant 306251/2014-0).

    Steep, Spatially Graded Recruitment of Feedback Inhibition by Sparse Dentate Granule Cell Activity

    The dentate gyrus of the hippocampus is thought to subserve important physiological functions, such as 'pattern separation'. In chronic temporal lobe epilepsy, the dentate gyrus constitutes a strong inhibitory gate for the propagation of seizure activity into the hippocampus proper. Both examples are thought to depend critically on a steep recruitment of feedback inhibition by active dentate granule cells. Here, I used two complementary experimental approaches to quantitatively investigate the recruitment of feedback inhibition in the dentate gyrus. I show that the activity of approximately 4% of granule cells suffices to recruit maximal feedback inhibition within the local circuit. Furthermore, the inhibition elicited by a local population of granule cells is distributed non-uniformly over the extent of the granule cell layer. Locally and remotely activated inhibition differ in several key aspects, namely their amplitude, recruitment, latency and kinetic properties. Finally, I show that net feedback inhibition facilitates during repetitive stimulation. Taken together, these data provide the first quantitative functional description of a canonical feedback inhibitory microcircuit motif. They establish that sparse granule cell activity, within the range observed in vivo, steeply recruits spatially and temporally graded feedback inhibition.
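
    The steep recruitment described above can be caricatured with a toy NumPy model of the feedback motif: a sparsely connected granule cell population drives a small pool of interneurons with heterogeneous thresholds, and the fraction of recruited interneurons saturates already at a few percent of active granule cells. Connection probability, weights and thresholds are invented for illustration and are not the quantities measured in this thesis.

        import numpy as np

        rng = np.random.default_rng(2)

        n_gc, n_int = 1000, 50
        # Sparse excitatory connectivity from granule cells onto interneurons
        conn = rng.random((n_gc, n_int)) < 0.1
        w = rng.uniform(0.5, 1.5, size=(n_gc, n_int)) * conn
        theta = rng.uniform(1.0, 3.0, size=n_int)     # interneuron firing thresholds

        def recruited_inhibition(frac_active):
            """Fraction of interneurons driven above threshold when a random
            subset (frac_active) of the granule cell population is active."""
            active = np.zeros(n_gc)
            idx = rng.choice(n_gc, size=int(frac_active * n_gc), replace=False)
            active[idx] = 1.0
            drive = active @ w                        # summed excitation per interneuron
            return float(np.mean(drive > theta))

        for f in (0.01, 0.02, 0.04, 0.08):
            print(f"{f:4.0%} granule cells active -> {recruited_inhibition(f):.0%} of interneurons recruited")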