
    Mixed signal VLSI circuit implementation of the cortical microcircuit models

    Get PDF
    This thesis proposes a novel set of generic, compact and biologically plausible VLSI (Very Large Scale Integration) neural circuits, suitable for implementing a parallel VLSI network that closely resembles the function of a small-scale neocortical network. The proposed circuits include a cortical neuron, two different long-term plastic synapses and four different short-term plastic synapses. These circuits operate in accelerated time, where the time scale of the neural responses is approximately three to four orders of magnitude faster than the biological time scale of neuronal activity, providing higher computational throughput in computing neural dynamics. Further, a novel biological-time cortical neuron circuit with dynamics similar to those of the accelerated-time neuron is proposed to demonstrate the feasibility of migrating accelerated-time circuits into biological-time circuits. The fabricated accelerated-time VLSI neuron circuit is capable of replicating distinct firing patterns such as regular spiking, fast spiking, chattering and intrinsic bursting by tuning two external voltages, and it reproduces biologically plausible action potentials. The neuron circuit is compact, enabling the implementation of many neurons on a single silicon chip, and it consumes extremely low energy per spike (8 pJ). Incorporating this neuron circuit in a neural network facilitates diverse non-linear neuron responses, an important aspect of neural processing. The two proposed long-term plastic synapse circuits are a spike-timing-dependent plasticity (STDP) synapse and a dopamine-modulated STDP synapse. The short-term plastic synapses include excitatory depressing, inhibitory facilitating, inhibitory depressing and excitatory facilitating synapses. Many neural parameters of the short- and long-term synapses can be modified independently using externally controlled tuning voltages to obtain distinct synaptic properties. Having diverse synaptic dynamics in a network facilitates richer network behaviours such as learning, memory, stability and dynamic gain control, which are inherent in biological neural networks. To prove the concept in VLSI, different combinations of these accelerated-time neural circuits were fabricated in three integrated circuits (ICs) using a standard 0.35 µm CMOS technology. Using the first two ICs, the functions of the cortical neuron and the STDP synapses have been experimentally verified. The third IC, the Cortical Neural Layer (CNL) chip, is designed and fabricated to facilitate cortical network emulations; it implements neural circuits with a composition similar to that of a cortical layer of the neocortex. The CNL chip comprises 120 cortical neurons and 7,560 synapses, and many CNL chips can be combined to form a six-layered VLSI neocortical network to validate the network dynamics and to perform neural processing of small-scale cortical networks. The proposed neuromorphic systems can be used as a simulation acceleration platform to explore the processing principles of biological brains and also as a step towards realising low-power, real-time intelligent computing devices and control systems.
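
    The abstract gives no circuit equations; the four firing patterns it names (regular spiking, fast spiking, chattering and intrinsic bursting) are, however, the canonical behaviours of the Izhikevich neuron model, which likewise switches between them by changing a small number of parameters. Purely as a software illustration of those patterns (an assumption on our part; the fabricated mixed-signal circuit's exact dynamics are not specified here), a minimal Euler-integration sketch in Python:

```python
def izhikevich(a, b, c, d, I=10.0, T=500.0, dt=0.25):
    """Euler simulation of an Izhikevich neuron; the parameter set (a, b, c, d)
    selects the firing pattern, loosely analogous to the circuit's tuning voltages."""
    v, u = -65.0, b * -65.0
    spike_times = []
    for step in range(int(T / dt)):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:                      # spike threshold reached
            spike_times.append(step * dt)
            v, u = c, u + d                # after-spike reset
    return spike_times

# Canonical parameter sets for the four patterns named in the abstract
patterns = {
    "regular spiking":    (0.02, 0.2, -65.0, 8.0),
    "fast spiking":       (0.10, 0.2, -65.0, 2.0),
    "chattering":         (0.02, 0.2, -50.0, 2.0),
    "intrinsic bursting": (0.02, 0.2, -55.0, 4.0),
}
for name, params in patterns.items():
    print(f"{name:20s} -> {len(izhikevich(*params))} spikes in 500 ms")
```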

    Emergence of Functional Specificity in Balanced Networks with Synaptic Plasticity

    Get PDF
    In rodent visual cortex, synaptic connections between orientation-selective neurons are unspecific at the time of eye opening, and become functionally specific, to some degree, only later during development. An explanation for this two-stage process was proposed in terms of Hebbian plasticity based on visual experience, which would eventually enhance connections between neurons with similar response features. For this to work, however, two conditions must be satisfied. First, orientation-selective neuronal responses must exist before specific recurrent synaptic connections can be established. Second, Hebbian learning must be compatible with the recurrent network dynamics contributing to orientation selectivity, and the resulting specific connectivity must remain stable under unspecific background activity. Previous studies have mainly focused on very simple models, where the receptive fields of neurons were essentially determined by feedforward mechanisms, and where the recurrent network was small, lacking the complex recurrent dynamics of large-scale networks of excitatory and inhibitory neurons. Here we studied the emergence of functionally specific connectivity in large-scale recurrent networks with synaptic plasticity. Our results show that balanced random networks, which already exhibit highly selective responses at eye opening, can develop feature-specific connectivity if appropriate rules of synaptic plasticity are invoked within and between excitatory and inhibitory populations. If these conditions are met, the initial orientation selectivity guides the process of Hebbian learning and, as a result, functionally specific connectivity and a surplus of bidirectional connections emerge. Our results thus demonstrate the cooperation of synaptic plasticity and recurrent dynamics in large-scale functional networks with realistic receptive fields, highlight the role of inhibition as a critical element in this process, and pave the way for further computational studies of sensory processing in neocortical network models equipped with synaptic plasticity.
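
    As a toy illustration of the mechanism described above (not the paper's actual plasticity rules: it ignores inhibition and recurrent dynamics, and the population size, learning rate and tuning width below are arbitrary), a correlation-driven Hebbian update applied to neurons with fixed orientation preferences ends up strengthening connections between similarly tuned cells:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200                                    # toy excitatory population
pref = rng.uniform(0.0, np.pi, n)          # fixed orientation preferences
W = rng.uniform(0.0, 0.1, (n, n))          # initially unspecific connectivity
np.fill_diagonal(W, 0.0)

eta, w_max = 0.01, 1.0
for _ in range(500):                       # epochs of "visual experience"
    theta = rng.uniform(0.0, np.pi)        # stimulus orientation
    r = np.exp(np.cos(2.0 * (pref - theta)) / 0.5)   # tuning-curve responses
    r /= r.max()
    W += eta * np.outer(r, r)              # Hebbian: co-active pairs potentiate
    np.clip(W, 0.0, w_max, out=W)
    np.fill_diagonal(W, 0.0)

# connections between similarly tuned neurons end up systematically stronger
dpref = np.abs(pref[:, None] - pref[None, :])
dpref = np.minimum(dpref, np.pi - dpref)   # circular distance between preferences
similar = dpref < np.pi / 8.0
np.fill_diagonal(similar, False)
print("mean weight, similar preferences:   ", W[similar].mean())
print("mean weight, dissimilar preferences:", W[~similar].mean())
```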

    Learning as a phenomenon occurring in a critical state

    Full text link
    Recent physiological measurements have provided clear evidence of scale-free avalanche brain activity and EEG spectra, feeding the classical enigma of how such a chaotic system can ever learn or respond in a controlled and reproducible way. Models for learning, like neural networks or perceptrons, have traditionally avoided strong fluctuations. Conversely, we propose that brain activity with features typical of systems at a critical point is a crucial ingredient for learning. Here we present a study that provides novel insights towards understanding this problem. Our model is able to reproduce quantitatively the experimentally observed critical state of the brain and, at the same time, learns and remembers logical rules including the exclusive OR (XOR), which has posed difficulties for several previous attempts. We implement the model on a network with topological properties close to the functional network of real brains. Learning occurs via plastic adaptation of synaptic strengths and exhibits universal features. We find that the learning performance and the average time required to learn are controlled by the strength of plastic adaptation, in a way that is independent of the specific task assigned to the system. Even complex rules can be learned, provided that the plastic adaptation is sufficiently slow. Comment: 5 pages, 5 figures
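
    The authors' model is a critical network with plastic adaptation of synaptic strengths along active paths; the sketch below is a deliberately simplified, hypothetical stand-in in the same spirit (a random threshold network with a perceptron-style update restricted to synapses that carried activity, not the paper's model), showing only how the strength of the adaptation controls the number of trials needed to learn XOR:

```python
import numpy as np

rng = np.random.default_rng(1)

# XOR task
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0])

# fixed random layer of threshold ("firing") units providing a non-linear code
n_hidden = 32
W_in = rng.normal(0.0, 1.0, (2, n_hidden))
b_in = rng.normal(0.0, 1.0, n_hidden)
w0 = rng.normal(0.0, 1.0, n_hidden)        # unspecific initial readout synapses

def hidden(x):
    return (x @ W_in + b_in > 0).astype(float)

def sweeps_to_learn(alpha, max_sweeps=50000):
    """Adapt readout synapses only along paths that carried activity, at rate alpha.
    Returns the number of sweeps over the four patterns until XOR is reproduced."""
    w, b = w0.copy(), 0.0
    for sweep in range(1, max_sweeps + 1):
        errors = 0
        for x, t in zip(X, y):
            h = hidden(x)
            out = int(h @ w + b > 0)
            if out != t:
                errors += 1
                w += alpha * (t - out) * h   # only active synapses change
                b += alpha * (t - out)
        if errors == 0:
            return sweep
    return max_sweeps

for alpha in (0.5, 0.05, 0.005):
    print(f"adaptation strength {alpha:<5} -> XOR learned in {sweeps_to_learn(alpha)} sweeps")
```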

    Correlation-based model of artificially induced plasticity in motor cortex by a bidirectional brain-computer interface

    Full text link
    Experiments show that spike-triggered stimulation performed with bidirectional brain-computer interfaces (BBCIs) can artificially strengthen connections between separate neural sites in motor cortex (MC). What are the neuronal mechanisms responsible for these changes, and how does targeted stimulation by a BBCI shape population-level synaptic connectivity? The present work describes a recurrent neural network model with probabilistic spiking mechanisms and plastic synapses capable of capturing both the neural and the synaptic activity statistics relevant to BBCI conditioning protocols. When spikes from a neuron recorded at one MC site trigger stimuli at a second target site after a fixed delay, the connections between the sites are strengthened for spike-stimulus delays consistent with experimentally derived spike-timing-dependent plasticity (STDP) rules. However, the relationship between STDP mechanisms at the level of networks and their modification with neural implants remains poorly understood. Using our model, together with analytical derivations and novel experimental data, we successfully reproduce key experimental results. We then derive optimal operational regimes for BBCIs and formulate predictions concerning the efficacy of spike-triggered stimulation in different regimes of cortical activity. Comment: 35 pages, 9 figures
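
    The abstract refers to experimentally derived STDP rules relating the spike-stimulus delay to the change in connection strength. A common generic form of such a rule is the exponential STDP window, sketched below purely as an illustration; the amplitudes and time constants are placeholder values, not the paper's fitted parameters:

```python
import math

def stdp_dw(delta_t, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Weight change for an interval delta_t = t_post - t_pre (ms):
    pre-before-post (delta_t > 0) potentiates, post-before-pre depresses."""
    if delta_t >= 0:
        return a_plus * math.exp(-delta_t / tau_plus)
    return -a_minus * math.exp(delta_t / tau_minus)

# In a spike-triggered stimulation protocol, the chosen stimulation delay sets
# delta_t at synapses onto the stimulated target site; short positive delays
# strengthen the recorded-site -> target-site pathway, negative ones weaken it.
for delay in (5.0, 20.0, 50.0, -10.0):
    print(f"delta_t = {delay:6.1f} ms -> dw = {stdp_dw(delay):+.5f}")
```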

    Activity-dependent neuronal model on complex networks

    Get PDF
    Neuronal avalanches are a novel mode of activity in neuronal networks, experimentally found in vitro and in vivo, which exhibits robust critical behaviour: the avalanches are characterized by power-law distributions of size and duration, a feature found in other problems in the physics of complex systems. We present a recent model inspired by self-organized criticality, which consists of an electrical network with threshold firing, a refractory period and activity-dependent synaptic plasticity. The model reproduces the critical behaviour of the distributions of avalanche sizes and durations measured experimentally. Moreover, the power spectra of the electrical signal robustly reproduce the power-law behaviour found in human electroencephalogram (EEG) spectra. We implement this model on a variety of complex networks, i.e. regular, small-world and scale-free networks, and verify the robustness of the critical behaviour. Comment: 9 pages, 8 figures
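
    As a minimal sketch of the avalanche statistics described above (a critical branching process with fire-once propagation on a random network; this stand-in is our simplification and omits the model's refractory dynamics and activity-dependent plasticity, and the network size and connectivity are arbitrary), one can generate avalanches and inspect their size distribution:

```python
import numpy as np

rng = np.random.default_rng(0)

# random (Erdos-Renyi-like) network; the paper also considers regular,
# small-world and scale-free topologies
n, k_mean = 2000, 10
neighbors = [rng.choice(n, size=rng.poisson(k_mean), replace=False) for _ in range(n)]

def avalanche(p):
    """Seed one spike and let it propagate: each firing neuron excites each of its
    neighbours with probability p, and a neuron fires at most once per avalanche
    (a crude stand-in for the refractory period). Returns the avalanche size."""
    fired = np.zeros(n, dtype=bool)
    seed = int(rng.integers(n))
    fired[seed] = True
    front, size = [seed], 1
    while front:
        nxt = []
        for i in front:
            for j in neighbors[i]:
                if not fired[j] and rng.random() < p:
                    fired[j] = True
                    nxt.append(j)
                    size += 1
        front = nxt
    return size

# near the critical point (branching ratio p * k_mean ~ 1) avalanche sizes
# follow a broad, power-law-like distribution, as reported experimentally
sizes = np.array([avalanche(1.0 / k_mean) for _ in range(5000)])
bins = np.logspace(0.0, np.log10(sizes.max() + 1.0), 12)
counts, _ = np.histogram(sizes, bins=bins)
for lo, hi, c in zip(bins[:-1], bins[1:], counts):
    print(f"size {lo:8.1f} - {hi:8.1f}: {c:5d} avalanches")
```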

    Memory and information processing in neuromorphic systems

    Full text link
    A striking difference between brain-inspired neuromorphic processors and current von Neumann processor architectures is the way in which memory and processing are organized. As Information and Communication Technologies continue to address the need for increased computational power through growing numbers of cores within a digital processor, neuromorphic engineers and scientists can complement this approach by building processor architectures where memory is distributed with the processing. In this paper we present a survey of brain-inspired processor architectures that support models of cortical networks and deep neural networks. These architectures range from serial, clocked implementations of multi-neuron systems to massively parallel asynchronous ones, and from purely digital systems to mixed analog/digital systems which implement more biologically realistic models of neurons and synapses, together with a suite of adaptation and learning mechanisms analogous to those found in biological nervous systems. We describe the advantages of the different approaches being pursued and present the challenges that need to be addressed for building artificial neural processing systems that can display the richness of behaviours seen in biological systems. Comment: Submitted to Proceedings of the IEEE; review of recently proposed neuromorphic computing platforms and systems