Homogeneous Spiking Neuromorphic System for Real-World Pattern Recognition
A neuromorphic chip that combines CMOS analog spiking neurons and memristive
synapses offers a promising solution to brain-inspired computing, as it can
provide massive neural network parallelism and density. Previous hybrid analog
CMOS-memristor approaches required extensive CMOS circuitry for training, and
thus eliminated most of the density advantages gained by the adoption of
memristor synapses. Further, they used different waveforms for pre- and
post-synaptic spikes, which added undesirable circuit overhead. Here we describe
a hardware architecture that can feature a large number of memristor synapses
to learn real-world patterns. We present a versatile CMOS neuron that combines
integrate-and-fire behavior with the ability to drive passive memristors,
implements competitive learning in a compact circuit module, and enables in-situ
plasticity in the memristor synapses. We demonstrate handwritten-digit
recognition with the proposed architecture in transistor-level circuit
simulations. As the described neuromorphic architecture is homogeneous, it
realizes a fundamental building block for large-scale, energy-efficient,
brain-inspired silicon chips that could lead to next-generation cognitive
computing.
Comment: This is a preprint of an article accepted for publication in IEEE Journal on Emerging and Selected Topics in Circuits and Systems, vol. 5, no. 2, June 2015.
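To give a feel for the neuron dynamics the abstract refers to, below is a minimal behavioral sketch in Python (not the paper's transistor-level CMOS design) of a leaky integrate-and-fire neuron integrating synaptic current. The parameter values and the random input, which stands in for summed memristor currents, are illustrative assumptions.

```python
import numpy as np

# Minimal behavioral sketch of a leaky integrate-and-fire neuron. The
# parameters and the random drive (a stand-in for summed memristor
# currents) are illustrative, not taken from the paper's circuit.
def lif_step(v, i_syn, v_th=1.0, leak=0.05, dt=1.0):
    """One Euler step of a leaky integrate-and-fire membrane."""
    v = v + dt * (i_syn - leak * v)
    if v >= v_th:          # threshold crossing emits a spike
        return 0.0, True   # reset membrane, report spike
    return v, False

v, spikes = 0.0, []
rng = np.random.default_rng(0)
for t in range(100):
    i_syn = rng.uniform(0.0, 0.2)  # illustrative input current
    v, fired = lif_step(v, i_syn)
    if fired:
        spikes.append(t)
print(spikes)
```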
Hardware-Amenable Structural Learning for Spike-based Pattern Classification using a Simple Model of Active Dendrites
This paper presents a spike-based model which employs neurons with
functionally distinct dendritic compartments for classifying high dimensional
binary patterns. The synaptic inputs arriving on each dendritic subunit are
nonlinearly processed before being linearly integrated at the soma, giving the
neuron the capacity to perform a large number of input-output mappings. The model
utilizes sparse synaptic connectivity, where each synapse takes a binary value.
The optimal connection pattern of a neuron is learned by using a simple
hardware-friendly, margin enhancing learning algorithm inspired by the
mechanism of structural plasticity in biological neurons. The learning
algorithm groups correlated synaptic inputs on the same dendritic branch. Since
the learning results in modified connection patterns, it can be incorporated
into current event-based neuromorphic systems with little overhead. This work
also presents a branch-specific spike-based version of this structural
plasticity rule. The proposed model is evaluated on benchmark binary
classification problems and its performance is compared against that achieved
using Support Vector Machine (SVM) and Extreme Learning Machine (ELM)
techniques. Our proposed method attains comparable performance while utilizing
10% to 50% fewer computational resources than the other reported techniques.
Comment: Accepted for publication in Neural Computation.
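A minimal sketch of the neuron model class described above: binary inputs are routed to dendritic branches, each branch sum passes through a nonlinearity, and the soma linearly integrates the branch outputs. The squaring nonlinearity and the random sparse binary connectivity here are illustrative stand-ins for the paper's learned connection patterns.

```python
import numpy as np

# Sketch of a neuron with nonlinear dendritic subunits: each branch
# connects to a sparse binary subset of the inputs, applies a
# nonlinearity (squaring, an illustrative choice), and the soma sums
# the branch outputs linearly.
rng = np.random.default_rng(1)
n_inputs, n_branches, syn_per_branch = 100, 10, 5

# random sparse binary connectivity; structural plasticity would
# instead learn which inputs share a branch
conn = np.array([rng.choice(n_inputs, syn_per_branch, replace=False)
                 for _ in range(n_branches)])

def neuron_output(x):
    """x: binary input pattern of length n_inputs."""
    branch_sums = x[conn].sum(axis=1)       # linear sum within each branch
    return float((branch_sums ** 2).sum())  # nonlinear branch -> linear soma

x = rng.integers(0, 2, n_inputs)
print(neuron_output(x))
```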
Real time unsupervised learning of visual stimuli in neuromorphic VLSI systems
Neuromorphic chips embody, in microelectronic devices, computational
principles operating in the nervous system. In this domain it is important to
identify computational primitives that theory and experiments suggest as
generic and reusable cognitive elements. One such element is provided by
attractor dynamics in recurrent networks. Point attractors are equilibrium
states of the dynamics (up to fluctuations), determined by the synaptic
structure of the network; a 'basin' of attraction comprises all initial states
leading to a given attractor upon relaxation, hence making attractor dynamics
suitable to implement robust associative memory. The initial network state is
dictated by the stimulus, and relaxation to the attractor state implements the
retrieval of the corresponding memorized prototypical pattern. In a previous
work we demonstrated that a neuromorphic recurrent network of spiking neurons
and suitably chosen, fixed synapses supports attractor dynamics. Here we focus
on learning: activating on-chip synaptic plasticity and using a theory-driven
strategy for choosing network parameters, we show that autonomous learning,
following repeated presentation of simple visual stimuli, shapes a synaptic
connectivity supporting stimulus-selective attractors. Associative memory
develops on chip as the result of the coupled stimulus-driven neural activity
and ensuing synaptic dynamics, with no artificial separation between learning
and retrieval phases.
Comment: Submitted to Scientific Reports.
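The attractor-retrieval idea can be illustrated in software with a Hopfield-style network: Hebbian outer-product weights define the point attractors, and relaxation from a corrupted cue implements associative recall. This sketch uses binary units rather than the chip's spiking neurons and on-chip plastic synapses; sizes and the corruption level are illustrative.

```python
import numpy as np

# Hopfield-style illustration of point attractors and basins of
# attraction: store +/-1 patterns with Hebbian outer-product weights,
# then relax a corrupted cue back to the stored prototype.
rng = np.random.default_rng(2)
n, n_patterns = 64, 3
patterns = rng.choice([-1, 1], size=(n_patterns, n))

W = (patterns.T @ patterns) / n  # Hebbian storage
np.fill_diagonal(W, 0.0)

def relax(state, steps=20):
    """Iterate the dynamics; the state falls into the nearest attractor."""
    for _ in range(steps):
        state = np.sign(W @ state)
        state[state == 0] = 1
    return state

cue = patterns[0].copy()
cue[:10] *= -1                       # corrupt part of the stored pattern
overlap = (relax(cue) @ patterns[0]) / n
print(overlap)                       # near 1.0 indicates successful recall
```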
Line-start permanent-magnet motor single-phase steady-state performance analysis
This paper describes an efficient calculating procedure for the steady-state operation of a single-phase line-start capacitor-run permanent-magnet motor. This class of motor is beginning to be applied in hermetic refrigerator compressors as a high-efficiency alternative to either a plain induction motor or a full inverter-fed drive. The calculation relies on a combination of reference-frame transformations, including symmetrical components to cope with imbalance and dq axes to cope with saliency. Computed results are compared with test data. The agreement is generally good, especially in describing the broad properties of the motor. However, it is shown that certain important effects are beyond the reach of simple circuit analysis and require a more complex numerical analysis method.
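As an illustration of the dq step mentioned above, the following sketch applies the standard Park transformation to three-phase quantities. Note this is a generic illustration: the paper's actual single-phase procedure additionally folds in symmetrical components to handle imbalance, which is not reproduced here.

```python
import numpy as np

# Standard (2/3-scaled) Park transformation: project phase quantities
# onto rotor d and q axes at angle theta. This illustrates the
# "dq axes to cope with saliency" step in generic three-phase form.
def abc_to_dq(a, b, c, theta):
    d = (2/3) * (a*np.cos(theta) + b*np.cos(theta - 2*np.pi/3)
                 + c*np.cos(theta + 2*np.pi/3))
    q = -(2/3) * (a*np.sin(theta) + b*np.sin(theta - 2*np.pi/3)
                  + c*np.sin(theta + 2*np.pi/3))
    return d, q

# a balanced sinusoidal set maps to a constant dq vector
t = np.linspace(0, 0.04, 5)
theta = 2*np.pi*50*t
ia = np.cos(theta)
ib = np.cos(theta - 2*np.pi/3)
ic = np.cos(theta + 2*np.pi/3)
print(abc_to_dq(ia, ib, ic, theta))  # d ~ 1, q ~ 0 at every instant
```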
Six networks on a universal neuromorphic computing substrate
In this study, we present a highly configurable neuromorphic computing substrate and use it for emulating several types of neural networks. At the heart of this system lies a mixed-signal chip, with analog implementations of neurons and synapses and digital transmission of action potentials. Major advantages of this emulation device, which has been explicitly designed as a universal neural network emulator, are its inherent parallelism and high acceleration factor compared to conventional computers. Its configurability allows the realization of almost arbitrary network topologies and the use of widely varied neuronal and synaptic parameters. Fixed-pattern noise inherent to analog circuitry is reduced by calibration routines. An integrated development environment allows neuroscientists to operate the device without any prior knowledge of neuromorphic circuit design. As a showcase for the capabilities of the system, we describe the successful emulation of six different neural networks which cover a broad spectrum of both structure and functionality.
A differential memristive synapse circuit for on-line learning in neuromorphic computing systems
Spike-based learning with memristive devices in neuromorphic computing
architectures typically uses learning circuits that require overlapping pulses
from pre- and post-synaptic nodes. This imposes severe constraints on the
length of the pulses transmitted in the network, and on the network's
throughput. Furthermore, most of these circuits do not decouple the current
flowing through the memristive device from the one stimulating the target neuron.
This can be a problem when using devices with high conductance values, because
of the resulting large currents. In this paper we propose a novel circuit that
decouples the current produced by the memristive device from the one used to
stimulate the post-synaptic neuron, by using a novel differential scheme based
on the Gilbert normalizer circuit. We show how this circuit is useful for
reducing the effect of variability in the memristive devices, and how it is
ideally suited for spike-based learning mechanisms that do not require
overlapping pre- and post-synaptic pulses. We demonstrate the features of the
proposed synapse circuit with SPICE simulations, and validate its learning
properties with high-level behavioral network simulations which use a
stochastic gradient descent learning rule in two classification tasks.
Comment: 18 pages main text, 9 pages of supplementary text, 19 figures.
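A behavioral sketch of the differential idea: the synaptic weight is read out as a normalized difference of two memristive conductances, so a bias current, rather than the absolute device conductances, sets the output scale, suppressing common-mode device variability. The ideal normalizer transfer function used here is an assumption inspired by the Gilbert-normalizer description, not a transistor-level model.

```python
import numpy as np

# Idealized differential/normalizing readout: output current depends on
# the normalized conductance difference and is scaled by a bias current,
# decoupling it from absolute device conductance. Values are illustrative.
def differential_synapse(g_plus, g_minus, i_bias=1e-9):
    """Output current of an ideal normalizer driven by two conductances."""
    return i_bias * (g_plus - g_minus) / (g_plus + g_minus)

rng = np.random.default_rng(3)
# same nominal weight, but devices spanning three decades of conductance
for scale in (1e-6, 1e-5, 1e-4):  # siemens, illustrative values
    g_p = scale * (1.5 + 0.1 * rng.standard_normal())
    g_m = scale * (0.5 + 0.1 * rng.standard_normal())
    print(differential_synapse(g_p, g_m))  # stays near 0.5 * i_bias
```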
Dopaminergic Regulation of Neuronal Circuits in Prefrontal Cortex
Neuromodulators, like dopamine, have considerable influence on the
processing capabilities of neural networks. This has, for instance, been
shown in the working memory functions of prefrontal cortex, which may be
regulated by altering the dopamine level. Experimental work provides
evidence on the biochemical and electrophysiological actions of dopamine
receptors, but there are few theories concerning their significance for
computational properties (ServanPrintzCohen90, Hasselmo94). We point to
experimental data on neuromodulatory regulation of temporal properties of
excitatory neurons and depolarization of inhibitory neurons, and suggest
computational models employing these effects. Changes in membrane potential
may be modelled by the firing threshold, and temporal properties by a
parameterization of neuronal responsiveness according to the preceding
spike interval. We apply these concepts to two examples using spiking
neural networks. In the first case, a change in the input synchronization
of neuronal groups leads to changes in the formation of synchronized
neuronal ensembles. In the second case, the threshold of interneurons
influences lateral inhibition and the switch from a winner-take-all network
to a parallel feedforward mode of processing. Both concepts are interesting
for the modeling of cognitive functions and may have explanatory power for
behavioral changes associated with dopamine regulation.
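A toy spiking-neuron sketch of the two modeling ideas just described: the dopamine level shifts the firing threshold, and responsiveness is scaled by a factor depending on the preceding inter-spike interval. The functional forms and constants are illustrative assumptions, not the paper's model.

```python
import numpy as np

# Toy neuron: dopamine lowers the firing threshold; responsiveness
# recovers with the time since the last spike (preceding spike interval).
# All functional forms and constants are illustrative assumptions.
def simulate(input_current, dopamine=0.0, dt=1.0):
    v, last_spike, spikes = 0.0, -np.inf, []
    theta = 1.0 - 0.3 * dopamine           # threshold modulated by dopamine
    for t, i_in in enumerate(input_current):
        isi = t - last_spike
        gain = 1.0 - np.exp(-isi / 5.0)    # reduced responsiveness after a spike
        v += dt * (gain * i_in - 0.1 * v)  # leaky integration
        if v >= theta:
            spikes.append(t)
            v, last_spike = 0.0, t
    return spikes

drive = np.full(200, 0.15)
# higher dopamine level -> lower threshold -> higher firing rate
print(len(simulate(drive, dopamine=0.0)), len(simulate(drive, dopamine=1.0)))
```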