162 research outputs found
Learning and discrimination through STDP in a top-down modulated associative memory
This article highlights the learning and discrimination capabilities of a
model of associative memory based on artificial networks of spiking neurons.
Inspired by neuropsychology and neurobiology, the model implements top-down
modulations, as in neocortical layer V pyramidal neurons, with a learning rule
based on spike-timing-dependent plasticity (STDP), for performing a multimodal
association learning task. A temporal correlation analysis demonstrates the
model's ability to associate specific activity patterns with different
stimulation samples. Even in the absence of initial learning and with
continuously varying weights, the activity patterns become stable enough for
discrimination.
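The pair-based STDP rule at the heart of such models can be sketched minimally as follows; the exponential time window and all parameter values are generic textbook choices, not those of the paper:

```python
import numpy as np

def stdp_update(w, dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP: potentiate when the presynaptic spike precedes the
    postsynaptic one (dt = t_post - t_pre > 0, in ms), depress otherwise.
    Weight is kept in [0, 1]. Illustrative values, not the paper's."""
    if dt > 0:
        dw = a_plus * np.exp(-dt / tau)   # pre before post: potentiation
    else:
        dw = -a_minus * np.exp(dt / tau)  # post before pre: depression
    return np.clip(w + dw, 0.0, 1.0)
```

Repeated application of such updates is what lets correlated input patterns carve out stable, discriminable activity patterns.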
An associative memory for the on-line recognition and prediction of temporal sequences
This paper presents the design of an associative memory with feedback that is
capable of on-line temporal sequence learning. A framework for on-line sequence
learning has been proposed, and different sequence learning models have been
analysed according to this framework. The network model is an associative
memory with a separate store for the sequence context of a symbol. A sparse
distributed memory is used to gain scalability. The context store combines the
functionality of a neural layer with a shift register. The sensitivity of the
machine to the sequence context is controllable, resulting in different
characteristic behaviours. The model can store and predict on-line sequences of
various types and lengths. Numerical simulations have been carried out to
determine its properties.
Comment: Published in IJCNN 2005, Montreal, Canada
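A context store that pairs a lookup table with a shift register over recent symbols, in the spirit of the model described above, can be sketched roughly like this; all class and method names are hypothetical, and this is not the paper's sparse distributed memory:

```python
from collections import deque

class ContextStore:
    """Fixed-depth shift register over the recent symbol history;
    each new symbol shifts the window by one (hypothetical names)."""
    def __init__(self, depth=3):
        self.depth = depth
        self.window = deque(maxlen=depth)

    def shift_in(self, symbol):
        self.window.append(symbol)
        return tuple(self.window)

class SequenceMemory:
    """Associates a shift-register context with the next symbol,
    enabling on-line prediction of learned sequences."""
    def __init__(self, depth=3):
        self.depth = depth
        self.table = {}

    def train(self, sequence):
        ctx = ContextStore(self.depth)
        for prev, nxt in zip(sequence, sequence[1:]):
            key = ctx.shift_in(prev)
            self.table[key] = nxt

    def predict(self, prefix):
        ctx = ContextStore(self.depth)
        key = None
        for s in prefix:
            key = ctx.shift_in(s)
        return self.table.get(key)
```

The `depth` parameter plays the role of the controllable context sensitivity: a deeper register disambiguates more overlapping sequences at the cost of generalisation.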
Fast and Efficient Information Transmission with Burst Spikes in Deep Spiking Neural Networks
Spiking neural networks (SNNs) are considered among the most promising
artificial neural networks due to their energy-efficient computing capability.
Recently, conversion of a trained deep neural network to an SNN has improved
the accuracy of deep SNNs. However, most previous studies have not achieved
satisfactory results in terms of inference speed and energy efficiency. In this
paper, we propose a fast and energy-efficient information transmission method
with burst spikes and a hybrid neural coding scheme in deep SNNs. Our
experimental results show that the proposed method can improve inference energy
efficiency and shorten latency.
Comment: Accepted to DAC 201
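One simple way to picture burst coding is to let the spike count of a burst carry an activation's magnitude, so large activations are transmitted in one timestep instead of many; this illustrative sketch is not the paper's actual scheme, and all parameter values are assumptions:

```python
import math

def burst_encode(activation, max_spikes=5, threshold=0.1):
    """Map a real-valued activation to a burst of up to max_spikes spikes
    in a single timestep; activations at or below threshold emit nothing.
    Illustrative encoding only, not the paper's method."""
    if activation <= threshold:
        return 0
    # Number of spikes grows with activation, capped at the burst limit.
    return min(max_spikes, math.ceil(activation / threshold) - 1)
```

Compared with rate coding, where the same magnitude would need many timesteps of single spikes, packing it into a burst is what shortens inference latency.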
Director Field Model of the Primary Visual Cortex for Contour Detection
We aim to build the simplest possible model capable of detecting long, noisy
contours in a cluttered visual scene. For this, we model the neural dynamics in
the primate primary visual cortex in terms of a continuous director field that
describes the average rate and the average orientational preference of active
neurons at a particular point in the cortex. We then use a linear-nonlinear
dynamical model with long range connectivity patterns to enforce long-range
statistical context present in the analyzed images. The resulting model has
substantially fewer degrees of freedom than traditional models, and yet it can
distinguish large contiguous objects from the background clutter by suppressing
the clutter and by filling-in occluded elements of object contours. This
results in high-precision, high-recall detection of large objects in cluttered
scenes. Parenthetically, our model has a direct correspondence with the Landau
- de Gennes theory of nematic liquid crystal in two dimensions.Comment: 9 pages, 7 figure
Multi-layered Spiking Neural Network with Target Timestamp Threshold Adaptation and STDP
Spiking neural networks (SNNs) are good candidates to produce
ultra-energy-efficient hardware. However, the performance of these models is
currently behind traditional methods. Introducing multi-layered SNNs is a
promising way to reduce this gap. We propose in this paper a new threshold
adaptation system which uses a timestamp objective at which neurons should
fire. We show that our method leads to state-of-the-art classification rates on
the MNIST dataset (98.60%) and the Faces/Motorbikes dataset (99.46%) with an
unsupervised SNN followed by a linear SVM. We also investigate the sparsity
level of the network by testing different inhibition policies and STDP rules.
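A threshold adaptation driven by a target firing timestamp, as in the abstract above, could look roughly like this; the specific update rule and learning rate are illustrative assumptions, not the paper's exact method:

```python
def adapt_threshold(threshold, fire_time, target_time, lr=0.01):
    """Nudge a neuron's firing threshold so its spike time moves toward
    a target timestamp: firing too early raises the threshold (harder to
    fire), firing too late lowers it. Illustrative sketch only."""
    if fire_time < target_time:
        return threshold + lr * (target_time - fire_time)
    return threshold - lr * (fire_time - target_time)
```

Driving every neuron toward a common target timestamp is one way to keep activity sparse while still guaranteeing that each neuron fires often enough to learn.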
Hardware design of LIF with Latency neuron model with memristive STDP synapses
In this paper, the hardware implementation of a neuromorphic system is
presented. The system is composed of a Leaky Integrate-and-Fire with Latency
(LIFL) neuron and a Spike-Timing Dependent Plasticity (STDP) synapse. The LIFL
neuron model can encode more information than the common Integrate-and-Fire
models typically considered for neuromorphic implementations. In our system,
the LIFL neuron is implemented using CMOS circuits, while a memristor is used
for the implementation of the STDP synapse. A description of the entire circuit
is provided. Finally, the capabilities of the proposed architecture have been
evaluated by simulating a motif composed of three neurons and two synapses. The
simulation results confirm the validity of the proposed system and its
suitability for the design of more complex spiking neural networks.
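For reference, a plain discrete-time leaky integrate-and-fire neuron (the base model that the LIFL variant extends) can be simulated in a few lines; the parameter values are arbitrary and this does not model the latency mechanism of LIFL:

```python
def simulate_lif(input_current, dt=1.0, tau=20.0, v_rest=0.0,
                 v_thresh=1.0, v_reset=0.0):
    """Discrete-time leaky integrate-and-fire neuron: the membrane
    potential leaks toward rest, integrates input, and emits a spike
    (then resets) on crossing the threshold. Returns spike steps."""
    v = v_rest
    spikes = []
    for t, i in enumerate(input_current):
        v += dt / tau * (-(v - v_rest) + i)  # leak + integrate
        if v >= v_thresh:
            spikes.append(t)
            v = v_reset
    return spikes
```

With weak input the potential saturates below threshold and the neuron stays silent, which is the leak behaviour a hardware implementation must reproduce.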