4 research outputs found

    Visual attention using spiking neural maps

    Visual attention is a mechanism that biological systems have developed to reduce the large amount of visual information they receive, in order to efficiently perform tasks such as learning, recognition, and tracking. In this paper we describe a simple spiking neural network model that is able to detect, focus on, and track a stimulus even in the presence of noise or distractors. Instead of using a regular rate-coding neuron model based on continuum neural field theory (CNFT), we propose to use a time-based code by means of a network composed of leaky integrate-and-fire (LIF) neurons. The proposal is experimentally compared against the usual CNFT-based model.
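The core dynamics behind the model above are those of the leaky integrate-and-fire neuron: the membrane potential leaks toward rest, integrates input current, and emits a spike on crossing a threshold. A minimal sketch follows; all parameter values (time constant, threshold, input strength) are illustrative placeholders, not values from the paper.

```python
import numpy as np

def simulate_lif(input_current, dt=1.0, tau=20.0, v_rest=0.0,
                 v_thresh=1.0, v_reset=0.0):
    """Simulate one leaky integrate-and-fire neuron.

    Euler integration of dv/dt = (v_rest - v + I) / tau;
    a spike is recorded when v crosses v_thresh, then v resets.
    """
    v = v_rest
    spike_times = []
    for t, i_in in enumerate(input_current):
        v += dt * (v_rest - v + i_in) / tau
        if v >= v_thresh:
            spike_times.append(t)
            v = v_reset
    return spike_times

# A constant supra-threshold input (steady state 1.5 > threshold 1.0)
# produces regular, repeated spiking.
spikes = simulate_lif(np.full(200, 1.5))
```

A population of such neurons wired with local excitation and broader inhibition, as in CNFT-style fields, yields the focus/tracking bubble the abstract describes.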

    Novelty detection with self-organizing maps for autonomous extraction of salient tracking features

    In the image processing field, many tracking algorithms rely on prior knowledge such as color or shape, or even need a database of the objects to be tracked. This may be a problem for real-world applications that cannot fulfill those prerequisites. Building on image compression techniques, we propose to use Self-Organizing Maps to robustly detect novelty in the input video stream and to produce a saliency map that outlines unusual objects in the visual environment. This saliency map is then processed by a Dynamic Neural Field to extract a robust and continuous estimate of the object's position. Our approach is based solely on unsupervised neural networks and does not need any prior knowledge; it therefore adapts readily to different inputs and is strongly robust to noisy environments.
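The novelty signal described above can be sketched as a SOM's quantization error: patches the map has learned to compress reconstruct well, while unusual patches sit far from every prototype and score as salient. The tiny SOM below (grid size, learning rate, neighborhood width, and the synthetic data are all illustrative assumptions, not the paper's setup) shows the principle.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_som(data, grid=(4, 4), epochs=20, lr=0.5, sigma=1.0):
    """Train a tiny Self-Organizing Map with a Gaussian neighborhood.
    Hyperparameters are illustrative placeholders."""
    n_units = grid[0] * grid[1]
    weights = rng.random((n_units, data.shape[1]))
    coords = np.array([(i, j) for i in range(grid[0])
                              for j in range(grid[1])], dtype=float)
    for _ in range(epochs):
        for x in data:
            bmu = np.argmin(((weights - x) ** 2).sum(axis=1))
            d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
            h = np.exp(-d2 / (2 * sigma ** 2))       # neighborhood function
            weights += lr * h[:, None] * (x - weights)
    return weights

def novelty(weights, x):
    """Quantization error: distance from x to its best-matching unit.
    A high error means the input is poorly modeled, i.e. novel/salient."""
    return np.sqrt(((weights - x) ** 2).sum(axis=1).min())

# Patches from the 'usual' distribution vs. an outlier patch.
usual = rng.normal(0.5, 0.05, size=(200, 8))
w = train_som(usual)
familiar_score = novelty(w, usual[0])
novel_score = novelty(w, np.full(8, 5.0))  # far outside the training data
```

Computing this error per image patch yields a saliency map, which a Dynamic Neural Field can then integrate into a single stable tracking bubble.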

    Neural and Synaptic Array Transceiver: A Brain-Inspired Computing Framework for Embedded Learning

    Get PDF
    Embedded, continual learning for autonomous and adaptive behavior is a key application of neuromorphic hardware. However, neuromorphic implementations of embedded learning at large scale that are both flexible and efficient have been hindered by the lack of a suitable algorithmic framework. As a result, most neuromorphic hardware is trained off-line on large clusters of dedicated processors or GPUs, with the result transferred post hoc to the device. We address this by introducing the Neural and Synaptic Array Transceiver (NSAT), a neuromorphic computational framework that facilitates flexible and efficient embedded learning by matching algorithmic requirements to neural and synaptic dynamics. NSAT supports event-driven supervised, unsupervised, and reinforcement learning algorithms, including deep learning. We demonstrate NSAT on a wide range of tasks, including simulation of the Mihalas-Niebur neuron, dynamic neural fields, event-driven random back-propagation for event-based deep learning, event-based contrastive divergence for unsupervised learning, and voltage-based learning rules for sequence learning. We anticipate that this contribution will establish the foundation for a new generation of devices enabling adaptive mobile systems, wearable devices, and robots with data-driven autonomy.
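The "event-driven" principle underlying frameworks like NSAT is that neural state is updated only when a spike arrives, with the leak between events computed analytically instead of integrating every timestep. The sketch below is a generic illustration of that update scheme, not NSAT's actual implementation; event times, weights, and constants are made-up examples.

```python
import math

def event_driven_lif(events, tau=20.0, v_thresh=1.0):
    """Event-driven LIF update: recompute the membrane potential only
    when an input spike (time, weight) arrives, applying the exponential
    leak accumulated since the previous event in closed form.
    Generic sketch; not the NSAT implementation."""
    v, last_t = 0.0, 0.0
    out_spikes = []
    for t, w in sorted(events):
        v *= math.exp(-(t - last_t) / tau)  # analytic leak between events
        v += w                              # integrate the incoming spike
        last_t = t
        if v >= v_thresh:
            out_spikes.append(t)
            v = 0.0                         # reset after firing
    return out_spikes

# Two closely spaced inputs sum past threshold; an isolated late one does not.
out = event_driven_lif([(1.0, 0.6), (2.0, 0.6), (80.0, 0.6)])
```

Skipping the quiet intervals is what makes this style of computation efficient on sparse spiking data, and hence attractive for embedded, always-on hardware.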