1,161 research outputs found
Perspective: Organic electronic materials and devices for neuromorphic engineering
Neuromorphic computing and engineering have been the focus of intense research
efforts, recently intensified by the transformation of Information and
Communication Technologies (ICT). New computing solutions and new hardware
platforms are expected to emerge to meet the new needs and challenges of our
societies. In this revolution, many candidate technologies are being explored,
and their pros and cons will have to be weighed. In this perspective paper,
belonging to the special issue on neuromorphic engineering of the Journal of
Applied Physics, we focus on the current achievements in the field of organic
electronics and the potential and specificities of this research field. We
highlight how unique material features available through organic materials can
be used to engineer useful and promising bioinspired devices and circuits. We
also discuss the opportunities that organic electronics offers for future
research directions in the neuromorphic engineering field.
Using FPGA for visuo-motor control with a silicon retina and a humanoid robot
The address-event representation (AER) is a
neuromorphic communication protocol for transferring
asynchronous events between VLSI chips. The event
information is transferred using a high-speed digital parallel bus. This paper
presents an experiment based on AER for visual sensing, processing and,
finally, actuating a robot. The AER output of a silicon retina is processed by
an AER filter implemented on an FPGA to produce mimicking behaviour in a
humanoid robot (the RoboSapiens V2). We have implemented the visual filter on
the Spartan II FPGA of the USB-AER platform and the Central Pattern Generator
(CPG) on the Spartan 3 FPGA of the AER-Robot platform, both developed by the
authors.
Unión Europea IST-2001-34124 (CAVIAR); Ministerio de Ciencia y Tecnología TIC-2003-08164-C03-0
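The address-event idea described above can be illustrated with a minimal sketch: each spike is transmitted as the address of the emitting pixel, packed here into a single integer word. The field layout (7-bit x and y plus a polarity bit, as for a 128x128 retina) is an assumption for illustration only; real AER chips define their own address formats.

```python
# Minimal sketch of Address-Event-Representation (AER) word packing.
# Assumed layout: 7-bit x, 7-bit y, 1 polarity bit (128x128 sensor).
# Real AER devices define their own, possibly different, formats.

def encode_event(x, y, polarity):
    """Pack a pixel event into a single AER address word."""
    return (y << 8) | (x << 1) | polarity

def decode_event(address):
    """Unpack an AER address word back into (x, y, polarity)."""
    polarity = address & 0x1
    x = (address >> 1) & 0x7F
    y = (address >> 8) & 0x7F
    return x, y, polarity
```

On a real bus, only the encoded word travels between chips; the receiver reconstructs which neuron fired, and when, from the address and its arrival time.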
Frontiers in Neuromorphic Engineering
Neurobiological processing systems are remarkable computational devices. They use slow, stochastic, and inhomogeneous computing elements and yet they outperform today’s most powerful computers at tasks such as vision, audition, and motor control, tasks that we perform nearly every moment that we are awake without much conscious thought or concern. Despite the vast amount of resources dedicated to the research and development of computing, information, and communication technologies, today’s fastest and largest computers are still not able to match biological systems at robustly accomplishing real-world tasks.
Bio-Inspired Stereo Vision Calibration for Dynamic Vision Sensors
Many advances have been made in the field of computer vision. Several recent research trends
have focused on mimicking human vision by using a stereo vision system. In multi-camera systems, a
calibration process is usually implemented to improve the accuracy of the results. However, these systems
generate a large amount of data to be processed; therefore, a powerful computer is required and, in many
cases, the processing cannot be done in real time. Neuromorphic Engineering attempts to create bio-inspired
systems that mimic the information processing that takes place in the human brain. This information is
encoded using pulses (or spikes), and the resulting systems are much simpler (in computational operations
and resources), which allows them to perform similar tasks with much lower power consumption; these
processes can thus be implemented on specialized hardware with real-time processing. In this work, a
bio-inspired stereo vision system is presented, and a calibration mechanism for this system is implemented
and evaluated using several tests. The result is a novel calibration technique for a neuromorphic stereo
vision system, implemented on specialized hardware (an FPGA, Field-Programmable Gate Array), which
achieves reduced latencies for stand-alone systems working in real time.
Ministerio de Economía y Competitividad TEC2016-77785-P; Ministerio de Economía y Competitividad TIN2016-80644-
A short curriculum of the robotics and technology of computer lab
Our research Lab is directed by Prof. Anton Civit. It is an interdisciplinary group of 23
researchers who carry out their teaching and research work at the Escuela
Politécnica Superior (Higher Polytechnic School) and the Escuela de Ingeniería
Informática (Computer Engineering School). The main research fields are: a)
industrial and mobile robotics; b) neuro-inspired processing using electronic spikes;
c) embedded and real-time systems; d) parallel and massive processing computer
architectures; e) information technologies for rehabilitation, disability and the
elderly; f) Web accessibility and usability.
In this paper, the Lab history is presented and its main publications and research
projects over the last few years are summarized.
Converting Static Image Datasets to Spiking Neuromorphic Datasets Using Saccades
Creating datasets for Neuromorphic Vision is a challenging task. A lack of
available recordings from Neuromorphic Vision sensors means that data must
typically be recorded specifically for dataset creation rather than collecting
and labelling existing data. The task is further complicated by a desire to
simultaneously provide traditional frame-based recordings to allow for direct
comparison with traditional Computer Vision algorithms. Here we propose a
method for converting existing Computer Vision static image datasets into
Neuromorphic Vision datasets using an actuated pan-tilt camera platform. Moving
the sensor rather than the scene or image is a more biologically realistic
approach to sensing and eliminates timing artifacts introduced by monitor
updates when simulating motion on a computer monitor. We present conversion of
two popular image datasets (MNIST and Caltech101) which have played important
roles in the development of Computer Vision, and we provide performance metrics
on these datasets using spike-based recognition algorithms. This work
contributes datasets for future use in the field, as well as results from
spike-based algorithms against which future works can compare. Furthermore, by
converting datasets already popular in Computer Vision, we enable more direct
comparison with frame-based approaches.
Comment: 10 pages, 6 figures; in Frontiers in Neuromorphic Engineering, special
topic on Benchmarks and Challenges for Neuromorphic Engineering, 2015 (under
review).
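The recording approach above moves a real sensor on a pan-tilt platform. As a rough software analogue only, a DVS-style event stream is often emulated from a pair of frames by thresholding log-intensity changes; the sketch below illustrates that idea and is not the authors' recording method (the threshold value is an arbitrary assumption).

```python
import numpy as np

# Hedged sketch: emulating DVS-style events from two frames by
# thresholding log-intensity changes. This is a common software
# approximation, not the paper's method (which moves a real
# sensor over the static image to generate events).

def frames_to_events(prev, curr, threshold=0.2):
    """Return (x, y, polarity) events where log intensity changed."""
    eps = 1e-6                              # avoid log(0)
    diff = np.log(curr + eps) - np.log(prev + eps)
    on = np.argwhere(diff > threshold)      # brightness increased
    off = np.argwhere(diff < -threshold)    # brightness decreased
    events = [(int(x), int(y), 1) for y, x in on]
    events += [(int(x), int(y), 0) for y, x in off]
    return events
```

Software emulation of this kind inherits the timing quantization of the source frames, which is exactly the artifact the actuated-sensor approach avoids.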
Six networks on a universal neuromorphic computing substrate
In this study, we present a highly configurable neuromorphic computing substrate and use it for emulating several types of neural networks. At the heart of this system lies a mixed-signal chip, with analog implementations of neurons and synapses and digital transmission of action potentials. Major advantages of this emulation device, which has been explicitly designed as a universal neural network emulator, are its inherent parallelism and high acceleration factor compared to conventional computers. Its configurability allows the realization of almost arbitrary network topologies and the use of widely varied neuronal and synaptic parameters. Fixed-pattern noise inherent to analog circuitry is reduced by calibration routines. An integrated development environment allows neuroscientists to operate the device without any prior knowledge of neuromorphic circuit design. As a showcase for the capabilities of the system, we describe the successful emulation of six different neural networks which cover a broad spectrum of both structure and functionality.
Neuro-inspired system for real-time vision sensor tilt correction
Neuromorphic engineering tries to mimic biological
information processing. Address-Event-Representation (AER)
is an asynchronous protocol for transferring the information of
spiking neuro-inspired systems. Currently, AER systems are able
to sense visual and auditory stimuli, to process information, to
learn, to control robots, etc. In this paper we present an AER-based
layer able to correct in real time the tilt of an AER vision
sensor, using a high-speed algorithmic mapping layer. A codesign
platform (the AER-Robot platform), with a Xilinx
Spartan 3 FPGA and an 8051 USB microcontroller, has been
used to implement the system. It was tested with the help of the
USBAERmini2 board and the jAER software.
Junta de Andalucía P06-TIC-01417; Ministerio de Educación y Ciencia TEC2006-11730-C03-02; Ministerio de Ciencia e Innovación TEC2009-10639-C04-0
Frequency Analysis of a 64x64 Pixel Retinomorphic System with AER Output to Estimate the Limits to Apply onto Specific Mechanical Environment
The rods and cones of a human retina constantly sense light and transmit it in
the form of spikes to the cortex so that the brain can reproduce an image.
Delbruck’s lab has designed and manufactured several generations of spike-based
image sensors that mimic the human retina. In this paper we present an
exhaustive timing analysis of the Address-Event-Representation (AER) output of
a 64x64-pixel silicon retinomorphic system. Two different scenarios are
presented in order to obtain the maximum frequency of light changes for a pixel
sensor and the maximum frequency of requested addresses on the output AER. The
results obtained are 100 Hz and 1.66 MHz, respectively. We have tested the
upper spin limit and found it to be approximately 6000 rpm (revolutions per
minute); in some cases with high light contrast, no events are lost.
Ministerio de Ciencia e Innovación TEC2009-10639-C04-0
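The two quoted figures are mutually consistent under a simple assumption: if a pixel can follow at most 100 Hz of light change and a rotating stimulus produces one light cycle per pixel per revolution, the spin limit follows directly. The arithmetic check below is our reading of the numbers, not a derivation taken from the paper.

```python
# Back-of-the-envelope check of the abstract's figures, under the
# assumption of one light on/off cycle per pixel per revolution.

pixel_bandwidth_hz = 100              # max light-change frequency per pixel
rpm_limit = pixel_bandwidth_hz * 60   # cycles per second -> revolutions per minute
print(rpm_limit)                      # -> 6000, matching the reported spin limit
```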
AER Auditory Filtering and CPG for Robot Control
Address-Event-Representation (AER) is a
communication protocol for transferring asynchronous events
between VLSI chips, originally developed for bio-inspired
processing systems (for example, image processing). The event
information in an AER system is transferred using a high-speed
digital parallel bus. This paper presents an experiment
using AER for sensing, processing and finally actuating a
robot. The AER output of a silicon cochlea is processed by an
AER filter implemented on an FPGA to produce rhythmic
walking in a humanoid robot (Redbot). We have implemented
both the AER rhythm detector and the Central Pattern
Generator (CPG) on a Spartan II FPGA which is part of a
USB-AER platform developed by some of the authors.
Commission of the European Communities IST-2001-34124 (CAVIAR); Comisión Interministerial de Ciencia y Tecnología TIC-2003-08164-C03-0
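A Central Pattern Generator of the kind mentioned above can be illustrated in software as two coupled phase oscillators that lock in antiphase, producing an alternating left/right stepping rhythm. This is a generic textbook-style model, not the authors' FPGA design; the frequency and coupling values are arbitrary assumptions.

```python
import math

# Hedged sketch of a Central Pattern Generator: two coupled phase
# oscillators pulled toward antiphase (alternating gait). The
# paper's CPG is an FPGA design; this only illustrates the idea.

def simulate_cpg(steps=2000, dt=0.01, freq=1.0, coupling=2.0):
    """Integrate the two oscillators; return their final phase difference."""
    phase = [0.0, 1.0]                  # arbitrary initial phases (radians)
    for _ in range(steps):
        # each oscillator advances at its base rate and is pulled
        # toward antiphase with its partner
        d0 = 2 * math.pi * freq + coupling * math.sin(phase[1] - phase[0] - math.pi)
        d1 = 2 * math.pi * freq + coupling * math.sin(phase[0] - phase[1] - math.pi)
        phase[0] += d0 * dt
        phase[1] += d1 * dt
    return (phase[1] - phase[0]) % (2 * math.pi)
```

Regardless of the initial mismatch, the phase difference settles near pi (antiphase), which is the alternation a walking gait needs; in the hardware version the same role is played by the spiking CPG circuit.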