Event-based Vision: A Survey
Event cameras are bio-inspired sensors that differ from conventional frame
cameras: Instead of capturing images at a fixed rate, they asynchronously
measure per-pixel brightness changes, and output a stream of events that encode
the time, location and sign of the brightness changes. Event cameras offer
attractive properties compared to traditional cameras: high temporal resolution
(on the order of microseconds), very high dynamic range (140 dB vs. 60 dB), low
power consumption, and high pixel bandwidth (on the order of kHz) resulting in
reduced motion blur. Hence, event cameras have large potential for robotics
and computer vision in scenarios that are challenging for traditional cameras,
such as low-latency, high-speed, and high-dynamic-range settings. However,
novel methods are
required to process the unconventional output of these sensors in order to
unlock their potential. This paper provides a comprehensive overview of the
emerging field of event-based vision, with a focus on the applications and the
algorithms developed to unlock the outstanding properties of event cameras. We
present event cameras from their working principle, the actual sensors that are
available and the tasks that they have been used for, from low-level vision
(feature detection and tracking, optic flow, etc.) to high-level vision
(reconstruction, segmentation, recognition). We also discuss the techniques
developed to process events, including learning-based techniques, as well as
specialized processors for these novel sensors, such as spiking neural
networks. Additionally, we highlight the challenges that remain to be tackled
and the opportunities that lie ahead in the search for a more efficient,
bio-inspired way for machines to perceive and interact with the world.
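The event stream described above can be made concrete with a small sketch.
Assuming a minimal record of (timestamp, x, y, polarity) per event (the field
names here are illustrative, not any particular camera's SDK), summing
polarities over a time window yields the simplest event "image":

```python
from dataclasses import dataclass

@dataclass
class Event:
    t_us: int      # timestamp in microseconds
    x: int         # pixel column
    y: int         # pixel row
    polarity: int  # +1 brightness increase, -1 decrease

def accumulate(events, width, height):
    """Sum event polarities per pixel over a window,
    producing a simple 2D event-count image."""
    frame = [[0] * width for _ in range(height)]
    for e in events:
        frame[e.y][e.x] += e.polarity
    return frame

events = [Event(10, 1, 0, +1), Event(25, 1, 0, +1), Event(40, 2, 1, -1)]
img = accumulate(events, width=3, height=2)
# img == [[0, 2, 0], [0, 0, -1]]
```

Real pipelines use richer representations (time surfaces, voxel grids), but
they all start from this asynchronous per-pixel stream.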
From Vision Sensor to Actuators, Spike Based Robot Control through Address-Event-Representation
One field of neuroscience is neuroinformatics, whose aim is to
develop auto-reconfigurable systems that mimic the human body and brain. In
this paper we present a neuro-inspired, spike-based mobile robot, covering the
full chain from cheap commercial vision sensors whose output is converted into
spike information, through spike filtering for object recognition, to
spike-based motor control models. A two-wheel mobile robot powered by DC
motors can be autonomously controlled to follow a line drawn on the floor.
This spike system has been developed around the well-known
Address-Event-Representation mechanism to communicate between the different
neuro-inspired layers of the system. The RTC lab has developed all the
components presented in this work, from the vision sensor to the robot
platform and the FPGA-based platforms for AER processing.

Ministerio de Ciencia e Innovación TEC2006-11730-C03-02; Junta de Andalucía P06-TIC-0141
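The sensing-to-actuation chain of such a line follower can be caricatured in a
few lines. This is a hypothetical rate-based sketch, not the RTC lab's AER
hardware: spike counts from the left and right halves of the sensor unbalance
the two wheel commands to steer the robot back over the line:

```python
def steer_from_spikes(left_spikes, right_spikes, base_pwm=0.5, k=0.02):
    """Differential-drive sketch: spike counts from the left/right
    halves of the field of view steer the robot by unbalancing
    the two wheel commands (clamped to [0, 1])."""
    error = left_spikes - right_spikes        # line drifted left -> positive
    left_pwm = max(0.0, min(1.0, base_pwm - k * error))
    right_pwm = max(0.0, min(1.0, base_pwm + k * error))
    return left_pwm, right_pwm

assert steer_from_spikes(10, 10) == (0.5, 0.5)   # centred: drive straight
l, r = steer_from_spikes(15, 5)                  # line to the left:
assert l < r                                     # slow left wheel, turn left
```

In the actual system this comparison would happen in the spike domain, with
AER events driving the motor layer directly rather than a computed PWM value.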
Spike-based VITE control with Dynamic Vision Sensor applied to an Arm Robot.
Spike-based motor control is very important in the
field of robotics and also for the neuromorphic engineering
community to bridge the gap between sensing / processing
devices and motor control without losing the spike philosophy
that enhances speed response and reduces power consumption.
This paper presents an accurate neuro-inspired spike-based system
composed of a DVS retina, a visual processing system that detects
and tracks objects, and an SVITE motor controller, where everything
follows the spike-based philosophy. The control system is a spike
version of the neuro-inspired open-loop VITE control algorithm,
implemented on a pair of FPGA boards: the first one runs the
algorithm and the second one drives the motors with spikes. The
robotic platform is a low-cost arm with four degrees of freedom.

Ministerio de Ciencia e Innovación TEC2009-10639-C04-02/01; Ministerio de Economía y Competitividad TEC2012-37868-C04-02/0
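VITE (Vector Integration To Endpoint) computes a difference vector between the
target and the present position and integrates it into an outflow movement
command. A discrete-time, rate-based sketch of these dynamics (not the
spike-based FPGA implementation; the gains `alpha` and `g` are illustrative)
is:

```python
def vite_step(target, position, velocity, alpha=0.5, g=0.1):
    """One Euler step of VITE-style dynamics: the difference vector
    (target - position) drives an integrated velocity command whose
    rectified outflow moves the effector toward the target."""
    diff = target - position
    velocity += alpha * (-velocity + diff)   # difference-vector integration
    position += g * max(velocity, 0.0)       # GO-gated, rectified outflow
    return position, velocity

pos, vel = 0.0, 0.0
for _ in range(500):
    pos, vel = vite_step(target=1.0, position=pos, velocity=vel)
assert abs(pos - 1.0) < 1e-2   # converges to the endpoint
```

Because the command depends only on the remaining difference vector, the same
dynamics produce a bell-shaped velocity profile regardless of starting point,
which is what makes VITE attractive for open-loop reaching.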
The implications of embodiment for behavior and cognition: animal and robotic case studies
In this paper, we will argue that if we want to understand the function of
the brain (or the control in the case of robots), we must understand how the
brain is embedded into the physical system, and how the organism interacts with
the real world. While embodiment has often been used in its trivial meaning,
i.e. 'intelligence requires a body', the concept has deeper and more important
implications, concerned with the relation between physical and information
(neural, control) processes. A number of case studies are presented to
illustrate the concept. These involve animals and robots and are concentrated
around locomotion, grasping, and visual perception. A theoretical scheme that
can be used to embed the diverse case studies will be presented. Finally, we
will establish a link between the low-level sensory-motor processes and
cognition. We will present an embodied view on categorization, and propose the
concepts of 'body schema' and 'forward models' as a natural extension of the
embodied approach toward first representations.Comment: Book chapter in W. Tschacher & C. Bergomi, ed., 'The Implications of
Embodiment: Cognition and Communication', Exeter: Imprint Academic, pp. 31-5
AER Neuro-Inspired interface to Anthropomorphic Robotic Hand
Address-Event-Representation (AER) is a
communication protocol for transferring asynchronous events
between VLSI chips, originally developed for neuro-inspired
processing systems (for example, image processing). Such
systems may consist of a complicated hierarchical structure
with many chips that transmit data among them in real time,
while performing some processing (for example, convolutions).
The information transmitted is a sequence of spikes coded using
high-speed digital buses. These multi-layer and multi-chip AER
systems actually perform not only image processing, but also
audio processing, filtering, learning, locomotion, etc. This paper
presents an AER interface for controlling an anthropomorphic
robotic hand with a neuro-inspired system.

Unión Europea IST-2001-34124 (CAVIAR); Ministerio de Ciencia y Tecnología TIC-2003-08164-C03-02; Ministerio de Ciencia y Tecnología TIC2000-0406-P4- 0
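The core of AER is that a spike's identity travels as an address word on a
shared bus, while its timing is implicit in when the word appears. A purely
illustrative 15-bit word layout (y:7 | x:7 | polarity:1 — not any chip's
actual format) can be packed and unpacked like this:

```python
def aer_encode(x, y, polarity, x_bits=7, y_bits=7):
    """Pack a pixel event into one address word, AER-style: the
    spiking unit's identity is the address; its time is implicit
    in when the word crosses the bus."""
    pol = 1 if polarity > 0 else 0
    return (y << (x_bits + 1)) | (x << 1) | pol

def aer_decode(word, x_bits=7, y_bits=7):
    """Recover (x, y, polarity) from an address word."""
    pol = word & 1
    x = (word >> 1) & ((1 << x_bits) - 1)
    y = (word >> (x_bits + 1)) & ((1 << y_bits) - 1)
    return x, y, (+1 if pol else -1)

word = aer_encode(5, 3, +1)
assert aer_decode(word) == (5, 3, +1)
```

Multi-chip systems extend this with arbitration and mapper stages that route
and remap address words between layers, which is how the hierarchical
processing described above is composed.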
An AER-Based Actuator Interface for Controlling an Anthropomorphic Robotic Hand
Bio-inspired and neuro-inspired systems and circuits are
relatively novel approaches to solving real problems by mimicking biology's
efficient solutions. Robotics also tries to mimic biology, and
more particularly the human body's structure and the efficiency of its
muscles, bones, articulations, etc. Address-Event-Representation (AER) is
a communication protocol for transferring asynchronous events between
VLSI chips, originally developed for neuro-inspired processing systems
(for example, image processing). Such systems may consist of a complicated
hierarchical structure with many chips that transmit data among
them in real time, while performing some processing (for example, convolutions).
The information transmitted is a sequence of spikes coded using
high-speed digital buses. These multi-layer and multi-chip AER systems
actually perform not only image processing, but also audio processing,
filtering, learning, locomotion, etc. This paper presents an AER interface
for controlling an anthropomorphic robotic hand with a neuro-inspired
system.

Unión Europea IST-2001-34124 (CAVIAR); Ministerio de Ciencia y Tecnología TIC-2003-08164-C03-0
Adaptive motor control and learning in a spiking neural network realised on a mixed-signal neuromorphic processor
Neuromorphic computing is a new paradigm for design of both the computing
hardware and algorithms inspired by biological neural networks. The event-based
nature and the inherent parallelism make neuromorphic computing a promising
paradigm for building efficient neural network based architectures for control
of fast and agile robots. In this paper, we present a spiking neural network
architecture that uses sensory feedback to control rotational velocity of a
robotic vehicle. When the velocity reaches the target value, the mapping from
the target velocity of the vehicle to the correct motor command, both
represented in the spiking neural network on the neuromorphic device, is
autonomously stored on the device using on-chip plastic synaptic weights. We
validate the controller using a wheel motor of a miniature mobile vehicle and
an inertial measurement unit as the sensory feedback and demonstrate online
learning of a simple 'inverse model' in a two-layer spiking neural network on
the neuromorphic chip. The prototype neuromorphic device that features 256
spiking neurons allows us to realise a simple proof of concept architecture for
the purely neuromorphic motor control and learning. The architecture can be
easily scaled up if a larger neuromorphic device is available.

Comment: 6+1 pages, 4 figures, will appear in one of the Robotics conference
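The inverse-model idea — learn the mapping from target velocity to motor
command using sensory feedback — can be reduced to a delta rule on a single
gain. This is a rate-based sketch of the concept, not the paper's on-chip
plastic-synapse mechanism; the plant model and learning rate are illustrative:

```python
def learn_inverse_model(plant, targets, lr=0.1, steps=200):
    """Delta-rule sketch of inverse-model learning: adjust the gain w
    mapping target velocity -> motor command until the measured
    velocity (the sensory feedback) matches the target."""
    w = 0.0
    for _ in range(steps):
        for v_target in targets:
            command = w * v_target          # current inverse model
            v_measured = plant(command)     # sensory feedback (e.g. IMU)
            error = v_target - v_measured
            w += lr * error * v_target      # plastic weight update
    return w

# Toy plant: velocity is half the motor command,
# so the correct inverse gain is 2.0.
plant = lambda u: 0.5 * u
w = learn_inverse_model(plant, targets=[0.5, 1.0])
assert abs(w - 2.0) < 1e-3
```

On the neuromorphic chip the velocities and commands are spike rates and the
gain lives in on-chip plastic synaptic weights, but the feedback-driven update
follows the same logic.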