Animals combine various sensory cues with previously
acquired knowledge to safely travel towards a target
destination. In close analogy to biological systems, we propose a
neuromorphic system that decides, based on auditory and visual
input, how to reach a sound source without collisions. The development
of this sensory integration system, which identifies the
shortest possible path, is a key step towards autonomous
robotics. The proposed system comprises two event-based
sensors (the eDVS for vision and the NAS for audition) and
the SpiNNaker processor. Open-loop experiments were performed
to evaluate the system's performance. In the presence of acoustic
stimulation alone, the heading direction points towards the
sound source with a Pearson correlation coefficient of
0.89. When visual input is introduced into the network, the
heading direction always points in the direction of null optical
flow closest to the sound source. Hence, the sensory integration
network is able to find the shortest path to the sound source
while avoiding obstacles. This work shows that a simple,
task-dependent mapping of sensory information can lead to highly
complex and robust decisions.
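
The decision rule summarized above, steering towards the obstacle-free (null optical flow) direction nearest the sound source, can be illustrated with a minimal sketch. The following Python code is illustrative only: the function name choose_heading, the heading grid, the flow_threshold parameter, and all numeric values are assumptions, not the paper's spiking implementation on SpiNNaker.

```python
import numpy as np

def choose_heading(flow_by_heading, sound_azimuth_deg, flow_threshold=0.1):
    """Steer towards the null-optical-flow direction closest to the sound source.

    flow_by_heading maps candidate heading angles (degrees) to mean
    optical-flow magnitude; headings whose flow exceeds flow_threshold
    are treated as blocked by an obstacle.
    """
    headings = np.asarray(sorted(flow_by_heading))
    flow = np.array([flow_by_heading[h] for h in headings])
    free = headings[flow < flow_threshold]  # obstacle-free (null-flow) headings
    if free.size == 0:
        raise ValueError("no obstacle-free heading available")
    # Angular distance to the sound source, wrapped to [-180, 180) degrees
    delta = (free - sound_azimuth_deg + 180.0) % 360.0 - 180.0
    return free[int(np.argmin(np.abs(delta)))]

# Toy scene: obstacle straight ahead (high flow at 0 and 15 deg),
# sound source at +10 deg; the closest free path is +30 deg.
flows = {-30: 0.02, -15: 0.03, 0: 0.90, 15: 0.85, 30: 0.04}
print(choose_heading(flows, sound_azimuth_deg=10))  # -> 30

# Open-loop evaluation sketch with made-up numbers: the reported Pearson
# coefficient compares commanded headings against true source azimuths.
true_az = np.array([-40.0, -20.0, 0.0, 20.0, 40.0])
cmd_az = np.array([-38.0, -22.0, 3.0, 18.0, 43.0])
print(np.corrcoef(true_az, cmd_az)[0, 1])  # Pearson r, close to 1 here
```

The wrap-around arithmetic keeps angular distances well defined across the ±180° boundary; in the actual system this comparison would emerge from the spiking sensory integration network rather than explicit NumPy code.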