5 research outputs found

    Neuromorphic Visual Odometry with Resonator Networks

    Full text link
    Autonomous agents require self-localization to navigate in unknown environments. They can use Visual Odometry (VO) to estimate self-motion and localize themselves using visual sensors. Unlike inertial sensors, this motion-estimation strategy is not compromised by drift, and unlike wheel encoders it does not suffer from slippage. However, VO with conventional cameras is computationally demanding, which limits its application in systems with strict latency, memory, and energy requirements. Event-based cameras and neuromorphic computing hardware offer a promising low-power solution to the VO problem; however, conventional VO algorithms are not readily convertible to neuromorphic hardware. In this work, we present a VO algorithm built entirely of neuronal building blocks suitable for neuromorphic implementation. The building blocks are groups of neurons representing vectors in the computational framework of Vector Symbolic Architecture (VSA), which was proposed as an abstraction layer for programming neuromorphic hardware. The VO network we propose generates and stores a working memory of the presented visual environment, and it updates this working memory while simultaneously estimating the changing location and orientation of the camera. We demonstrate how VSA can be leveraged as a computing paradigm for neuromorphic robotics. Moreover, our results represent an important step towards using neuromorphic computing hardware for fast and power-efficient VO and for the related task of simultaneous localization and mapping (SLAM). We validate this approach experimentally in a simple robotic task and on an event-based dataset, demonstrating state-of-the-art performance in these settings.
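
    The abstract above leans on two VSA primitives, binding and unbinding, plus the resonator network used to factor a bound vector back into its components. The following NumPy sketch is purely illustrative and is not the authors' implementation: it assumes complex phasor hypervectors (FHRR-style), where binding is elementwise multiplication and unbinding multiplies by the complex conjugate; the dimension, codebook sizes, and cleanup rule are all assumptions made for the demo.

        import numpy as np

        rng = np.random.default_rng(0)
        D = 2048                                 # hypervector dimension (illustrative)

        def random_phasor():
            # Random unit-magnitude complex hypervector (FHRR-style).
            return np.exp(1j * rng.uniform(-np.pi, np.pi, D))

        def bind(a, b):
            # Binding: elementwise complex multiplication.
            return a * b

        def unbind(c, a):
            # Unbinding: multiply by the complex conjugate of a factor.
            return c * np.conj(a)

        def sim(a, b):
            # Normalized similarity; ~1 for matching vectors, ~0 for random pairs.
            return np.real(np.vdot(a, b)) / D

        # Two codebooks of candidate factors (e.g. horizontal/vertical shift codes).
        X = np.stack([random_phasor() for _ in range(8)])
        Y = np.stack([random_phasor() for _ in range(8)])
        s = bind(X[3], Y[5])                     # bound vector to be factorized
        print(sim(unbind(s, X[3]), Y[5]))        # -> 1.0, exact unbinding

        def cleanup(v, codebook):
            # Associative cleanup: project onto the codebook and back,
            # then renormalize each component to unit magnitude.
            out = (codebook.conj() @ v / D) @ codebook
            return out / np.abs(out)

        # Resonator-style iteration: alternately re-estimate each factor
        # while holding the current estimate of the other one fixed.
        x_hat = cleanup(X.sum(axis=0), X)
        y_hat = cleanup(Y.sum(axis=0), Y)
        for _ in range(20):
            x_hat = cleanup(unbind(s, y_hat), X)
            y_hat = cleanup(unbind(s, x_hat), Y)

        print(np.argmax(np.abs(X.conj() @ x_hat)))   # -> 3
        print(np.argmax(np.abs(Y.conj() @ y_hat)))   # -> 5

    Factorization of this kind is presumably what allows a bound scene representation to be decomposed into transform components such as camera shift and rotation, but the particular codebooks and cleanup rule here are stand-ins, not the paper's network.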

    Event-Based Attention and Tracking on Neuromorphic Hardware

    Full text link
    We present a fully event-driven vision and processing system for selective attention and tracking, implemented on Intel's neuromorphic research chip Loihi and directly interfaced with an event-based Dynamic Vision Sensor (DAVIS). The attention mechanism is realized as a recurrent spiking neural network (SNN) that forms sustained activation-bump attractors. The network dynamics support object tracking in the presence of distractors and when the object slows down or stops.

    Event-based attention and tracking on neuromorphic hardware

    Full text link
    We present a fully event-driven vision and processing system for selective attention and tracking, realized on the neuromorphic processor Loihi interfaced to an event-based Dynamic Vision Sensor (DAVIS). The attention mechanism is implemented as a recurrent spiking neural network that realizes the attractor dynamics of dynamic neural fields. We demonstrate the capability of the system to create sustained activation that supports object tracking when distractors are present or when the object slows down or stops, reducing the number of generated events.
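
    The attractor dynamics of dynamic neural fields mentioned above can be illustrated with a minimal rate-based sketch of a one-dimensional Amari-style field: local excitation plus global inhibition lets a localized input create an activation bump that then sustains itself once the input is removed. This is plain NumPy with illustrative parameters, not the spiking Loihi network from the paper.

        import numpy as np

        N = 100                                # field positions (illustrative)
        tau, h, dt = 10.0, -2.0, 0.2           # time constant, resting level, step
        x = np.arange(N)

        # Interaction kernel on a ring: local excitation, global inhibition.
        d = np.abs(x[:, None] - x[None, :])
        d = np.minimum(d, N - d)
        w = 2.0 * np.exp(-d**2 / (2 * 5.0**2)) - 0.5

        def f(u):
            # Sigmoid output nonlinearity.
            return 1.0 / (1.0 + np.exp(-4.0 * u))

        u = np.full(N, h)                      # field starts at resting level
        stim = 5.0 * np.exp(-(x - 50.0)**2 / (2 * 3.0**2))
        for t in range(2000):
            inp = stim if t < 1000 else 0.0    # localized input, removed halfway
            u += dt * (-u + h + w @ f(u) + inp) / tau   # Euler integration

        # The activation bump is self-sustained near position 50 without input:
        print(np.argmax(u), u.max() > 0)       # -> 50 True

    With a moving rather than static input, the same lateral-interaction dynamics make the bump follow the stimulus, which is the field-theoretic picture behind the tracking behavior the abstract describes.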