Monocular Parallel Tracking and Mapping with Odometry Fusion for MAV Navigation in Feature-Lacking Environments

©2013 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.

Presented at the IEEE/RSJ International Workshop on Vision-based Closed-Loop Control and Navigation of Micro Helicopters in GPS-denied Environments (IROS 2013), November 7, 2013, Tokyo, Japan.

Abstract

Despite recent progress, autonomous navigation of Micro Aerial Vehicles (MAVs) with a single frontal camera remains a challenging problem, especially in feature-lacking environments. On a mobile robot with a frontal camera, monoSLAM can fail when there are not enough visual features in the scene, or when the robot, whose motion is often dominated by rotation, yaws away from the known map toward unknown regions. To overcome these limitations and increase responsiveness, we present a novel parallel tracking and mapping framework suitable for robot navigation that fuses visual data with odometry measurements in a principled manner. Our framework copes with a lack of visual features in the scene and maintains robustness during pure camera rotations. We demonstrate our results on a dataset captured from the frontal camera of a quadrotor flying in a typical feature-lacking indoor environment.
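The abstract does not detail the fusion mechanism, but the core idea it names, falling back on an odometry motion prior when visual tracking degrades, can be illustrated with a minimal sketch. The Python snippet below is an assumption-laden illustration, not the authors' implementation: the planar odometry model, the quality-weighted blend, and all names (propagate_odometry, fuse_pose, vis_quality) are hypothetical.

```python
# Minimal sketch (hypothetical, not the paper's method): blending an
# odometry prediction with a visual pose estimate, weighted by how well
# the tracker is doing. Pose is planar for simplicity: (x, y, yaw).
import math

def wrap_angle(a):
    """Wrap an angle to (-pi, pi]."""
    return math.atan2(math.sin(a), math.cos(a))

def propagate_odometry(pose, v, w, dt):
    """Integrate a unicycle odometry model over one time step."""
    x, y, yaw = pose
    x += v * math.cos(yaw) * dt
    y += v * math.sin(yaw) * dt
    yaw = wrap_angle(yaw + w * dt)
    return (x, y, yaw)

def fuse_pose(odom_pose, vis_pose, vis_quality):
    """Blend odometry prediction and visual estimate.

    vis_quality in [0, 1], e.g. the fraction of map features the
    tracker matched this frame. With few features (quality -> 0),
    the estimate degrades gracefully to pure odometry, which is the
    failure mode the abstract targets in feature-lacking scenes.
    """
    q = vis_quality
    x = (1 - q) * odom_pose[0] + q * vis_pose[0]
    y = (1 - q) * odom_pose[1] + q * vis_pose[1]
    dyaw = wrap_angle(vis_pose[2] - odom_pose[2])
    yaw = wrap_angle(odom_pose[2] + q * dyaw)
    return (x, y, yaw)

if __name__ == "__main__":
    pose = (0.0, 0.0, 0.0)
    # One tracking step: odometry predicts, vision corrects when reliable.
    pred = propagate_odometry(pose, v=0.5, w=0.2, dt=0.1)
    vis = (0.051, 0.002, 0.021)  # pose reported by the visual tracker
    print(fuse_pose(pred, vis, vis_quality=0.8))
```

A principled fusion, as the abstract claims, would weight the two sources by their covariances (e.g., in an EKF) rather than by a scalar quality score; the scalar blend above only conveys the qualitative behavior.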
