Robust Onboard Visual SLAM for Autonomous MAVs

Abstract

This paper presents a visual simultaneous localization and mapping (SLAM) system consisting of a robust visual odometry and an efficient back-end with loop closure detection and pose-graph optimization. Robustness of the visual odometry is achieved by utilizing dual cameras pointing in different directions, with no overlap in their respective fields of view, mounted on a micro aerial vehicle (MAV). The theory behind this dual-camera visual odometry can be easily extended to applications with multiple cameras. The back-end of the SLAM system maintains a keyframe-based global map, which is used for loop closure detection. An adaptive-window pose-graph optimization method is proposed to refine keyframe poses of the global map and thus correct pose drift that is inherent in the visual odometry. The position of each map point is then refined implicitly due to its relative representation to its source keyframe. We demonstrate the efficiency of the proposed visual SLAM algorithm for applications onboard MAVs in experiments with both autonomous and manual flights. The pose tracking results are compared with the ground truth data provided by an external tracking system.
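To illustrate the idea of map points being refined implicitly through their relative representation, here is a minimal sketch (not from the paper; class and variable names are illustrative) of storing points in their source keyframe's coordinate frame, so that a pose-graph update to the keyframe pose automatically corrects the points' world positions:

```python
import numpy as np

def se3(R, t):
    """Build a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

class Keyframe:
    def __init__(self, T_world_cam):
        self.T_world_cam = T_world_cam  # pose, refined later by pose-graph optimization
        self.points_cam = []            # map points stored in the keyframe (camera) frame

    def add_point(self, p_world):
        # Store the point relative to this keyframe: p_cam = T_cam_world * p_world
        p_h = np.append(p_world, 1.0)
        self.points_cam.append((np.linalg.inv(self.T_world_cam) @ p_h)[:3])

    def points_world(self):
        # When the keyframe pose is updated, world positions follow implicitly,
        # with no per-point optimization required.
        return [(self.T_world_cam @ np.append(p, 1.0))[:3] for p in self.points_cam]

# Usage: a keyframe observes a point, then a loop closure correction shifts its pose.
kf = Keyframe(se3(np.eye(3), np.zeros(3)))
kf.add_point(np.array([1.0, 0.0, 2.0]))
kf.T_world_cam = se3(np.eye(3), np.array([0.5, 0.0, 0.0]))  # corrected pose
print(kf.points_world())  # the point moves with its source keyframe: [1.5, 0.0, 2.0]
```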
