Real-Time Implementation of Vision-Aided Monocular Navigation for Small Fixed-Wing Unmanned Aerial Systems

Abstract

The goal of this project was to develop and implement algorithms to demonstrate real-time positioning of a UAV using a monocular camera combined with previously collected orthorectified imagery. Unlike previous tests, this project did not utilize a full inertial navigation system (INS) for attitude, but instead relied on the attitude obtained from inexpensive commercial off-the-shelf (COTS) autopilots. The system consisted primarily of COTS components and open-source software, and was flown over Camp Atterbury, IN, for a sequence of flight tests in Fall 2015. The system obtained valid solutions over much of the flight path by identifying features in each flight image, matching those features against a database of features, and then solving both a full 6DOF solution and an attitude-aided 3DOF solution. The tests demonstrated that such attitude aiding is beneficial: the horizontal DRMS of the 6DOF solution was 59 m, whereas the 3DOF solution DRMS was 15 m. Post-processing was performed to correct for system errors, improving the 3DOF solution DRMS to 8.22 m. Overall, this project increased our understanding of the capabilities and limitations of real-time vision-aided navigation, and demonstrated that such navigation is possible on a relatively small platform with limited computational power.
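The sketch below is a rough illustration of the processing steps the abstract describes: detecting features in a flight image, matching them against a georeferenced database built from orthorectified imagery, solving a full 6DOF pose, solving an attitude-aided 3DOF (position-only) pose, and scoring horizontal DRMS. It is not the project's flight code; the use of ORB features, PnP+RANSAC, the linear position-only formulation, and all function and variable names (e.g. `match_to_database`, `db_world_pts`) are assumptions made for illustration.

```python
"""Illustrative sketch of a vision-aided positioning pipeline (not the flight code)."""
import cv2
import numpy as np


def match_to_database(flight_img, db_descriptors, db_world_pts):
    """Detect ORB features in the flight image and match them against a
    pre-built database of descriptors tied to world (ground) coordinates."""
    orb = cv2.ORB_create(nfeatures=2000)
    kps, desc = orb.detectAndCompute(flight_img, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(desc, db_descriptors)
    img_pts = np.array([kps[m.queryIdx].pt for m in matches])
    world_pts = np.array([db_world_pts[m.trainIdx] for m in matches])
    return img_pts, world_pts


def solve_6dof(img_pts, world_pts, K, dist=None):
    """Full 6DOF pose (attitude + position) from 2D-3D matches via PnP + RANSAC."""
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        world_pts.astype(np.float64), img_pts.astype(np.float64), K, dist)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)
    cam_pos = -R.T @ tvec                      # camera position in the world frame
    return R, cam_pos.ravel(), inliers


def solve_3dof_attitude_aided(img_pts, world_pts, K, R_cam):
    """Position-only (3DOF) solve when the camera attitude R_cam is supplied
    by the autopilot: each match gives [x]_x (R X + t) = 0, linear in t."""
    A_rows, b_rows = [], []
    K_inv = np.linalg.inv(K)
    for u, X in zip(img_pts, world_pts):
        x = K_inv @ np.array([u[0], u[1], 1.0])   # normalized image ray
        skew = np.array([[0.0, -x[2],  x[1]],
                         [x[2],  0.0, -x[0]],
                         [-x[1], x[0],  0.0]])
        A_rows.append(skew)
        b_rows.append(-skew @ (R_cam @ X))
    A = np.vstack(A_rows)
    b = np.hstack(b_rows)
    t, *_ = np.linalg.lstsq(A, b, rcond=None)     # least-squares translation
    return -R_cam.T @ t                           # camera position in the world frame


def horizontal_drms(est_xy, truth_xy):
    """Horizontal DRMS: root of the mean squared 2D position error."""
    err = np.asarray(est_xy) - np.asarray(truth_xy)
    return float(np.sqrt(np.mean(np.sum(err**2, axis=1))))
```

The attitude-aided solve illustrates why aiding helps: with the rotation fixed by the autopilot, only three unknowns remain and the problem reduces to an overdetermined linear system, which is less sensitive to matching noise than jointly estimating attitude and position.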
