Vision Aided Inertial Navigation System Augmented with a Coded Aperture

Abstract

Navigation through an indoor environment is a formidable challenge for an autonomous micro air vehicle. One solution is a vision aided inertial navigation system that uses depth-from-defocus to determine heading and depth to features in the scene. Depth-from-defocus estimates depth from the focal blur pattern. As depth increases, the observable change in the focal blur is generally reduced; consequently, as the depth of a feature to be measured increases, the measurement performance decreases. The Fresnel zone plate, used as an aperture, introduces multiple focal planes. Interference between these focal planes produces changes in the blur pattern that extend the depth at which changes in the focal blur are observable. This improved depth measurement performance in turn improves the performance of the vision aided navigation system. This research provides an in-depth study of the Fresnel zone plate used as a coded aperture and the performance improvement obtained by augmenting a single camera vision aided inertial navigation system with it.
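The falloff in depth sensitivity described above can be sketched with the standard thin-lens defocus model. This is an illustrative example, not taken from the thesis: all lens parameters (focus distance, focal length, aperture diameter) are assumed values chosen for demonstration.

```python
def blur_diameter(z, s=2.0, f=0.009, aperture=0.004):
    """Blur-circle diameter (m) for a point at depth z (m), with the lens
    focused at distance s (m), focal length f (m), and aperture diameter
    in metres -- thin-lens model, assumed example parameters."""
    return aperture * f * abs(z - s) / (z * (s - f))

# A feature exactly at the focus distance produces no blur.
print(blur_diameter(2.0))  # 0.0

# The same 0.5 m depth change produces a much smaller change in blur
# far from the camera than near it, so depth measurement degrades
# with distance -- the limitation the coded aperture aims to relax.
near_change = blur_diameter(3.0) - blur_diameter(2.5)
far_change = blur_diameter(10.0) - blur_diameter(9.5)
print(near_change > far_change)  # True
```

A Fresnel zone plate used in place of the conventional aperture contributes additional focal planes (at harmonics of its primary focal length), whose interference modulates the blur pattern and keeps it depth-dependent over a longer range.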
