Using Predictive Rendering as a Vision-Aided Technique for Autonomous Aerial Refueling

Abstract

This research effort seeks to characterize a vision-aided approach for an Unmanned Aerial System (UAS) to autonomously determine its relative position to another aircraft in a formation, specifically to address the autonomous aerial refueling problem. A system consisting of a monocular digital camera coupled with inertial sensors onboard the UAS is analyzed for the feasibility of this vision-aided approach. A three-dimensional rendering of the tanker aircraft is used to generate predicted images of the tanker as seen by the receiver aircraft. A rigorous error model is developed to model the relative dynamics between an INS-equipped receiver and the tanker aircraft. A thorough image processing analysis is performed to determine error observability between the predicted and true images using sum-squared difference and gradient techniques. To quantify the errors between the predicted and true images, an image update function is developed using perturbation techniques. Based on this residual measurement and the inertial/dynamics propagation, an Extended Kalman Filter (EKF) is used to estimate the relative position and orientation of the tanker from the receiver aircraft. The EKF is simulated through various formation positions during typical aerial refueling operations. Various grades of inertial sensors are simulated during different trajectories to analyze the system's ability to accurately and robustly track the relative position between the two aircraft.
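The sum-squared difference (SSD) residual mentioned above can be sketched as follows. This is an illustrative example only, not the implementation from this work: images are modeled as 2-D lists of grayscale intensities, and the function name is hypothetical.

```python
# Illustrative sketch (assumption, not the paper's code): SSD residual
# between a predicted rendering of the tanker and the observed camera image.

def ssd_residual(predicted, observed):
    """Sum of squared pixel differences between two equal-size grayscale images."""
    return sum(
        (p - o) ** 2
        for prow, orow in zip(predicted, observed)
        for p, o in zip(prow, orow)
    )

# Toy 2x2 images: a nonzero residual indicates a mismatch between the
# predicted and true views, which the EKF can use as a measurement.
predicted = [[10, 20], [30, 40]]
observed = [[12, 20], [30, 37]]
print(ssd_residual(predicted, observed))  # 4 + 0 + 0 + 9 = 13
```

In practice a residual like this would be minimized over candidate relative poses; here it simply shows why a pose error between the predicted and true images is observable in the pixel data.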
