Visual Terrain Relative Navigation: Pose Estimation, Neural Fields, and Verification

Abstract

Visual Terrain Relative Navigation (TRN) is a method for GPS-denied absolute pose estimation using a prior terrain map and an onboard camera. TRN is commonly desired for applications such as planetary landings, unmanned aerial vehicles (UAVs), and airdrops, where GPS is either unavailable or cannot be relied upon due to the possibility of signal loss or jamming. This thesis presents a threefold contribution to visual TRN. Firstly, because planetary TRN missions involve high altitudes and high speeds, acquiring non-simulation test data often proves difficult, and many datasets used to test TRN systems are therefore collected at lower altitudes and speeds than those at which the system would actually be deployed. We present an experimental analysis of visual TRN on data collected from a World View Enterprises high-altitude balloon over an altitude range of 33 km to 4.5 km, and demonstrate less than 290 meters of average position error over a trajectory of more than 150 kilometers. Additionally, we evaluate performance on data we collected by mounting two cameras inside the capsule of Blue Origin’s New Shepard rocket on payload flight NS-23, traveling at speeds up to 880 km/h, and demonstrate less than 55 meters of average position error. Secondly, as accurate terrain map representation is at the core of TRN performance, we explore whether newly emerging Neural Radiance Fields (NeRFs) can be efficiently leveraged as a map for visual localization. We propose a NeRF-based localization pipeline, coined Loc-NeRF, which uses a particle filter backbone to perform monocular camera pose estimation against a NeRF map. Thirdly, since TRN is often performed in high-risk missions, we explore the problem of monitoring the correctness of a monocular camera pose estimate at runtime. For this, we again leverage the ability of NeRF to render novel viewpoints and propose a technique, coined VERF, that incorporates NeRF into a geometrically constrained method to provide assurance on the correctness of a camera pose estimate.
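To make the Loc-NeRF idea concrete, the sketch below illustrates one way a particle filter measurement update could weight candidate camera poses by comparing the observed image against NeRF renders. This is a minimal illustration, not code from the thesis: the render_fn interface, the Gaussian photometric likelihood, and the 6-vector pose parameterization are all assumptions made for the example.

    import numpy as np

    def update_particle_weights(particles, weights, observed_image, render_fn, sigma=0.1):
        """One measurement update of a NeRF-based particle filter (illustrative sketch).

        particles:      (N, 6) array of candidate camera poses (e.g., translation + rotation vector).
        weights:        (N,) array of current particle weights.
        observed_image: camera image to localize against the NeRF map (H, W, 3) array.
        render_fn:      hypothetical callable that renders the NeRF from a given pose,
                        returning an (H, W, 3) array.
        """
        new_weights = np.empty_like(weights)
        for i, pose in enumerate(particles):
            rendered = render_fn(pose)                       # render the NeRF at this candidate pose
            err = np.mean((rendered - observed_image) ** 2)  # photometric error between render and observation
            new_weights[i] = weights[i] * np.exp(-err / (2 * sigma ** 2))  # Gaussian likelihood on the error
        new_weights /= new_weights.sum()                     # normalize so weights form a distribution
        return new_weights

    def resample(particles, weights, rng=np.random.default_rng()):
        """Resample particles in proportion to their weights, then reset weights to uniform."""
        idx = rng.choice(len(particles), size=len(particles), p=weights)
        return particles[idx], np.full(len(particles), 1.0 / len(particles))

In this kind of scheme, the pose estimate is typically taken as the weighted mean of the particles after several predict-update-resample cycles; the choice of likelihood (here a simple Gaussian on mean squared photometric error) is one of several plausible options.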
