    Recursive estimation of three-dimensional aircraft position using terrain-aided positioning

    As part of aircraft navigation, the three-dimensional position (horizontal position and altitude) must be computed continuously. For accuracy and reliability, several sensors are usually integrated; here we deal with dead reckoning integrated with terrain-aided positioning. Terrain-aided positioning has a severely nonlinear structure, which means we must solve a nonlinear recursive Bayesian estimation problem. This cannot be done exactly, but recursive Monte Carlo methods, also known as particle filters, provide a promising approximate solution. To reduce the computational load of the normally computer-intensive particle filter, we present algorithms that take advantage of the linear substructure. These algorithms are all based on a Rao-Blackwellisation technique, i.e. we marginalise the full conditional posterior density with respect to the linear part, which here is altitude. The algorithms differ in how the linear part is estimated, but the common principle is to use multiple Kalman filters. The particle filter is then used to estimate horizontal position only. Simulations show that the computational load is reduced significantly.
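
    As a hedged illustration of the Rao-Blackwellisation described above (a minimal sketch with a hypothetical terrain function and radar-altimeter clearance measurements, not the paper's implementation), horizontal position is carried by particles while each particle runs a scalar Kalman filter for the conditionally linear altitude state:

        import numpy as np

        rng = np.random.default_rng(0)

        def terrain_height(x, y):
            # Hypothetical smooth terrain stands in for a real elevation database.
            return 50.0 * np.sin(0.01 * x) * np.cos(0.01 * y)

        N = 1000                                     # number of particles
        pos = rng.normal(0.0, 100.0, size=(N, 2))    # horizontal-position particles
        w = np.full(N, 1.0 / N)                      # particle weights
        alt_mean = np.full(N, 1000.0)                # per-particle altitude estimate
        alt_var = np.full(N, 25.0)                   # per-particle altitude variance
        Q_alt, R = 0.5, 4.0                          # altitude process / altimeter noise

        def step(pos, w, alt_mean, alt_var, velocity, dt, clearance_meas):
            # 1. Dead-reckon the horizontal particles with added process noise.
            pos = pos + velocity * dt + rng.normal(0.0, 2.0, size=pos.shape)
            # 2. Kalman predict for altitude (random-walk model).
            alt_var = alt_var + Q_alt
            # 3. Measurement model: ground clearance = altitude - terrain(x, y).
            innov = clearance_meas - (alt_mean - terrain_height(pos[:, 0], pos[:, 1]))
            S = alt_var + R                          # innovation variance per particle
            # 4. Weight particles by the marginal measurement likelihood.
            w = w * np.exp(-0.5 * innov**2 / S) / np.sqrt(2.0 * np.pi * S)
            w = w / w.sum()
            # 5. Kalman update of each particle's altitude.
            K = alt_var / S
            alt_mean = alt_mean + K * innov
            alt_var = (1.0 - K) * alt_var
            # 6. Resample when the effective sample size collapses.
            if 1.0 / np.sum(w**2) < N / 2:
                idx = rng.choice(N, size=N, p=w)
                pos, alt_mean, alt_var = pos[idx], alt_mean[idx], alt_var[idx]
                w = np.full(N, 1.0 / N)
            return pos, w, alt_mean, alt_var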

    Improving Real-World Performance of Vision Aided Navigation in a Flight Environment

    The motivation of this research is to fuse information from an airborne imaging sensor with information extracted from satellite imagery in order to provide accurate position estimates when GPS is unavailable for an extended duration. A corpus of existing geo-referenced satellite imagery is used to create a key point database. A novel algorithm is developed for recovering coarse pose by comparing key points extracted from the airborne imagery against the reference database. This coarse position is used to bootstrap a local-area geo-registration algorithm, which provides GPS-level position estimates. This research derives optimizations of existing local-area methods for operation in flight environments.
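
    As a sketch of the coarse-pose recovery step (the function, threshold, and data layout are assumptions for illustration, not the paper's implementation), airborne-image descriptors can be matched to a geo-referenced database by nearest neighbour, with a robust median over the matched keypoints' coordinates:

        import numpy as np

        def coarse_position(query_desc, db_desc, db_geo, ratio=0.8):
            """Coarse horizontal position from descriptor matching.

            query_desc : (M, D) descriptors from the airborne image
            db_desc    : (K, D) descriptors from the satellite-imagery database
            db_geo     : (K, 2) geographic coordinates of the database keypoints
            """
            votes = []
            for d in query_desc:
                dist = np.linalg.norm(db_desc - d, axis=1)
                i, j = np.argsort(dist)[:2]       # two nearest neighbours
                if dist[i] < ratio * dist[j]:     # Lowe-style ratio test
                    votes.append(db_geo[i])
            if not votes:
                return None                       # no confident matches
            # Median of matched keypoint locations is robust to outlier matches.
            return np.median(np.array(votes), axis=0)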

    Development and Flight of a Robust Optical-Inertial Navigation System Using Low-Cost Sensors

    This research develops and tests a precision navigation algorithm fusing optical and inertial measurements of unknown objects at unknown locations. It provides an alternative to the Global Positioning System (GPS) as a precision navigation source, enabling passive and low-cost navigation in situations where GPS is denied or unavailable. This paper describes two new contributions. First, a rigorous study of the fundamental nature of optical/inertial navigation is accomplished by examining the observability Gramian of the underlying measurement equations. This analysis yields a set of design principles guiding the development of optical/inertial navigation algorithms. The second contribution is the development and flight test of an optical-inertial navigation system using low-cost and passive sensors (including an inexpensive commercial-grade inertial sensor, which is unsuitable for navigation by itself). This prototype system was built and flight tested at the U.S. Air Force Test Pilot School. The implemented algorithm leveraged the design principles described above and used images from a single camera. It was shown (and explained by the observability analysis) that the system gained significant performance by aiding it with a barometric altimeter and magnetic compass, and by using a digital terrain elevation database (DTED). The still low-cost and passive system demonstrated performance comparable to high-quality navigation-grade inertial navigation systems, which cost an order of magnitude more than this optical-inertial prototype. The resultant performance of the system tested provides a robust and practical navigation solution for Air Force aircraft.
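
    A small sketch of the kind of observability analysis described (the state layout and matrices below are illustrative placeholders, not the thesis model): build the discrete-time observability Gramian of a linearised system and check its rank. With a camera-like measurement of position only, a decoupled altitude state stays unobservable, consistent with the reported benefit of barometric aiding:

        import numpy as np

        def observability_gramian(F, H, steps):
            # Gramian sum_k (F^k)^T H^T H F^k for x_{k+1} = F x_k, z_k = H x_k.
            n = F.shape[0]
            W = np.zeros((n, n))
            Fk = np.eye(n)
            for _ in range(steps):
                W += Fk.T @ H.T @ H @ Fk
                Fk = F @ Fk
            return W

        # Hypothetical 4-state example: [position, velocity, accel bias, altitude].
        F = np.array([[1.0, 1.0, 0.5, 0.0],
                      [0.0, 1.0, 1.0, 0.0],
                      [0.0, 0.0, 1.0, 0.0],
                      [0.0, 0.0, 0.0, 1.0]])
        H = np.array([[1.0, 0.0, 0.0, 0.0]])      # camera observes position only

        W = observability_gramian(F, H, steps=10)
        print(np.linalg.matrix_rank(W))           # 3 of 4: altitude is unobservable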

    Automated Driftmeter Fused with Inertial Navigation

    The motivation of this research is to address the use of bearing-only measurements taken by an optical sensor to aid an Inertial Navigation System (INS) whose accelerometers and gyroscopes are subject to drift and bias errors. The concept of Simultaneous Localization And Mapping (SLAM) is employed in a bootstrapping manner: the bearing measurements are used to geolocate ground features, after which the bearings taken over time of those ground features are used to improve the navigation state provided by the INS. In this research, the INS aiding action of tracking stationary but unknown ground features over time is evaluated. It does not, however, address the critical image registration issue associated with image processing; it is assumed that stationary ground features can be detected and tracked as pixel representations by a real-time image processing algorithm. Simulations indicate the potential of this approach. It is shown that during wings-level flight at constant speed and fixed altitude, an aircraft that geolocates and tracks ground objects can significantly reduce the error in two of its three dimensions of flight, relative to an Earth-fixed navigation frame. The aiding action of geolocating and tracking ground features in line with the direction of flight, using a downward-facing camera, did not improve the aircraft's x-position estimate. However, the aircraft's y-position estimate, as well as the altitude estimate, were significantly improved.
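
    A minimal sketch of the geolocation half of the bootstrap, assuming known camera positions and unit line-of-sight bearings (the thesis' filter formulation is not reproduced here): the feature position minimising the summed squared distance to all bearing rays has a closed-form least-squares solution:

        import numpy as np

        def geolocate(positions, bearings):
            """positions : (N, 3) camera positions in a local navigation frame
            bearings  : (N, 3) unit line-of-sight vectors toward the feature"""
            A = np.zeros((3, 3))
            b = np.zeros(3)
            for p, u in zip(positions, bearings):
                # Projector onto the plane orthogonal to the ray direction.
                P = np.eye(3) - np.outer(u, u)
                A += P
                b += P @ p
            return np.linalg.solve(A, b)

        # Two sightings of a feature at the origin from a level pass (hypothetical).
        pos = np.array([[-100.0, 0.0, 300.0], [100.0, 0.0, 300.0]])
        brg = np.array([-p / np.linalg.norm(p) for p in pos])
        print(geolocate(pos, brg))                # ~[0, 0, 0]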

    Fusion of Imaging and Inertial Sensors for Navigation

    The motivation of this research is to address the limitations of satellite-based navigation by fusing imaging and inertial systems. The research begins by rigorously describing the imaging and navigation problem and developing practical models of the sensors, then presents a transformation technique to detect features within an image. Given a set of features, a statistical feature projection technique is developed which utilizes inertial measurements to predict vectors in the feature space between images. This deep coupling of the imaging and inertial sensors is then used to aid the statistical feature matching function. The feature matches and inertial measurements are then used to estimate the navigation trajectory using an extended Kalman filter. After a proper calibration is accomplished, the image-aided inertial navigation algorithm is tested using a combination of simulation and ground tests with both tactical- and consumer-grade inertial sensors. While limitations of the Kalman filter are identified, the experimental results demonstrate a navigation performance improvement of at least two orders of magnitude over the respective inertial-only solutions.
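
    The sketch below is a generic extended Kalman filter skeleton of the fusion cycle described (inertial measurements drive the prediction, matched-feature projections drive the update); the class name and models are illustrative assumptions, not the paper's formulation:

        import numpy as np

        class ImageAidedEKF:
            def __init__(self, x0, P0, Q, R):
                self.x, self.P, self.Q, self.R = x0, P0, Q, R

            def predict(self, F, G, u):
                # Inertial mechanisation: propagate the state with IMU input u.
                self.x = F @ self.x + G @ u
                self.P = F @ self.P @ F.T + self.Q

            def update(self, z, h, H):
                # z: measured feature coordinates; h(x): predicted projection;
                # H: Jacobian of the projection at the current estimate.
                y = z - h(self.x)                          # innovation
                S = H @ self.P @ H.T + self.R
                K = self.P @ H.T @ np.linalg.inv(S)        # Kalman gain
                self.x = self.x + K @ y
                self.P = (np.eye(len(self.x)) - K @ H) @ self.P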

    Inertial Navigation System Aiding Using Vision

    The aiding of an INS using measurements over time of the line of sight to ground features, as they come into view of an onboard camera, is investigated. The objective is to quantify the reduction in the navigation states' errors achieved by using bearings-only measurements over time of terrain features in the aircraft's field of view. INS aiding is achieved through the use of a Kalman filter. The design of the Kalman filter is presented, and it is shown that during long-range, wings-level cruising flight at constant velocity and altitude, a 90% reduction in the aided INS-calculated navigation state errors, compared to a free INS, is possible.
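
    For illustration only (an assumed formulation, not taken from the thesis), the bearings-only measurement such a Kalman filter linearises can be written as the unit line of sight to a tracked feature together with its Jacobian with respect to aircraft position:

        import numpy as np

        def los_measurement(aircraft_pos, feature_pos):
            """Unit line-of-sight vector to a ground feature and its Jacobian
            with respect to aircraft position (used to form the filter's H)."""
            d = feature_pos - aircraft_pos
            r = np.linalg.norm(d)
            u = d / r
            # d(u)/d(aircraft_pos) = -(I - u u^T) / r
            J = -(np.eye(3) - np.outer(u, u)) / r
            return u, J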

    Real-time Aerial Magnetic and Vision-aided Navigation

    Aerial magnetic navigation has been shown to be a viable alternative navigation method with the potential for worldwide availability, including over oceans. Obtaining GPS-level accuracy using magnetic navigation alone is challenging, but magnetic navigation can be combined with other alternative navigation methods that are, in their current state, better positioned to achieve GPS-level accuracy. This research presents an aerial navigation solution combining magnetic navigation and vision-aided navigation to aid an inertial navigation system (INS). The navigation solution was demonstrated in real-time playback using simulated magnetic field measurements and flight-test-captured visual imagery. Additionally, the navigation solution was flight-tested on a USAF F-16 to demonstrate magnetic navigation in the challenging magnetic environment seen on operationally representative dynamic platforms.
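
    One common way to pose the magnetic measurement, sketched here under assumptions (a scalar magnetometer and a hypothetical analytic anomaly map in place of a surveyed grid), is to re-weight horizontal-position hypotheses by how well the measured anomaly matches the map:

        import numpy as np

        def anomaly_map(x, y):
            # Hypothetical smooth magnetic anomaly map (nT); a real system
            # would interpolate a surveyed crustal-anomaly grid.
            return 120.0 * np.sin(0.002 * x) + 80.0 * np.cos(0.003 * y)

        def weight_update(particles, weights, mag_meas, sigma=15.0):
            """Re-weight (N, 2) position hypotheses against a scalar
            magnetometer reading mag_meas."""
            pred = anomaly_map(particles[:, 0], particles[:, 1])
            lik = np.exp(-0.5 * ((mag_meas - pred) / sigma) ** 2)
            w = weights * lik
            return w / w.sum()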

    On the Integration of Medium Wave Infrared Cameras for Vision-Based Navigation

    The ubiquitous nature of GPS has fostered the widespread integration of navigation into a variety of applications, both civilian and military. One alternative for ensuring continued flight operations in GPS-denied environments is vision-aided navigation, an approach that combines visual cues from a camera with an inertial measurement unit (IMU) to estimate the navigation states of a moving body. The majority of vision-based navigation research has been conducted in the electro-optical (EO) spectrum, which is limited in certain operating environments. The aim of this work is to explore how such approaches extend to infrared imaging sensors. In particular, it examines the ability of medium-wave infrared (MWIR) imagery, which is capable of operating at night and with increased vision through smoke, to expand the breadth of operations that can be supported by vision-aided navigation. The experiments presented here are based on the Minor Area Motion Imagery (MAMI) dataset, which recorded GPS data, inertial measurements, EO imagery, and MWIR imagery captured during flights over Wright-Patterson Air Force Base. The approach applied here combines inertial measurements with EO position estimates from the structure from motion (SfM) algorithm. Although precision timing was not available for the MWIR imagery, the EO-based results demonstrate that trajectory estimates from SfM offer a significant increase in navigation accuracy when combined with inertial data, compared to using an IMU alone. Results also demonstrate that MWIR-based position solutions provide a trajectory reconstruction similar to EO-based solutions for the same scenes. While the MWIR imagery and the IMU could not be combined directly, comparison against the combined solution using EO data supports the conclusion that MWIR imagery (with its unique phenomenologies) is capable of expanding the operating envelope of vision-aided navigation.
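
    Because SfM recovers a trajectory only up to a similarity transform, fusing it with inertial data requires an alignment step. The sketch below is an assumed Umeyama-style closed form (not described in the paper) for the scale, rotation, and translation mapping SfM positions onto navigation-frame positions:

        import numpy as np

        def align_similarity(sfm_pts, nav_pts):
            """Find scale, R, t such that nav ~= scale * R @ sfm + t
            for corresponding (N, 3) point sets."""
            mu_s, mu_n = sfm_pts.mean(axis=0), nav_pts.mean(axis=0)
            Xs, Xn = sfm_pts - mu_s, nav_pts - mu_n
            Sigma = Xn.T @ Xs / len(sfm_pts)       # cross-covariance
            U, D, Vt = np.linalg.svd(Sigma)
            E = np.eye(3)
            if np.linalg.det(U) * np.linalg.det(Vt) < 0:
                E[2, 2] = -1.0                     # keep R a proper rotation
            R = U @ E @ Vt
            scale = np.trace(np.diag(D) @ E) / Xs.var(axis=0).sum()
            t = mu_n - scale * R @ mu_s
            return scale, R, t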

    Particle filter theory and practice with positioning applications
