
    Optimal Image-Aided Inertial Navigation

    The utilization of cameras in integrated navigation systems is among the most recent areas of scientific research and high-tech industry development. The research is motivated by the need to calibrate off-the-shelf cameras and to fuse imaging and inertial sensors in poor GNSS environments. This dissertation makes three major contributions. First, a structureless camera auto-calibration and system calibration algorithm is developed for a GNSS, IMU and stereo camera system. The auto-calibration bundle adjustment utilizes the scale restraint equation, which is free of object coordinates; the number of parameters to be estimated is significantly reduced compared with a self-calibrating bundle adjustment based on the collinearity equations, so the proposed method is computationally more efficient. Second, a loosely-coupled visual odometry aided inertial navigation algorithm is developed. The fusion of the two sensors is usually performed using a Kalman filter. The pose changes are pairwise time-correlated, i.e. the measurement noise vector at the current epoch is correlated only with the one from the previous epoch. Time-correlated errors are usually modelled by a shaping filter; the shaping filter developed in this dissertation uses Cholesky factors, derived from the variance and covariance matrices of the measurement noise vectors, as its coefficients. Test results showed that the proposed algorithm performs better than existing ones and provides more realistic covariance estimates. Third, a tightly-coupled stereo multi-frame aided inertial navigation algorithm is developed for reducing position and orientation drifts. Image aiding based on visual odometry usually uses features tracked only between a pair of consecutive image frames; the proposed method integrates features tracked across multiple overlapping image frames to reduce those drifts. The measurement equation is derived from the SLAM measurement equation system, in which the landmark positions are eliminated algebraically by time-differencing. The derived measurements are, however, time-correlated; through a sequential de-correlation, the Kalman filter measurement update can be performed sequentially and optimally. The main advantages of the proposed algorithm are reduced computational requirements compared to SLAM and seamless integration into an existing GNSS aided-IMU system.
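    The sequential de-correlation described above can be illustrated with a standard Cholesky-whitening step. The sketch below is ours, not the dissertation's code: it assumes the correlated measurement-noise covariance R is available at each epoch, and shows how whitening lets the Kalman measurement update run one scalar measurement at a time.

```python
import numpy as np

def sequential_update(x, P, z, H, R):
    """Kalman measurement update with correlated noise R = L L^T.
    Pre-multiplying by L^{-1} whitens the noise to identity, after
    which each scalar row can be processed independently."""
    L = np.linalg.cholesky(R)
    z_w = np.linalg.solve(L, z)   # whitened measurements, unit-variance noise
    H_w = np.linalg.solve(L, H)   # whitened measurement matrix
    for h, zi in zip(H_w, z_w):
        s = float(h @ P @ h) + 1.0        # scalar innovation variance
        k = (P @ h) / s                   # Kalman gain (n-vector)
        x = x + k * (zi - h @ x)
        P = P - np.outer(k, h) @ P
    return x, P
```

    Processing the whitened rows sequentially is algebraically equivalent to the batch update, but avoids inverting the full innovation covariance, which is the computational advantage alluded to in the abstract.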

    Development and Flight of a Robust Optical-Inertial Navigation System Using Low-Cost Sensors

    This research develops and tests a precision navigation algorithm fusing optical and inertial measurements of unknown objects at unknown locations. It provides an alternative to the Global Positioning System (GPS) as a precision navigation source, enabling passive and low-cost navigation in situations where GPS is denied or unavailable. This paper describes two new contributions. First, a rigorous study of the fundamental nature of optical/inertial navigation is accomplished by examining the observability Gramian of the underlying measurement equations. This analysis yields a set of design principles guiding the development of optical/inertial navigation algorithms. The second contribution is the development and flight test of an optical-inertial navigation system using low-cost and passive sensors (including an inexpensive commercial-grade inertial sensor, which is unsuitable for navigation by itself). This prototype system was built and flight tested at the U.S. Air Force Test Pilot School. The implemented algorithm leveraged the design principles described above and used images from a single camera. It was shown (and explained by the observability analysis) that the system gained significant performance by aiding it with a barometric altimeter and magnetic compass, and by using a digital terrain elevation database (DTED). The still low-cost and passive system demonstrated performance comparable to high-quality navigation-grade inertial navigation systems, which cost an order of magnitude more than this optical-inertial prototype. The resultant performance of the system tested provides a robust and practical navigation solution for Air Force aircraft.
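    A minimal sketch of the kind of observability analysis mentioned above (our construction, not the paper's): accumulate a local observability Gramian for a linearized system and inspect its spectrum, where small singular values flag weakly observable state directions.

```python
import numpy as np

def observability_gramian(F, H, steps):
    """Sum of Phi_k^T H^T H Phi_k over a window for x_{k+1} = F x_k,
    z_k = H x_k. Near-zero singular values indicate directions the
    measurements cannot constrain."""
    n = F.shape[0]
    G = np.zeros((n, n))
    Phi = np.eye(n)
    for _ in range(steps):
        G += Phi.T @ H.T @ H @ Phi
        Phi = F @ Phi
    return G

# Toy example: position/velocity states with position-only measurements.
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
G = observability_gramian(F, H, steps=50)
print(np.linalg.svd(G, compute_uv=False))  # both values > 0: observable
```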

    Impacts of Distributions and Trajectories on Navigation Uncertainty Using Line-of-Sight Measurements to Known Landmarks in GPS-Denied Environments

    Unmanned vehicles are increasingly common in our world today. Self-driving ground vehicles and unmanned aerial vehicles (UAVs) such as quadcopters have become the fastest-growing area of automated vehicles research. These systems use three main processes to autonomously travel from one location to another: guidance, navigation, and control (GNC). Guidance refers to the process of determining a desired path of travel or trajectory, affecting velocities and orientations. Examples of guidance activities include path planning and obstacle avoidance. Effective guidance decisions require knowledge of one’s current location. Navigation systems typically answer questions such as: “Where am I? What is my orientation? How fast am I going?” Finally, the process is tied together when controls are implemented. Controls use navigation estimates (e.g., “Where am I now?”) and the desired trajectory from guidance processes (e.g., “Where do I want to be?”) to drive the moving parts of the system toward relevant goals. Navigation in autonomous vehicles involves intelligently combining information from several sensors to produce accurate state estimates. To date, the Global Positioning System (GPS) occupies a crucial place in most navigation systems. However, GPS is not universally reliable. Even when available, GPS can be easily spoofed or jammed, rendering it useless. Thus, navigation within GPS-denied environments is an area of deep interest in both military and civilian applications. Image-aided inertial navigation is an alternative navigational solution in GPS-denied environments. One form of image-aided navigation measures the bearing from the vehicle to a feature or landmark of known location using a single-lens imager, such as a camera, to deduce information about the vehicle’s position and attitude. This work uncovers and explores several of the impacts of trajectories and landmark distributions on the navigation information gained from this type of aiding measurement. To do so, a modular system model and extended Kalman filter (EKF) are described and implemented. A quadrotor system model is first presented. This model is implemented and then used to produce sensor data for several trajectories of varying shape, altitude, and landmark density. Next, navigation data is produced by running the sensor data through an EKF. The data is plotted and examined to determine the effects of each variable, and these effects are then explained. Finally, an equation describing the quantity of information in each measurement is derived and related to the patterns seen in the data. The resulting equation is then used to explain selected patterns in the data; other uses of this equation are presented, including applications to path planning and landmark placement.
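    The thesis's information equation is not reproduced in the abstract, but the standard Fisher-information form for a single line-of-sight measurement to a known landmark illustrates the geometric effects at play: a bearing constrains position only transverse to the ray, and more weakly at long range. This is a hedged sketch under an assumed isotropic angular-noise model, not the thesis's exact derivation.

```python
import numpy as np

def los_information(p_vehicle, p_landmark, sigma):
    """Fisher information about vehicle position from one unit line-of-sight
    measurement with angular noise sigma (rad): J = P_perp / (sigma^2 r^2),
    where P_perp projects onto the plane perpendicular to the ray."""
    d = p_landmark - p_vehicle
    r = np.linalg.norm(d)
    u = d / r
    P_perp = np.eye(3) - np.outer(u, u)
    return P_perp / (sigma**2 * r**2)

J = los_information(np.zeros(3), np.array([100.0, 50.0, -20.0]), sigma=1e-3)
print(np.linalg.eigvalsh(J))  # one zero eigenvalue: no information along the ray
```

    The 1/r^2 factor and the rank-deficient projector reproduce the qualitative patterns the abstract describes: landmark distribution and trajectory shape matter because they control how these per-measurement information ellipsoids stack up.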

    Deeply-Integrated Feature Tracking for Embedded Navigation

    The Air Force Institute of Technology (AFIT) is investigating techniques to improve aircraft navigation using low-cost imaging and inertial sensors. Stationary features tracked within the image are used to improve the inertial navigation estimate. These features are tracked using a correspondence search between frames. Previous research investigated aiding these correspondence searches using inertial measurements (i.e., stochastic projection). While this research demonstrated the benefits of further sensor integration, it still relied on robust feature descriptors (e.g., SIFT or SURF) to obtain a reliable correspondence match in the presence of rotation and scale changes. Unfortunately, these robust feature extraction algorithms are computationally intensive and require significant resources for real-time operation. Simpler feature extraction algorithms are much more efficient, but their feature descriptors are not invariant to scale, rotation, or affine warping, which limits matching performance during arbitrary motion. This research uses inertial measurements to predict not only the location of the feature in the next image but also the feature descriptor, resulting in robust correspondence matching with low computational overhead. This novel technique, called deeply-integrated feature tracking, is exercised using real imagery. The term deep integration derives from the fact that inertial information is used to aid the image processing. The navigation experiments presented demonstrate the performance of the new algorithm in relation to the previous work. Further experiments also investigate a monocular camera setup necessary for actual flight testing. Results show that the new algorithm is 12 times faster than its predecessor while still producing an accurate trajectory. Thirty percent more features were initialized using the new tracker than with the previous algorithm; however, low-level aiding techniques successfully reduced the number of features initialized, indicating a more robust tracking solution through deep integration.
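    The idea of predicting the descriptor, not just the feature location, can be sketched as follows: warp the stored template patch by the inertially predicted in-plane rotation and scale before a plain sum-of-squared-differences comparison. This is an illustrative reconstruction of the concept, not AFIT's implementation.

```python
import numpy as np

def warp_patch(patch, angle, scale):
    """Warp a square template by a predicted rotation (rad) and scale,
    using inverse mapping with nearest-neighbour sampling so the sketch
    stays dependency-free."""
    h, w = patch.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    c, s = np.cos(angle), np.sin(angle)
    out = np.zeros_like(patch)
    for y in range(h):
        for x in range(w):
            # output pixel -> source pixel under the inverse transform
            u = ((x - cx) * c + (y - cy) * s) / scale + cx
            v = (-(x - cx) * s + (y - cy) * c) / scale + cy
            ui, vi = int(round(u)), int(round(v))
            if 0 <= ui < w and 0 <= vi < h:
                out[y, x] = patch[vi, ui]
    return out

def ssd(a, b):
    """Simple (non-invariant) descriptor distance, usable once the template
    has been pre-warped by the IMU-predicted motion."""
    return float(np.sum((a.astype(float) - b.astype(float)) ** 2))
```

    The matcher would then compare the warped template against candidate patches near the inertially predicted image location, avoiding the cost of rotation- and scale-invariant descriptors.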

    Tightly Integrating Optical and Inertial Sensors for Navigation Using the UKF

    The motivation of this research is to demonstrate the benefits of tightly integrating optical and inertial sensors where GNSS signals are not available. The research begins by describing the navigation problem, after which error and measurement models are presented. Given a set of features, a feature detection and projection algorithm is developed that utilizes inertial measurements to predict vectors in the feature space between images. The unscented Kalman filter is applied to the navigation system, using the inertial measurements and feature matches to estimate the navigation trajectory. Finally, the image-aided navigation algorithm is tested using a simulation and an experiment. The results show that the optical measurements, combined with the inertial sensors, yield improved performance for non-GNSS-based navigation.
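    For reference, a minimal implementation of the unscented transform at the heart of the UKF (standard scaled sigma-point formulation; the alpha/beta/kappa parameter names are the conventional ones, not taken from this paper):

```python
import numpy as np

def unscented_transform(mean, cov, f, alpha=1e-3, beta=2.0, kappa=0.0):
    """Propagate a Gaussian (mean, cov) through a nonlinear function f
    using 2n+1 sigma points; returns the transformed mean and covariance."""
    n = len(mean)
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * cov)
    sigma = np.vstack([mean, mean + S.T, mean - S.T])  # 2n+1 points
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = wm[0] + (1 - alpha**2 + beta)
    Y = np.array([f(s) for s in sigma])                # nonlinear propagation
    y_mean = wm @ Y
    dY = Y - y_mean
    y_cov = (wc[:, None] * dY).T @ dY
    return y_mean, y_cov

# E.g. push a Gaussian through a camera-like perspective division:
m, C = np.array([0.0, 10.0]), np.diag([0.5, 0.2])
print(unscented_transform(m, C, lambda x: np.array([x[0] / x[1]])))
```

    Unlike the extended Kalman filter's first-order linearization, the sigma points capture second-order effects of nonlinear measurement models such as camera projection, which is the usual rationale for choosing the UKF in this setting.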

    Two Dimensional Positioning and Heading Solution for Flying Vehicles Using a Line-Scanning Laser Radar (LADAR)

    Emerging technology in small autonomous flying vehicles requires the systems to have a precise navigation solution in order to perform tasks. In many critical environments, such as indoors, GPS is unavailable, necessitating the development of supplemental aiding sensors to determine precise position. This research investigates the use of a line-scanning laser radar (LADAR) as a standalone two-dimensional position and heading navigation solution and sets up the device for augmentation into existing navigation systems. A fast histogram correlation method is developed to operate in real time on board the vehicle, providing position and heading updates at a rate of 10 Hz. LADAR navigation methods are adapted to three dimensions, with a simulation built to analyze the performance loss due to attitude changes during flight. These simulations are then compared to experimental results collected using a SICK LD-OEM 1000 mounted on a moving cart. The histogram correlation algorithm applied in this work was shown to successfully navigate a realistic environment for a quadrotor in short flights of less than 5 min in larger rooms. Application in hallways shows great promise, providing a stable heading along with tracking of movement perpendicular to the hallway.
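    Histogram correlation for scan matching can be sketched in the spirit of classic angle-histogram methods. The variant below is our assumption, not necessarily the thesis's: it recovers only the heading change between two 2-D scans by circularly cross-correlating bearing histograms (the full method would recover translation similarly from x/y histograms after de-rotation).

```python
import numpy as np

def heading_change(scan_a, scan_b, bins=360):
    """scan_a, scan_b: (N, 2) arrays of 2-D LADAR points. Histogram the
    direction of segments between consecutive points, then find the
    circular shift that best aligns the two histograms; that shift is the
    rotation estimate (sign depends on the chosen convention)."""
    def angle_hist(scan):
        seg = np.diff(scan, axis=0)
        ang = np.arctan2(seg[:, 1], seg[:, 0]) % (2 * np.pi)
        hist, _ = np.histogram(ang, bins=bins, range=(0, 2 * np.pi))
        return hist.astype(float)
    ha, hb = angle_hist(scan_a), angle_hist(scan_b)
    # circular cross-correlation via FFT: O(bins log bins), real-time friendly
    corr = np.fft.ifft(np.fft.fft(ha) * np.conj(np.fft.fft(hb))).real
    return 2 * np.pi * np.argmax(corr) / bins    # peak lag -> rotation
```

    The FFT-based correlation is what makes a 10 Hz onboard update rate plausible: the cost depends on the histogram resolution, not on the number of scan points.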

    Airborne vision-based attitude estimation and localisation

    Vision plays an integral part in a pilot's ability to navigate and control an aircraft. Visual Flight Rules have therefore been developed around the pilot's ability to see the environment outside of the cockpit in order to control the attitude of the aircraft, to navigate and to avoid obstacles. The automation of these processes using a vision system could greatly increase the reliability and autonomy of unmanned aircraft and flight automation systems. This thesis investigates the development and implementation of a robust vision system which fuses inertial information with visual information in a probabilistic framework, with the aim of aircraft navigation. The appearance of the horizon is a strong visual indicator of the attitude of the aircraft. This leads to the first research area of this thesis, visual horizon attitude determination. An image processing method was developed to provide high-performance horizon detection and extraction from camera imagery, and a number of horizon models were developed to link the detected horizon to the attitude of the aircraft with varying degrees of accuracy. The second area investigated in this thesis was visual localisation of the aircraft. A terrain-aided horizon model was developed to estimate the position and altitude as well as the attitude of the aircraft, giving rough position estimates with highly accurate attitude information. The visual localisation accuracy was improved by incorporating ground-feature-based map-aided navigation: road intersections were detected using an image processing algorithm developed for this purpose and matched to a database to provide positional information. The developed vision system shows performance comparable to other non-vision-based systems while removing the dependence on external systems for navigation. The vision system and techniques developed in this thesis help to increase the autonomy of unmanned aircraft and flight automation systems for manned flight.
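    One of the simpler horizon-attitude models can be sketched as follows, assuming a distant horizon, a flat-Earth approximation with zero horizon dip, and a calibrated pinhole camera. The function and its sign conventions are ours for illustration; the thesis's models may differ.

```python
import numpy as np

def attitude_from_horizon(m, c, fy, cx, cy):
    """m, c: slope and intercept (pixels) of the horizon line v = m*u + c
    fitted in the image; fy: focal length in pixels; (cx, cy): principal
    point. Roll tilts the line; pitch shifts it vertically. Signs depend
    on the camera-axis convention (here: u right, v down)."""
    roll = np.arctan(m)
    v_centre = m * cx + c          # horizon row at the image centre column
    pitch = np.arctan((v_centre - cy) / fy)
    return roll, pitch

# Example: a level horizon 40 px below the centre of a camera with
# fy = 800 px corresponds to roughly 2.9 degrees of pitch.
print(np.degrees(attitude_from_horizon(0.0, 280.0, 800.0, 320.0, 240.0)))
```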

    Fusion of Imaging and Inertial Sensors for Navigation

    The motivation of this research is to address the limitations of satellite-based navigation by fusing imaging and inertial systems. The research begins by rigorously describing the imaging and navigation problem and developing practical models of the sensors, then presents a transformation technique to detect features within an image. Given a set of features, a statistical feature projection technique is developed which utilizes inertial measurements to predict vectors in the feature space between images. This coupling of the imaging and inertial sensors at a deep level is then used to aid the statistical feature matching function. The feature matches and inertial measurements are then used to estimate the navigation trajectory using an extended Kalman filter. After a proper calibration is accomplished, the image-aided inertial navigation algorithm is tested using a combination of simulation and ground tests with both tactical- and consumer-grade inertial sensors. While limitations of the Kalman filter are identified, the experimental results demonstrate a navigation performance improvement of at least two orders of magnitude over the respective inertial-only solutions.
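    The statistical feature projection idea can be sketched as follows (our notation, not the paper's): project a 3-D feature through the inertially predicted camera pose, and push the 6-D pose uncertainty through a numerical Jacobian to obtain a 2-D search ellipse for the matcher.

```python
import numpy as np

def project(K, R, t, p_world):
    """Pinhole projection of a world point through camera pose (R, t)."""
    pc = R @ p_world + t
    uvw = K @ pc
    return uvw[:2] / uvw[2]

def search_ellipse(K, R, t, p_world, P_pose, eps=1e-6):
    """Predicted pixel location plus a 2x2 pixel covariance, obtained by
    mapping the 6-D pose covariance (3 rotation, 3 translation) through a
    numerical Jacobian of the projection."""
    def h(dx):
        S = np.array([[0, -dx[2], dx[1]],
                      [dx[2], 0, -dx[0]],
                      [-dx[1], dx[0], 0]])      # small-angle rotation
        return project(K, (np.eye(3) + S) @ R, t + dx[3:], p_world)
    uv0 = h(np.zeros(6))
    J = np.column_stack([(h(e) - uv0) / eps for e in eps * np.eye(6)])
    return uv0, J @ P_pose @ J.T
```

    Restricting the correspondence search to this ellipse is what couples the sensors "at a deep level": the inertial solution bounds where, and how confidently, the matcher must look.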

    Improving Navigation in GNSS-challenging Environments: Multi-UAS Cooperation and Generalized Dilution of Precision

    This paper presents an approach to tackle navigation challenges for Unmanned Aircraft Systems flying under non-nominal GNSS coverage. The concept used to improve navigation performance in these environments consists of using one or more cooperative platforms and relative sensing measurements (based on vision and/or ranging) as navigation aids. The paper details the cooperative navigation filter, which can exploit multiple cooperative platforms and multiple relative measurements while also using partial GNSS information. The achievable navigation accuracy can be predicted using the concept of "generalized dilution of precision", which derives from applying the idea of dilution of precision to the mathematical structure of the cooperative navigation filter. Values and trends of the generalized dilution of precision are discussed as a function of the relative geometry in common GNSS-challenging scenarios. Finally, navigation performance is assessed based on simulations and on multi-drone flight tests.
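    For context, the classical dilution-of-precision computation that the "generalized" version extends looks like the sketch below; the paper's cooperative, multi-measurement generalization is not reproduced here. The geometry examples are ours.

```python
import numpy as np

def dop(unit_los):
    """unit_los: (m, 3) unit vectors from the receiver to each aiding source.
    Geometry matrix rows are [u, 1] (the 1 absorbs a common bias/clock term);
    DOP = sqrt(trace((G^T G)^-1)) grows as the geometry degenerates."""
    G = np.hstack([unit_los, np.ones((unit_los.shape[0], 1))])
    return float(np.sqrt(np.trace(np.linalg.inv(G.T @ G))))

def unit_rows(a):
    return a / np.linalg.norm(a, axis=1, keepdims=True)

# Well-spread vs nearly collinear aiding geometry:
good = unit_rows(np.array([[1.0, 0, 0], [0, 1, 0], [0, 0, 1],
                           [-0.577, -0.577, -0.577]]))
bad = unit_rows(np.array([[1.0, 0, 0], [0.99, 0.1, 0],
                          [0.98, 0.15, 0.05], [0.97, 0.2, 0.1]]))
print(dop(good), dop(bad))   # the clustered geometry yields a much larger DOP
```

    The same trend drives the paper's analysis: when cooperative platforms cluster in one direction relative to the aided vehicle, the generalized DOP grows and predicted navigation accuracy degrades, regardless of individual sensor quality.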