
    Tightly Integrating Optical and Inertial Sensors for Navigation Using the UKF

    The motivation of this research is to address the benefits of tightly integrating optical and inertial sensors where GNSS signals are not available. The research begins by describing the navigation problem. Then, error and measurement models are presented. Given a set of features, a feature detection and projection algorithm is developed which utilizes inertial measurements to predict vectors in the feature space between images. The unscented Kalman filter is applied to the navigation system, using the inertial measurements and feature matches to estimate the navigation trajectory. Finally, the image-aided navigation algorithm is tested in simulation and in an experiment. As a result, the optical measurements combined with the inertial sensors yield improved performance for non-GNSS-based navigation.
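    The core of the unscented Kalman filter mentioned above is the unscented transform: deterministically chosen sigma points are pushed through the nonlinear measurement model instead of linearizing it. The following sketch is purely illustrative (it is not the paper's implementation, and the function name and tuning parameters `alpha`, `beta`, `kappa` are chosen here for the example):

```python
import numpy as np

def sigma_points(mean, cov, alpha=1e-3, beta=2.0, kappa=0.0):
    """Generate the 2n+1 sigma points and weights of the unscented transform."""
    n = mean.size
    lam = alpha**2 * (n + kappa) - n
    # Columns of the Cholesky factor of (n + lam) * cov spread the points.
    L = np.linalg.cholesky((n + lam) * cov)
    pts = np.vstack([mean, mean + L.T, mean - L.T])   # one sigma point per row
    wm = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))  # mean weights
    wc = wm.copy()                                    # covariance weights
    wm[0] = lam / (n + lam)
    wc[0] = wm[0] + (1.0 - alpha**2 + beta)
    return pts, wm, wc

# Propagate the points through a nonlinear measurement (polar -> Cartesian)
# and recombine; this is how such a filter handles nonlinear feature
# projections without Jacobians. Tunings here are illustrative.
mean, cov = np.array([1.0, 0.1]), np.eye(2) * 0.01
pts, wm, wc = sigma_points(mean, cov, alpha=1.0, kappa=1.0)
projected = np.apply_along_axis(
    lambda x: np.array([x[0] * np.cos(x[1]), x[0] * np.sin(x[1])]), 1, pts)
projected_mean = wm @ projected
```

    The weighted recombination recovers the mean and covariance of the transformed distribution to second order, which is why the UKF tolerates the strong nonlinearity of camera projections better than an EKF.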

    Toward an Autonomous Lunar Landing Based on Low-Speed Optic Flow Sensors

    For the last few decades, growing interest has returned to the challenging task of autonomous lunar landing. Soft landing of payloads on the lunar surface requires the development of new means of ensuring safe descent with strong final conditions and aerospace-related constraints in terms of mass, cost and computational resources. In this paper, a two-phase approach is presented: first, a biomimetic method inspired by the neuronal and sensory system of flying insects is presented as a solution to perform safe lunar landing. In order to design an autopilot relying only on optic flow (OF) and inertial measurements, an estimation method based on a two-sensor setup is introduced: these sensors allow us to accurately estimate the orientation of the velocity vector, which is mandatory to control the lander's pitch in a quasi-optimal way with respect to fuel consumption. Secondly, a new low-speed Visual Motion Sensor (VMS) inspired by insects' visual systems, performing local angular 1-D speed measurements ranging from 1.5°/s to 25°/s and weighing only 2.8 g, is presented. It was tested under free-flying outdoor conditions over various fields onboard an 80 kg unmanned helicopter. These preliminary results show that the optic flow measured, despite the complex disturbances encountered, closely matched the ground-truth optic flow.
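    The angular speeds quoted above follow from the standard translational optic-flow relation for a downward-looking sensor over flat ground, ω = v/h. A minimal sketch (the function name and numbers are this example's, not the paper's):

```python
import math

def ventral_optic_flow_deg_s(ground_speed_m_s, height_m):
    """Translational (ventral) optic flow seen by a downward-looking
    sensor over flat ground: omega = v / h (rad/s), returned in deg/s."""
    return math.degrees(ground_speed_m_s / height_m)

# At 2 m/s ground speed and 20 m altitude the flow is ~5.7 deg/s,
# inside the sensor's stated 1.5-25 deg/s measurement range.
flow = ventral_optic_flow_deg_s(2.0, 20.0)
```

    Holding such a flow value constant during descent is the essence of the insect-inspired landing strategies this line of work builds on: as height decreases, the controller is forced to reduce ground speed proportionally.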

    3D-Odometry for rough terrain - Towards real 3D navigation

    Until recently, autonomous mobile robots were mostly designed to run within indoor, partly structured and flat environments. In rough terrain many problems arise and position tracking becomes more difficult: the robot has to deal with wheel slippage and large orientation changes. In this paper we first present the recent developments on the off-road rover Shrimp. Then a new method, called 3D-Odometry, which extends standard 2D odometry to 3D space, is developed. Since it accounts for terrain transitions, 3D-Odometry provides better position estimates. It is a step towards real 3D navigation for outdoor robots.
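    The basic idea of lifting 2D odometry into 3D is to rotate each wheel-frame displacement increment by the robot's attitude before accumulating it, so that driving up a slope changes z as well as x/y. A simplified sketch under assumed conventions (ZYX Euler angles, z-up world frame; not the paper's exact formulation, which also models inter-wheel kinematics):

```python
import numpy as np

def rotation_matrix(roll, pitch, yaw):
    """ZYX Euler angles to a body-to-world rotation matrix (z-up world)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    return Rz @ Ry @ Rx

def step_3d_odometry(position, attitude_rpy, wheel_displacement):
    """Rotate the wheel-frame displacement into the world frame before
    accumulating it, so slopes and transitions update z as well as x/y."""
    return position + rotation_matrix(*attitude_rpy) @ wheel_displacement

# Driving 1 m "forward" while tilted nose-up by 30 deg (a negative pitch
# in this convention) climbs 0.5 m instead of covering 1 m horizontally.
pos = step_3d_odometry(np.zeros(3),
                       (0.0, -np.radians(30.0), 0.0),
                       np.array([1.0, 0.0, 0.0]))
```

    Plain 2D odometry would have credited the full metre to horizontal travel, which is exactly the error that accumulates on slope transitions.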

    INDOOR POSITIONING BY VISUAL-INERTIAL ODOMETRY

    Indoor positioning is a fundamental requirement of many indoor location-based services and applications. In this paper, we explore the potential of low-cost and widely available visual and inertial sensors for indoor positioning. We describe the Visual-Inertial Odometry (VIO) approach and propose a measurement model for omnidirectional visual-inertial odometry (OVIO). The results of experiments in two simulated indoor environments show that the OVIO approach outperforms VIO and achieves a positioning accuracy of 1.1% of the trajectory length.
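    Accuracy figures like "1.1% of the trajectory length" are a relative-drift metric: the final position error divided by the distance travelled. A small sketch of one common way to compute it (the function name and data are illustrative, not taken from the paper):

```python
import numpy as np

def drift_percent(estimated_path, true_path):
    """Final-position error as a percentage of the true trajectory length,
    a common way to report odometry accuracy."""
    final_error = np.linalg.norm(estimated_path[-1] - true_path[-1])
    path_length = np.sum(np.linalg.norm(np.diff(true_path, axis=0), axis=1))
    return 100.0 * final_error / path_length

# A 100 m straight trajectory whose estimate ends 1.1 m off course
# corresponds to a 1.1% drift figure.
true_path = np.array([[0.0, 0.0], [50.0, 0.0], [100.0, 0.0]])
estimated_path = np.array([[0.0, 0.0], [50.0, 0.6], [100.0, 1.1]])
drift = drift_percent(estimated_path, true_path)
```

    Reporting drift relative to path length makes results comparable across trajectories of different scales, which is why odometry papers favour it over absolute error.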

    Passive GPS-Free Navigation for Small UAVs

    Abstract — A method for passive GPS-free navigation of a small Unmanned Aerial Vehicle with a minimal sensor suite (limited to an inertial measurement unit and a monocular camera) is presented. The navigation task is cast as a Simultaneous Localization and Mapping (SLAM) problem. While SLAM has been the subject of a great deal of research, the highly non-linear system dynamics and limited sensor suite available in this application present a unique set of challenges which have not previously been addressed. In this particular application, solutions based on Extended Kalman Filters have been shown to diverge, and alternate techniques are required. In this paper an Unscented Kalman Filter is applied to the navigation problem, which leads to a consistent estimate of vehicle and feature states. This paper presents: (a) simulation…

    One Approach to the Fusion of Inertial Navigation and Dynamic Vision


    Performance Analysis for Visual Planetary Landing Navigation Using Optical Flow and DEM Matching

    Visual navigation for planetary landing vehicles presents many scientific and technical challenges due to inclined and rather high-velocity approach trajectories, complex 3D environments and high computational requirements for real-time image processing. High relative navigation accuracy at the landing site is required for obstacle avoidance and operational constraints. The current paper discusses detailed performance analysis results for a recently published concept of a visual navigation system, based on a mono camera as vision sensor and matching of the recovered and reference 3D models of the landing site. The recovered 3D models are produced by real-time, instantaneous optical flow processing of the navigation camera images. An embedded optical correlator is introduced, which allows robust and ultra-high-speed optical flow processing under different and even unfavorable illumination conditions. The performance analysis is based on a detailed software simulation model of the visual navigation system, including the optical correlator as the key component for ultra-high-speed image processing. The paper recalls the general structure of the navigation system and presents detailed end-to-end visual navigation performance results for a Mercury landing reference mission in terms of different visual navigation entry conditions, reference DEM resolution, navigation camera configuration and auxiliary sensor information.

    A low-cost vision based navigation system for small size unmanned aerial vehicle applications

    Not available.

    Vision-Aided Inertial Navigation

    This document discloses, among other things, a system and method for implementing an algorithm to determine pose, velocity, acceleration or other navigation information using feature tracking data. The algorithm has computational complexity that is linear in the number of features tracked.