402 research outputs found

    Low-cost navigation and guidance systems for unmanned aerial vehicles - part 1: Vision-based and integrated sensors

    In this paper we present a new low-cost navigation system for small-size Unmanned Aerial Vehicles (UAVs) based on Vision-Based Navigation (VBN) and other avionics sensors. The main objective of our research was to design a compact, light and relatively inexpensive system capable of providing the Required Navigation Performance (RNP) in all phases of flight of a small UAV, with a special focus on precision approach and landing, where VBN techniques can be fully exploited in a multisensor integrated architecture. Various existing techniques for VBN were compared and the Appearance-Based Approach (ABA) was selected for implementation. Feature extraction and optical flow techniques were employed to estimate flight parameters such as roll angle, pitch angle, deviation from the runway and body rates. Additionally, we addressed the possible synergies between VBN, Global Navigation Satellite System (GNSS) and MEMS-IMU (Micro-Electromechanical System Inertial Measurement Unit) sensors, as well as the aiding provided by Aircraft Dynamics Models (ADMs).
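The abstract gives no implementation detail, but the optical-flow step it mentions can be sketched with a minimal, numpy-only Lucas-Kanade estimator of a single global image translation (the whole-image formulation and all names here are illustrative assumptions, not the authors' code):

```python
import numpy as np

def lucas_kanade_flow(I1, I2):
    """Estimate one global translation (u, v) between two grayscale
    frames via the Lucas-Kanade least-squares formulation, applied
    over the whole image instead of a local window."""
    Iy, Ix = np.gradient(I1.astype(float))    # spatial gradients (axis 0 = y)
    It = I2.astype(float) - I1.astype(float)  # temporal gradient
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v
```

In the paper's setting, flow estimates like these, together with extracted features, would feed the estimation of roll angle, pitch angle, runway deviation and body rates.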

    Visual navigation and path tracking using street geometry information for image alignment and servoing

    Single-camera navigation systems need information from other sensors or from the working environment to produce reliable and accurate position measurements. Providing such trustworthy, accurate, and readily available information in the environment is very important. This work highlights that the availability of well-described streets in urban environments can be exploited by drones for navigation and path tracking, so the benefit of such structures is not limited to automated driving cars. While the drone position is continuously computed using visual odometry, scene matching is used to correct the position drift with respect to selected landmarks. The drone path is defined by several waypoints, and landmarks centered on those waypoints are carefully chosen at street intersections. The known geometry and dimensions of the streets are used to estimate the image scale and orientation, which are necessary for image alignment, to compensate for the visual odometry drift, and to pass closer to the landmark center during the visual servoing process. A probabilistic Hough transform is used to detect and extract the street borders. The system is realized in a simulation environment consisting of the Robot Operating System (ROS), the 3D dynamic simulator Gazebo, and the IRIS drone model. The results demonstrate the efficiency of the suggested system, with a position RMS error of 1.4 m.
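As an illustration of how a known street width can fix the image scale and orientation once the two borders are extracted (the two-line border representation and all names are assumptions for this sketch, not the authors' implementation):

```python
import numpy as np

def street_scale_and_orientation(p1a, p1b, p2a, street_width_m):
    """Given two endpoints (pixels) of one detected street border, one
    point on the opposite border, and the known street width in metres,
    return the image scale (m/px) and the street orientation (radians)."""
    p1a, p1b, p2a = (np.asarray(p, float) for p in (p1a, p1b, p2a))
    d = p1b - p1a                                     # border direction
    theta = np.arctan2(d[1], d[0])
    n = np.array([-d[1], d[0]]) / np.linalg.norm(d)   # unit normal
    width_px = abs(np.dot(p2a - p1a, n))              # perpendicular distance
    return street_width_m / width_px, theta
```

The recovered scale and orientation are exactly what the abstract says is needed for image alignment and for steering the drone closer to the landmark center.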

    Autonomous Target Tracking Of A Quadrotor UAV Using Monocular Visual-Inertial Odometry

    Unmanned Aerial Vehicles (UAVs) have been finding their way into many different applications; hence, recent years have witnessed extensive research towards achieving higher UAV autonomy. For real-time pose estimation, Computer Vision (CV) algorithms can replace the Global Navigation Satellite System (GNSS), which is unreliable in bad weather, inside buildings, or in secluded areas. The controller then uses the pose to navigate the UAV. This project presents a simulation, in MATLAB & Simulink, of a UAV capable of autonomously detecting and tracking a designed visual marker. Building on and improving state-of-the-art CV algorithms, a new approach is formulated to detect the designed visual marker. Combining data from the monocular camera with that from an Inertial Measurement Unit (IMU) and a sonar sensor enables estimation of the pose of the UAV relative to the designed visual marker. A Proportional-Integral-Derivative (PID) controller then uses the pose of the UAV to navigate it so that it always follows the target of interest.
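The PID pose controller mentioned above can be illustrated by a minimal single-axis sketch (the gains and time step are illustrative, not the project's tuning):

```python
class PID:
    """Minimal single-axis PID controller."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, error):
        """Return the control output for the current tracking error."""
        self.integral += error * self.dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv
```

In a setup like the one described, one such controller would run per controlled axis, the error being the difference between the marker-relative pose estimate and the desired pose.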

    Design and integration of vision based sensors for unmanned aerial vehicles navigation and guidance

    In this paper we present a novel Navigation and Guidance System (NGS) for Unmanned Aerial Vehicles (UAVs) based on Vision Based Navigation (VBN) and other avionics sensors. The main objective of our research is to design a low-cost and low-weight/volume NGS capable of providing the required level of performance in all flight phases of modern small- to medium-size UAVs, with a special focus on automated precision approach and landing, where VBN techniques can be fully exploited in a multisensor integrated architecture. Various existing techniques for VBN are compared and the Appearance-based Navigation (ABN) approach is selected for implementation.

    A low-cost vision based navigation system for small size unmanned aerial vehicle applications

    Not available.

    Low-cost vision sensors and integrated systems for unmanned aerial vehicle navigation

    A novel low-cost navigation system based on Vision Based Navigation (VBN) and other avionics sensors is presented, designed for small-size Unmanned Aerial Vehicle (UAV) applications. The main objective of our research is to design a compact, light and relatively inexpensive system capable of providing the required navigation performance in all phases of flight of a small UAV, with a special focus on precision approach and landing, where VBN techniques can be fully exploited in a multisensor integrated architecture. Various existing techniques for VBN are compared and the Appearance-based Navigation (ABN) approach is selected for implementation. Feature extraction and optical flow techniques are employed to estimate flight parameters such as roll angle, pitch angle, deviation from the runway and body rates. Additionally, we address the possible synergies between VBN, Global Navigation Satellite System (GNSS) and MEMS-IMU (Micro-Electromechanical System Inertial Measurement Unit) sensors, as well as the use of Aircraft Dynamics Models (ADMs) to provide additional information suitable for compensating the shortcomings of VBN and MEMS-IMU sensors in high-dynamics attitude determination tasks. An Extended Kalman Filter (EKF) is developed to fuse the information provided by the different sensors and to provide real-time estimates of the position, velocity and attitude of the UAV platform. Two different integrated navigation system architectures are implemented. The first uses VBN at 20 Hz and GPS at 1 Hz to augment the MEMS-IMU running at 100 Hz. The second mode also includes the ADM (computations performed at 100 Hz) to augment the attitude channel. These two modes are simulated over a significant portion of the AEROSONDE UAV operational flight envelope, performing a variety of representative manoeuvres (i.e., straight climb, level turning, turning descent and climb, straight descent, etc.).
Simulation of the first integrated navigation system architecture (VBN/IMU/GPS) shows that the integrated system can reach position, velocity and attitude accuracies compatible with CAT-II precision approach requirements. Simulation of the second architecture (VBN/IMU/GPS/ADM) also shows promising results, since the attitude accuracy achieved using the ADM/VBN/IMU combination is higher than that achieved using VBN/IMU only. However, due to the rapid divergence of the ADM virtual sensor, frequent re-initialisation of the ADM data module is needed, which is strongly dependent on the UAV flight dynamics and the specific manoeuvring transitions performed.
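The multi-rate fusion idea can be illustrated, in heavily simplified single-axis form, by a linear Kalman filter that predicts with every accelerometer sample and corrects with a lower-rate position fix (the noise values, rates and names are illustrative; the paper's filter is a full EKF over position, velocity and attitude):

```python
import numpy as np

def fuse_imu_gps(imu_acc, gps_pos, dt, gps_every):
    """Single-axis KF: state [pos, vel]; predict with each accelerometer
    sample, correct with a position fix every `gps_every` IMU steps."""
    x = np.zeros(2)
    P = np.eye(2)
    F = np.array([[1.0, dt], [0.0, 1.0]])
    B = np.array([0.5 * dt * dt, dt])
    Q = 1e-6 * np.eye(2)            # process noise (illustrative)
    H = np.array([[1.0, 0.0]])
    R = np.array([[0.25]])          # GPS position variance (illustrative)
    gi = 0
    for k, a in enumerate(imu_acc):
        x = F @ x + B * a            # IMU prediction step
        P = F @ P @ F.T + Q
        if (k + 1) % gps_every == 0 and gi < len(gps_pos):
            y = gps_pos[gi] - H @ x  # innovation
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + (K @ y).ravel()
            P = (np.eye(2) - K @ H) @ P
            gi += 1
    return x
```

In the paper's architecture the same predict/correct pattern runs with the MEMS-IMU at 100 Hz, VBN corrections at 20 Hz and GPS corrections at 1 Hz.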

    A Vision-Based Algorithm for UAV State Estimation During Vehicle Recovery

    A computer vision-based algorithm for Unmanned Aerial Vehicle (UAV) state estimation during vehicle recovery is presented. The algorithm is intended to augment or back up the Global Positioning System as the primary means of navigation during vehicle recovery for UAVs. The method requires a clearly visible recovery target of known geometry, with markers placed on its corners. The algorithm uses clustering techniques to identify the markers; a Canny edge detector and a Hough transform to verify that these markers actually lie on the recovery target; an optimizer to match the detected markers with coordinates in three-space; a non-linear transformation and projection solver to recover the position and orientation of the camera; and an Extended Kalman Filter (EKF) to improve the tracking of the state estimate. While it must be acknowledged that the resolution of the test images is much higher than that of images used in previous algorithms, and that the images used to test this algorithm are either synthetic or taken under static conditions, the algorithm presented gives much better state estimates than previously developed vision systems.
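The projection solver mentioned above inverts the standard pinhole forward model; a minimal sketch of that forward model follows (the frame conventions and focal-length parameterisation are assumptions for illustration, not the paper's exact formulation):

```python
import numpy as np

def project(points_w, R, t, f):
    """Project known target-corner coordinates (world frame, Nx3) into
    the image through a pinhole camera with rotation R, translation t
    and focal length f (pixels). Returns Nx2 pixel coordinates."""
    pc = points_w @ R.T + t            # world -> camera frame
    return f * pc[:, :2] / pc[:, 2:3]  # perspective division
```

A solver of the kind described searches for the camera pose (R, t) that makes these projected corners match the detected marker pixels.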

    Vision-Based Autonomous Landing of a Quadrotor on the Perturbed Deck of an Unmanned Surface Vehicle

    Autonomous landing on the deck of an unmanned surface vehicle (USV) is still a major challenge for unmanned aerial vehicles (UAVs). In this paper, a fiducial marker is placed on the platform to facilitate the task, since its six-degree-of-freedom relative pose can be retrieved in a straightforward way. To compensate for interruptions in the marker observations, an extended Kalman filter (EKF) estimates the current USV position with reference to the last known position. Validation experiments have been performed in a simulated environment under various marine conditions. The results confirm that the EKF provides estimates accurate enough to direct the UAV into the proximity of the autonomous vessel, such that the marker becomes visible again. Using only odometry and inertial measurements for the estimation, this method is applicable even under adverse weather conditions and in the absence of the Global Positioning System.
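The "predict through the dropout" behaviour can be sketched for a single axis: while the marker is occluded, the EKF keeps running its prediction step with no measurement update, and its covariance grows until the marker is reacquired (the constant-velocity state, process-noise model and values are illustrative, not the paper's):

```python
import numpy as np

def coast(x, P, dt, steps, q=0.1):
    """EKF predict-only coasting while the marker is occluded:
    constant-velocity model, state [pos, vel]; covariance P grows
    each step, reflecting increasing uncertainty."""
    F = np.array([[1.0, dt], [0.0, 1.0]])
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])   # white-acceleration process noise
    for _ in range(steps):
        x = F @ x
        P = F @ P @ F.T + Q
    return x, P
```

The growing covariance is what lets the filter report how far the deck estimate can be trusted before the UAV must reacquire the marker.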