
    Guidance, Navigation and Control for UAV Close Formation Flight and Airborne Docking

    Unmanned aerial vehicle (UAV) capability is currently limited by the amount of energy that can be stored onboard or the small amount that can be gathered from the environment. This has historically led to large, expensive vehicles with considerable fuel capacity. Airborne docking, for aerial refueling, is a viable solution that has been proven through decades of implementation with manned aircraft, but had not been successfully tested or demonstrated with UAVs. The prohibitive challenge is the highly accurate and reliable relative positioning performance required to dock with a small target, in the air, amidst external disturbances. GNSS-based navigation systems are well suited for reliable absolute positioning, but fall short for accurate relative positioning. Direct relative sensor measurements are precise, but can be unreliable in dynamic environments. This work proposes an experimentally verified guidance, navigation and control solution that enables a UAV to autonomously rendezvous and dock with a drogue being towed by another autonomous UAV. A nonlinear estimation framework uses precise air-to-air visual observations to correct onboard sensor measurements and produce an accurate relative state estimate. The state of the drogue is estimated using known geometric and inertial characteristics and air-to-air observations. Setpoint augmentation algorithms compensate for leader turn dynamics during formation flight, and for drogue physical constraints during docking. Vision-aided close formation flight has been demonstrated over extended periods, as close as 4 m, in wind speeds in excess of 25 km/h, and at altitudes as low as 15 m. Docking flight tests achieved numerous airborne connections over multiple flights, including five successful docking manoeuvres in seven minutes of a single flight. To the best of our knowledge, these are the closest formation flights performed outdoors and the first UAV airborne docking.
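
    As an illustration of the sensor-fusion idea described in this abstract, the sketch below shows a standard Kalman-filter measurement update in which a precise air-to-air visual observation corrects a coarser relative position estimate. It is a minimal sketch under assumed noise values and a direct position measurement model, not the authors' estimation framework.

```python
# Minimal sketch (not the authors' implementation): correcting a GNSS-level
# relative position estimate with a precise air-to-air visual measurement
# via a standard Kalman filter update. Noise values are illustrative.
import numpy as np

def visual_update(x_rel, P, z_visual, R_visual):
    """Fuse a direct visual measurement of relative position (z_visual)
    into the current relative state estimate x_rel with covariance P."""
    H = np.eye(3)                      # visual sensor observes relative position directly
    S = H @ P @ H.T + R_visual         # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x_rel + K @ (z_visual - H @ x_rel)
    P_new = (np.eye(3) - K @ H) @ P
    return x_new, P_new

# Example: noisy GNSS-derived relative position corrected by a sharper visual fix
x_rel = np.array([10.0, 0.5, -1.0])            # metres, leader minus follower
P = np.diag([4.0, 4.0, 4.0])                   # GNSS-level relative uncertainty
z_visual = np.array([9.2, 0.1, -0.8])          # camera-derived relative position
R_visual = np.diag([0.05, 0.05, 0.05])         # much tighter visual noise
x_corrected, P_corrected = visual_update(x_rel, P, z_visual, R_visual)
print(x_corrected, np.diag(P_corrected))
```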

    Optical Tracking for Relative Positioning in Automated Aerial Refueling

    An algorithm is designed to extract features from video of an air refueling tanker for use in determining the precise relative position of a receiver aircraft. The algorithm is based on receiving a known estimate of the tanker aircraft's position and attitude. The algorithm then uses a known feature model of the tanker to predict the location of those features on a video frame. A corner detector is used to extract features from the video. The measured corners are then associated with known features and tracked from frame to frame. For each frame, the associated features are used to calculate three-dimensional pointing vectors to the features of the tanker. These vectors are passed to a navigation algorithm which uses extended Kalman filters, as well as data-linked INS data, to solve for the relative position of the tanker. The algorithms were tested using data from a flight test accomplished by the USAF Test Pilot School using a C-12C as a simulated tanker and a Learjet LJ-24 as the simulated receiver. The system was able to provide at least a dozen useful measurements per frame, with and without projection error.
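
    The short sketch below illustrates the geometry this abstract describes: projecting known tanker feature points into the camera to predict where corners should appear, and converting measured pixels back into three-dimensional pointing vectors. The camera intrinsics, feature coordinates, and function names are illustrative assumptions, not the flight-test software.

```python
# Illustrative sketch of the predict-then-measure geometry (not the flight code):
# project known tanker feature points into the camera to predict where the
# corner detector should find them, then turn detected pixels back into
# 3-D pointing vectors for the navigation filter. All values are made up.
import numpy as np

K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])          # assumed pinhole intrinsics

def predict_pixels(features_tanker, R_cam_tanker, t_cam_tanker):
    """Predict pixel locations of known tanker features (Nx3, tanker frame)
    given the estimated tanker pose relative to the receiver's camera."""
    pts_cam = (R_cam_tanker @ features_tanker.T).T + t_cam_tanker
    uvw = (K @ pts_cam.T).T
    return uvw[:, :2] / uvw[:, 2:3]

def pointing_vectors(pixels):
    """Convert measured corner pixels into unit pointing vectors in the
    camera frame; these would feed the relative-navigation Kalman filter."""
    ones = np.ones((pixels.shape[0], 1))
    rays = (np.linalg.inv(K) @ np.hstack([pixels, ones]).T).T
    return rays / np.linalg.norm(rays, axis=1, keepdims=True)

# Example with three fictitious feature points roughly 30 m ahead of the camera
features = np.array([[0.0, 2.0, 0.0], [0.0, -2.0, 0.0], [1.5, 0.0, 0.5]])
R, t = np.eye(3), np.array([0.0, 0.0, 30.0])
px = predict_pixels(features, R, t)
print(pointing_vectors(px))
```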

    Cooperative Virtual Sensor for Fault Detection and Identification in Multi-UAV Applications

    This paper considers the problem of fault detection and identification (FDI) in applications carried out by a group of unmanned aerial vehicles (UAVs) with visual cameras. In many cases, the UAVs have cameras mounted onboard for other applications, and these cameras can be used as bearing-only sensors to estimate the relative orientation of another UAV. The idea is to exploit the redundant information provided by these sensors onboard each of the UAVs to increase safety and reliability, detecting faults in UAV internal sensors that cannot be detected by the UAVs themselves. Fault detection is based on the generation of residuals which compare the expected position of a UAV, considered as the target, with the measurements taken by one or more UAVs acting as observers that are tracking the target UAV with their cameras. Depending on the number of available observers and the way they are used, a set of strategies and policies for fault detection is defined. When the target UAV is being visually tracked by two or more observers, it is possible to obtain an estimate of its 3D position that could replace damaged sensors. The accuracy and reliability of this vision-based cooperative virtual sensor (CVS) have been evaluated experimentally in a multivehicle indoor testbed with quadrotors, injecting faults into the data to validate the proposed fault detection methods. Funding: Comisión Europea H2020 644271; Comisión Europea FP7 288082; Ministerio de Economia, Industria y Competitividad DPI2015-71524-R; Ministerio de Economia, Industria y Competitividad DPI2014-5983-C2-1-R; Ministerio de Educación, Cultura y Deporte FP
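
    A minimal sketch of the residual idea follows, assuming a simple bearing-only measurement model: an observer compares the bearing it measures to the target with the bearing implied by the target's own reported position, and flags a fault when the angular discrepancy exceeds a threshold. The names and the threshold are assumptions, not the paper's implementation.

```python
# Hedged sketch of residual-based fault detection with a bearing-only camera
# observation; threshold and values are illustrative assumptions only.
import numpy as np

def bearing(from_pos, to_pos):
    """Unit vector pointing from one position to another."""
    d = np.asarray(to_pos, float) - np.asarray(from_pos, float)
    return d / np.linalg.norm(d)

def bearing_residual(observer_pos, reported_target_pos, measured_bearing):
    """Angle (rad) between the bearing implied by the target's own reported
    position and the bearing actually measured by the observer's camera."""
    expected = bearing(observer_pos, reported_target_pos)
    cosang = np.clip(np.dot(expected, measured_bearing), -1.0, 1.0)
    return np.arccos(cosang)

THRESHOLD_RAD = np.deg2rad(5.0)   # illustrative detection threshold

observer = [0.0, 0.0, 2.0]
reported = [4.0, 3.0, 2.0]                     # target's (possibly faulty) own estimate
measured = bearing(observer, [4.0, 4.5, 2.0])  # where the camera actually sees the target
residual = bearing_residual(observer, reported, measured)
print(residual, residual > THRESHOLD_RAD)      # fault flagged if the residual is too large
```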

    Hybrid Testing of an Aerial Refuelling Drogue

    Image Dependent Relative Formation Navigation for Autonomous Aerial Refueling

    This research tests the feasibility, accuracy, and reliability of a predictive rendering and holistic comparison algorithm that uses an optical sensor to provide relative distance and position behind a lead or tanker aircraft. Using an accurate model of a tanker, an algorithm renders image(s) for comparison with actual images collected by a camera installed on the receiver aircraft. Based on this comparison, the information used to create the rendered image(s) provides the relative navigation solution required for autonomous air refueling. Given enough predicted images and processing time, this approach should reliably find an accurate solution. Building on previous work, this research aims to minimize the number of rendered images required to provide a real-time navigation solution with sufficient accuracy for an autopilot controller installed on future Unmanned Aircraft Systems.
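
    The sketch below illustrates the predict-render-compare loop in its simplest form: render candidate images for a handful of hypothesised relative poses, score each against the camera frame, and keep the best-matching pose. The render() function is a stand-in for a real tanker-model renderer, and the scoring is a plain sum of squared differences; both are assumptions for illustration only.

```python
# Minimal sketch of predictive rendering and holistic comparison (not the
# thesis code). render() fakes a renderer by producing a deterministic
# synthetic image per pose; a real system would rasterise the tanker model.
import numpy as np

def render(pose, shape=(48, 64)):
    """Hypothetical renderer: deterministic synthetic image for a relative pose."""
    seed = abs(hash(tuple(np.round(pose, 2)))) % (2**32)
    return np.random.default_rng(seed).random(shape)

def best_pose(camera_image, candidate_poses):
    """Pick the candidate pose whose rendering best matches the camera image
    under a simple sum-of-squared-differences score."""
    scores = [np.sum((render(p) - camera_image) ** 2) for p in candidate_poses]
    return candidate_poses[int(np.argmin(scores))]

# Nine lateral-offset hypotheses at a fixed 30 m stand-off
candidates = [np.array([x, 0.0, 30.0]) for x in np.linspace(-2.0, 2.0, 9)]
camera_frame = render(np.array([0.5, 0.0, 30.0]))   # pretend camera observation
print(best_pose(camera_frame, candidates))          # recovers the 0.5 m offset pose
```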

    Unmanned Aerial Systems: Research, Development, Education & Training at Embry-Riddle Aeronautical University

    With technological breakthroughs in miniaturized aircraft-related components, including but not limited to communications, computer systems and sensors, state-of-the-art unmanned aerial systems (UAS) have become a reality. This fast-growing industry is anticipating and responding to a myriad of societal applications, providing new and more cost-effective solutions than previous technologies could, or replacing activities that involved humans in flight and the associated risks. Embry-Riddle Aeronautical University has a long history of aviation-related research and education, and is heavily engaged in UAS activities. This document provides a summary of these activities and is divided into two parts. The first part provides a brief summary of each of the various activities, while the second part lists the faculty associated with those activities. Within the first part of this document we have separated UAS activities into two broad areas: Engineering and Applications. Each of these broad areas is then further broken down into six sub-areas, which are listed in the Table of Contents. The second part lists the faculty, sorted by campus (Daytona Beach-D, Prescott-P and Worldwide-W), associated with the UAS activities. The UAS activities and the corresponding faculty are cross-referenced. We have chosen to provide very short summaries of the UAS activities rather than lengthy descriptions. If more information is desired, please contact me directly, visit our research website (https://erau.edu/research), or contact the appropriate faculty member using the e-mail address provided at the end of this document.

    Cooperative Sensor Fault Recovery in Multi-UAV Systems

    IEEE International Conference on Robotics and Automation (ICRA), 16-21 May 2016, Stockholm, Sweden. This paper presents the design and experimental validation of a Fault Detection, Identification and Recovery (FDIR) system intended for multi-UAV applications. The system exploits the information provided by internal position, attitude and visual sensors onboard the UAVs of the fleet to detect faults in the measurements of the position and attitude sensors of any of the member vehicles. By considering the observations provided by two or more UAVs in a cooperative way, it is possible not only to identify the source of the fault, but also to implement a Cooperative Virtual Sensor (CVS) which provides a redundant position and velocity estimate of the faulty UAV that can be used to replace its internal sensor. The vision-based FDIR system has been validated experimentally with quadrotors in an indoor testbed. In particular, fault detection and identification have been evaluated by injecting a fault pattern offline on the position measurements, while the CVS has been applied in real time for the recovery phase. Funding: Ministerio de Educación, Cultura y Deporte ICT-2011-28808
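
    As a rough illustration of the Cooperative Virtual Sensor, the sketch below triangulates a target position from two bearing-only observations by solving for the point closest to both viewing rays in a least-squares sense. Positions, bearings, and function names are made up for the example and are not taken from the paper.

```python
# Illustrative sketch (not the ICRA paper's code): two observer UAVs with
# bearing-only camera measurements triangulate the faulty UAV's position as
# the least-squares point closest to both viewing rays.
import numpy as np

def unit(v):
    v = np.asarray(v, float)
    return v / np.linalg.norm(v)

def triangulate(p1, d1, p2, d2):
    """Closest point to the two rays p_i + t * d_i (d_i unit vectors)."""
    def projector(d):
        return np.eye(3) - np.outer(d, d)     # projects onto the plane normal to d
    A = projector(d1) + projector(d2)
    b = projector(d1) @ p1 + projector(d2) @ p2
    return np.linalg.solve(A, b)

# Made-up geometry: two observers looking at a target at [3, 2, 1.5]
target_true = np.array([3.0, 2.0, 1.5])
obs1, obs2 = np.array([0.0, 0.0, 1.0]), np.array([6.0, 0.0, 1.0])
d1, d2 = unit(target_true - obs1), unit(target_true - obs2)   # ideal bearings
print(triangulate(obs1, d1, obs2, d2))   # recovers approximately [3, 2, 1.5]
```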

    Autonomous ground refuelling approach for civil aircrafts using computer vision and robotics

    3D visual servoing systems must detect the object and estimate its pose in order to operate. As a result, accurate and fast object detection and pose estimation play a vital role. Most visual servoing methods use low-level object detection and pose estimation algorithms. However, many approaches detect objects in 2D RGB sequences for servoing, which lacks reliability when estimating the object's pose in 3D space. To cope with these problems, a joint feature extractor is first employed to fuse the object's 2D RGB image and 3D point cloud data. On this basis, a novel method called PosEst is proposed to exploit the correlation between 2D and 3D features. On test data, the custom model achieved a precision of 0.9756, a recall of 0.9876, an F-score (beta=1) of 0.9815, and an F-score (beta=2) of 0.9779. The method used in this study can be readily applied to 3D grasping and 3D tracking problems to make those solutions faster and more accurate. In a period where electric vehicles and autonomous systems are gradually becoming a part of our lives, this study offers a safer, more efficient and more comfortable environment.
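
    For reference, the snippet below shows how the reported scores relate to precision and recall under the standard F-beta definition; small differences from the published beta=2 value may stem from rounding or from the weighting convention used in the study.

```python
# Standard F-beta score from precision and recall; values reproduce the
# reported F1 closely, while the beta=2 figure depends on the convention used.
def f_beta(precision, recall, beta=1.0):
    """F-beta score; beta > 1 weights recall more heavily than precision."""
    b2 = beta ** 2
    return (1 + b2) * precision * recall / (b2 * precision + recall)

precision, recall = 0.9756, 0.9876
print(round(f_beta(precision, recall, beta=1.0), 4))   # ~0.9816 (paper reports 0.9815)
print(round(f_beta(precision, recall, beta=2.0), 4))   # standard beta=2 definition
```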

    2003 Research Engineering Annual Report

    Selected research and technology activities at Dryden Flight Research Center are summarized. These activities exemplify the Center's varied and productive research efforts.