
    Vision based strategies for implementing Sense and Avoid capabilities onboard Unmanned Aerial Systems

    Current research activities aim to develop fully autonomous unmanned platforms equipped with Sense and Avoid technologies, so that they can gain access to the National Airspace System (NAS) and fly alongside manned aircraft. The TECVOL project is set in this framework and aims at developing a prototype autonomous Unmanned Aerial Vehicle that performs Detect, Sense and Avoid functions by means of an integrated sensor package composed of a pulsed radar and four electro-optical cameras, two visible and two infrared. The project is carried out by the Italian Aerospace Research Center in collaboration with the Department of Aerospace Engineering of the University of Naples “Federico II”, which has been responsible for developing the Obstacle Detection and IDentification system. This thesis concerns the image processing technique customized for the Sense and Avoid applications of the TECVOL project, in which the electro-optical system plays an auxiliary role to the radar, the main sensor. In particular, the panchromatic camera aids object detection in order to increase the accuracy and data rate of the radar system. The thesis therefore describes the steps taken to select the most suitable panchromatic-camera image processing technique for these applications, the test strategies adopted to study its performance, and the analysis conducted to optimize it in terms of false alarms, missed detections and detection range. Finally, test results are presented; they demonstrate that the electro-optical sensor benefits the overall Detect, Sense and Avoid system by improving its object detection and tracking performance.
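
    The abstract does not spell out the detection algorithm itself, but a common starting point for spotting small airborne objects in a panchromatic frame is morphological top-hat filtering followed by thresholding and connected-component analysis. The sketch below (Python/OpenCV) illustrates that generic approach with illustrative parameter values; it is not the technique actually implemented in TECVOL.

        import cv2
        import numpy as np

        # Illustrative parameters; kernel size and threshold trade false alarms
        # against missed detections and detection range.
        KERNEL = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (9, 9))
        THRESHOLD = 18                 # hypothetical intensity threshold after filtering
        MIN_AREA, MAX_AREA = 2, 200    # plausible pixel-area bounds for a distant aircraft

        def detect_candidates(gray_frame):
            """Return bounding boxes of small bright blobs against a smooth sky."""
            # Top-hat filtering suppresses the slowly varying sky background
            residual = cv2.morphologyEx(gray_frame, cv2.MORPH_TOPHAT, KERNEL)
            _, mask = cv2.threshold(residual, THRESHOLD, 255, cv2.THRESH_BINARY)
            n, _, stats, _ = cv2.connectedComponentsWithStats(mask, connectivity=8)
            boxes = []
            for i in range(1, n):      # label 0 is the background
                x, y, w, h, area = stats[i]
                if MIN_AREA <= area <= MAX_AREA:
                    boxes.append((x, y, w, h))
            return boxes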

    Visual Detection of Small Unmanned Aircraft: Modeling the Limits of Human Pilots

    The purpose of this study was to determine the key physical variables for visual detection of small Unmanned Aircraft Systems (UAS), and to learn how these variables influence the ability of human pilots, in manned aircraft operating between 60 and 160 knots in the airport terminal area, to see these small unmanned aircraft in time to avoid a collision. The study also produced a set of probability curves for various operating scenarios, depicting the likelihood of visually detecting a small unmanned aircraft in time to avoid colliding with it. The study used the known limits of human visual acuity, based on the mechanics of the human eye and previous research on human visual detection of distant objects, to define the human performance constraints for the visual search task. The results of the analysis suggest the probability of detection, in all cases modeled during the study, is far less than 50 percent. The probability of detection was well under 10 percent for small UAS similar to the products used by many recreational and hobby operators. The results of this study indicate the concept of see-and-avoid is not a reliable technique for collision prevention by manned-aircraft pilots when operating near small unmanned aircraft. Since small unmanned aircraft continue to appear in airspace where they do not belong, regulators and the industry need to accelerate the development and deployment of alternative methods for collision prevention between sUAS operations and manned aircraft. The analysis effort for this study included the development of a new simulation model, building on existing models related to human visual detection of distant objects. This study extended existing research and used currently accepted standards to create a new model specifically tailored to small unmanned aircraft detection. Since several input variables are not controllable, this study used a Monte Carlo simulation to provide a means for addressing the effects of uncertainty in the uncontrollable inputs that the previous models did not handle. The uncontrollable inputs include the airspeed and direction of flight of the unmanned aircraft, as well as the changing contrast between the unmanned aircraft target and its background as both the target aircraft and the observer encounter different background and lighting conditions. The reusable model created for this study will enable future research related to the visual detection of small unmanned aircraft. It provides a new tool for studying the difficult task of visually detecting airborne small unmanned aircraft targets in time to maneuver clear of a possible collision with them. The study also tested alternative input values to the simulation model to explore how changes to small unmanned aircraft features might improve their visual detectability by human pilots in manned aircraft. While these changes resulted in higher probabilities of detection, the overall detection probability remained very low, confirming the urgent need to build reliable collision avoidance capability into small UAS.
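
    As an illustration of the Monte Carlo idea described above, the sketch below (Python) samples the uncontrollable inputs named in the abstract (unmanned-aircraft speed, direction of flight, and target/background contrast) and estimates how often the target could be acquired in time. Every numeric constant is an assumption chosen for the example, not a value from the study, and the simple angular-size acuity model stands in for the study's more detailed visual-detection model.

        import numpy as np

        rng = np.random.default_rng(0)

        # Illustrative constants (assumptions, not the study's inputs)
        WINGSPAN_M = 1.2          # hypothetical sUAS characteristic size
        MANNED_SPEED_MS = 67.0    # observer aircraft at roughly 130 knots
        THRESH_ARCMIN = 6.0       # assumed angular size needed for reliable acquisition
        AVOID_TIME_S = 12.5       # assumed see-and-avoid reaction/maneuver budget

        def detection_probability(n_trials=100_000):
            # Uncontrollable inputs sampled per trial, mirroring the Monte Carlo approach
            uas_speed = rng.uniform(0.0, 20.0, n_trials)        # m/s
            rel_heading = rng.uniform(0.0, 2 * np.pi, n_trials)  # rad between velocity vectors
            contrast_factor = rng.uniform(0.3, 1.0, n_trials)    # degrades effective acuity

            # Magnitude of the relative velocity, taken as the closing speed on a collision course
            closing = np.sqrt(MANNED_SPEED_MS**2 + uas_speed**2
                              - 2 * MANNED_SPEED_MS * uas_speed * np.cos(rel_heading))

            # Range at which the target subtends the (contrast-degraded) threshold angle
            thresh_rad = np.radians(THRESH_ARCMIN / 60.0) / contrast_factor
            detect_range = WINGSPAN_M / thresh_rad

            # Time remaining to collision at the moment of first possible detection
            time_available = detect_range / np.maximum(closing, 1e-6)
            return np.mean(time_available >= AVOID_TIME_S)

        if __name__ == "__main__":
            print(f"Estimated detection-in-time probability: {detection_probability():.3f}")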

    DragonflEYE: a passive approach to aerial collision sensing

    "This dissertation describes the design, development and test of a passive wide-field optical aircraft collision sensing instrument titled 'DragonflEYE'. Such a ""sense-and-avoid"" instrument is desired for autonomous unmanned aerial systems operating in civilian airspace. The instrument was configured as a network of smart camera nodes and implemented using commercial, off-the-shelf components. An end-to-end imaging train model was developed and important figures of merit were derived. Transfer functions arising from intermediate mediums were discussed and their impact assessed. Multiple prototypes were developed. The expected performance of the instrument was iteratively evaluated on the prototypes, beginning with modeling activities followed by laboratory tests, ground tests and flight tests. A prototype was mounted on a Bell 205 helicopter for flight tests, with a Bell 206 helicopter acting as the target. Raw imagery was recorded alongside ancillary aircraft data, and stored for the offline assessment of performance. The ""range at first detection"" (R0), is presented as a robust measure of sensor performance, based on a suitably defined signal-to-noise ratio. The analysis treats target radiance fluctuations, ground clutter, atmospheric effects, platform motion and random noise elements. Under the measurement conditions, R0 exceeded flight crew acquisition ranges. Secondary figures of merit are also discussed, including time to impact, target size and growth, and the impact of resolution on detection range. The hardware was structured to facilitate a real-time hierarchical image-processing pipeline, with selected image processing techniques introduced. In particular, the height of an observed event above the horizon compensates for angular motion of the helicopter platform.

    SIGS: Synthetic Imagery Generating Software for the development and evaluation of vision-based sense-and-avoid systems

    Unmanned Aerial Systems (UASs) have recently become a versatile platform for many civilian applications including inspection, surveillance and mapping. Sense-and-Avoid systems are essential for the autonomous safe operation of these systems in non-segregated airspace. Vision-based Sense-and-Avoid systems are preferred to other alternatives because their price, physical dimensions and weight are more suitable for small and medium-sized UASs, but obtaining real flight imagery of potential collision scenarios is hard and dangerous, which complicates the development of vision-based detection and tracking algorithms. For this purpose, user-friendly software for synthetic imagery generation has been developed, allowing users to blend user-defined flight imagery of a simulated aircraft with real flight-scenario images to produce realistic images with ground-truth annotations. These are extremely useful for the development and benchmarking of vision-based detection and tracking algorithms at a much lower cost and risk. An image processing algorithm has also been developed for automatic detection of the occlusions caused by parts of the UAV that carries the camera. The detected occlusions can later be used by our software to simulate the occlusions due to the UAV that would appear in a real flight with the same camera setup. Additionally, this algorithm can be used to mask out pixels that do not contain relevant information for visual detection, making the image search more efficient. Finally, an application example of the imagery obtained with our software for benchmarking a state-of-the-art visual tracker is presented.
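
    A minimal sketch (Python/OpenCV) of the kind of compositing such a tool performs is shown below: an aircraft render with an alpha channel is blended into a real frame and a ground-truth bounding box is written alongside it. The file names and the JSON annotation format are hypothetical, and the actual software also simulates camera-mount occlusions, which this sketch omits.

        import json
        import numpy as np
        import cv2

        def composite_target(background_bgr, target_bgra, top_left):
            """Alpha-blend a rendered aircraft sprite onto a real flight image and
            return the composite plus a ground-truth bounding box (x, y, w, h)."""
            x, y = top_left
            h, w = target_bgra.shape[:2]
            roi = background_bgr[y:y + h, x:x + w].astype(np.float32)
            colour = target_bgra[:, :, :3].astype(np.float32)
            alpha = target_bgra[:, :, 3:4].astype(np.float32) / 255.0
            blended = alpha * colour + (1.0 - alpha) * roi
            out = background_bgr.copy()
            out[y:y + h, x:x + w] = blended.astype(np.uint8)
            return out, {"bbox": [x, y, w, h]}

        if __name__ == "__main__":
            bg = cv2.imread("real_flight_frame.png")                           # hypothetical input
            sprite = cv2.imread("aircraft_render.png", cv2.IMREAD_UNCHANGED)   # 4-channel render
            frame, annotation = composite_target(bg, sprite, top_left=(400, 120))
            cv2.imwrite("synthetic_frame.png", frame)
            with open("synthetic_frame.json", "w") as f:
                json.dump(annotation, f)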

    Unmanned Aerial Systems: Research, Development, Education & Training at Embry-Riddle Aeronautical University

    With technological breakthroughs in miniaturized aircraft-related components, including but not limited to communications, computer systems and sensors, state-of-the-art unmanned aerial systems (UAS) have become a reality. This fast-growing industry is anticipating and responding to a myriad of societal applications that will provide new and more cost-effective solutions than previous technologies could, or that will replace activities that put humans in flight at risk. Embry-Riddle Aeronautical University has a long history of aviation-related research and education, and is heavily engaged in UAS activities. This document provides a summary of these activities, and is divided into two parts. The first part provides a brief summary of each of the various activities, while the second part lists the faculty associated with those activities. Within the first part of this document we have separated UAS activities into two broad areas: Engineering and Applications. Each of these broad areas is then further broken down into six sub-areas, which are listed in the Table of Contents. The second part lists the faculty, sorted by campus (Daytona Beach-D, Prescott-P and Worldwide-W), associated with the UAS activities. The UAS activities and the corresponding faculty are cross-referenced. We have chosen to provide very short summaries of the UAS activities rather than lengthy descriptions. If more information is desired, please contact me directly, visit our research website (https://erau.edu/research), or contact the appropriate faculty member using the e-mail address provided at the end of this document.

    Small UAS Detect and Avoid Requirements Necessary for Limited Beyond Visual Line of Sight (BVLOS) Operations

    Potential small Unmanned Aircraft Systems (sUAS) beyond visual line of sight (BVLOS) operational scenarios/use cases and Detect And Avoid (DAA) approaches were collected through a number of industry-wide data calls. Every Section 333 exemption holder was solicited for the same information. Summary information from more than 5,000 exemption holders is documented; the responses varied in level of detail but provided relevant experiential information from which use cases were generalized. A plan was developed and testing completed to assess Radio Line Of Sight (RLOS), a potential key limiting factor for safe BVLOS operations. Details of the equipment used, the flight test area, the test payload, and the fixtures for testing at different altitudes are presented, along with the resulting comparison of a simplified mathematical model, an online modeling tool, and flight data. An Operational Framework is presented that defines the environment, conditions, constraints, and limitations under which the recommended requirements will enable sUAS operations BVLOS. The framework includes strategies that can build upon Federal Aviation Administration (FAA) and industry actions and that should result in an increase in BVLOS flights in the near term. Approaches to sUAS DAA were evaluated through five subtasks: a literature review of pilot and ground observer see-and-avoid performance, a survey of DAA criteria and recommended baseline performance, a survey of existing and developing DAA technologies and their performance, an assessment of the risks of selected DAA approaches, and flight testing. Pilot and ground observer see-and-avoid performance was evaluated through a literature review. Development of DAA criteria (the emphasis here being well clear) was accomplished through working with the Science And Research Panel (SARP) and through simulations of manned and unmanned aircraft interactions. Information regarding sUAS DAA approaches was collected through a literature review, requests for information, and direct interactions; these were analyzed through delineation of system type and definition of metrics and metric values. Risks associated with sUAS DAA systems were assessed by focusing on the Safety Risk Management (SRM) pillar of the Safety Management System (SMS) process. This effort (1) identified hazards related to the operation of sUAS BVLOS, (2) offered a preliminary risk assessment considering existing controls, and (3) recommended additional controls and mitigations to further reduce risk to the lowest practical level. Finally, flight tests were conducted to collect preliminary data regarding well clear and DAA system hazards.
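
    The abstract compares a simplified mathematical RLOS model against an online tool and flight data without stating the model. The sketch below (Python) shows the generic effective-earth-radius radio-horizon approximation often used for such first-order estimates; it is offered only as an illustration of a "simplified mathematical model", with illustrative antenna heights, and is not necessarily the model used in the report.

        import math

        def radio_horizon_km(h_tx_m, h_rx_m, k_factor=4.0 / 3.0):
            """Simplified radio line-of-sight range over smooth earth.

            Uses the standard effective-earth-radius model:
            d = sqrt(2*k*R*h1) + sqrt(2*k*R*h2), with heights in metres.
            A generic textbook approximation, not the report's specific model.
            """
            earth_radius_m = 6_371_000.0
            eff = 2.0 * k_factor * earth_radius_m
            d_m = math.sqrt(eff * h_tx_m) + math.sqrt(eff * h_rx_m)
            return d_m / 1000.0

        if __name__ == "__main__":
            # Ground antenna at 2 m, sUAS at 120 m AGL (illustrative values)
            print(f"Approximate RLOS range: {radio_horizon_km(2.0, 120.0):.1f} km")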

    An Innovative Procedure for Calibration of Strapdown Electro-Optical Sensors Onboard Unmanned Air Vehicles

    This paper presents an innovative method for estimating the attitude of airborne electro-optical cameras with respect to the onboard autonomous navigation unit. The procedure is based on the use of attitude measurements taken under static conditions by an inertial unit and carrier-phase differential Global Positioning System to obtain accurate camera position estimates in the aircraft body reference frame, while image analysis allows line-of-sight unit vectors in the camera-based reference frame to be computed. The method has been applied to the alignment of the visible and infrared cameras installed onboard the experimental aircraft of the Italian Aerospace Research Center and adopted for in-flight obstacle detection and collision avoidance. Results show an angular uncertainty on the order of 0.1° (rms).
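
    The abstract describes estimating the camera attitude from matched line-of-sight unit vectors expressed in the camera frame and in the body frame. A standard way to solve that vector-alignment problem is Wahba's problem via the SVD (Kabsch) method, sketched below in Python with synthetic data; this is a generic illustration of the kind of estimate such a calibration produces, not necessarily the exact procedure of the paper.

        import numpy as np
        from scipy.spatial.transform import Rotation

        def estimate_camera_to_body_rotation(los_camera, los_body):
            """Solve Wahba's problem with the SVD (Kabsch) method.

            los_camera, los_body: (N, 3) arrays of matched unit line-of-sight vectors
            in the camera frame and in the body/navigation frame, respectively.
            Returns the 3x3 rotation matrix from camera frame to body frame.
            """
            B = los_body.T @ los_camera                 # attitude profile matrix
            U, _, Vt = np.linalg.svd(B)
            d = np.sign(np.linalg.det(U @ Vt))          # enforce a proper rotation
            return U @ np.diag([1.0, 1.0, d]) @ Vt

        if __name__ == "__main__":
            rng = np.random.default_rng(1)
            # Synthetic example: random unit vectors seen through a hypothetical misalignment
            R_true = Rotation.from_euler("xyz", [0.5, -0.3, 1.0], degrees=True).as_matrix()
            v_cam = rng.normal(size=(20, 3))
            v_cam /= np.linalg.norm(v_cam, axis=1, keepdims=True)
            v_body = v_cam @ R_true.T + rng.normal(scale=1e-3, size=(20, 3))  # noisy measurements
            R_est = estimate_camera_to_body_rotation(v_cam, v_body)
            err = Rotation.from_matrix(R_est @ R_true.T).magnitude()
            print(f"Residual alignment error: {np.degrees(err):.3f} deg")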

    Autonomous Collision Avoidance for Unmanned Aerial Systems

    Unmanned Aerial System (UAS) applications are growing day by day, and in the near future this will lead Unmanned Aerial Vehicles (UAVs) to share the same airspace as manned aircraft. This implies the need to define precise safety standards for UAS that are compatible with the operational standards of manned aviation. Among these standards is the need for a Sense And Avoid (S&A) system to support and, when necessary, substitute for the pilot in the detection and avoidance of hazardous situations (e.g. mid-air collision, controlled flight into terrain, flight path obstacles, and clouds). This thesis presents the work carried out in the development of an S&A system that takes into account collision-risk scenarios with multiple moving and fixed threats. Conflict prediction is based on a straight-line projection of the threat states into the future. The approximations introduced by this approach allow a high update frequency (1 Hz) for the estimated conflict geometry, which lets the algorithm capture trajectory changes of the threat or ownship. The resolution manoeuvre is evaluated with an optimisation approach that considers step commands applied to the heading and altitude autopilots. The optimisation problem takes into account the UAV performance and aims to keep a predefined minimum separation distance between the UAV and the threats during the resolution manoeuvre. The Human-Machine Interface (HMI) of this algorithm is then embedded in a partial Ground Control Station (GCS) mock-up, with some original concepts for indicating flight condition parameters and resolution manoeuvre constraints. Simulations of the S&A algorithm in different critical scenarios are included to show its capabilities. Finally, the methodology and results of tests and interviews with pilots regarding the proposed partial GCS layout are covered.
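
    As a minimal sketch of the straight-projection conflict check described above, the Python snippet below computes the closest point of approach (CPA) between ownship and a threat under constant-velocity extrapolation and flags a conflict when the predicted miss distance falls below a required separation. The separation distance, lookahead horizon, and example states are assumptions for illustration, not the thesis's parameters.

        import numpy as np

        MIN_SEPARATION_M = 500.0     # hypothetical required separation
        LOOKAHEAD_S = 60.0           # hypothetical prediction horizon

        def predict_conflict(p_own, v_own, p_threat, v_threat):
            """Straight-line conflict prediction via closest point of approach (CPA).

            States are 3D positions (m) and velocities (m/s); both trajectories are
            projected linearly, matching the straight-projection assumption above.
            Returns (conflict_flag, time_of_cpa_s, distance_at_cpa_m).
            """
            dp = np.asarray(p_threat, float) - np.asarray(p_own, float)
            dv = np.asarray(v_threat, float) - np.asarray(v_own, float)
            dv2 = float(dv @ dv)
            t_cpa = 0.0 if dv2 < 1e-9 else float(np.clip(-(dp @ dv) / dv2, 0.0, LOOKAHEAD_S))
            d_cpa = float(np.linalg.norm(dp + dv * t_cpa))
            return d_cpa < MIN_SEPARATION_M, t_cpa, d_cpa

        if __name__ == "__main__":
            # Re-evaluated at 1 Hz with the latest estimated states (illustrative values)
            conflict, t, d = predict_conflict(
                p_own=[0, 0, 1000], v_own=[50, 0, 0],
                p_threat=[4000, 200, 1000], v_threat=[-45, 0, 0])
            print(f"conflict={conflict}, t_cpa={t:.1f} s, d_cpa={d:.0f} m")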