
    Airborne Radar for sUAS Sense and Avoid

    A primary challenge for the safe integration of small UAS operations into the National Airspace System (NAS) is traffic deconfliction, from both manned and unmanned aircraft. The UAS Traffic Management (UTM) project being conducted at the National Aeronautics and Space Administration (NASA) considers a layered approach to separation provision, ranging from segregation of operations through airspace volumes (geofences) to autonomous sense and avoid (SAA) technologies for higher risk, densely occupied airspace. Cooperative SAA systems, such as Automatic Dependent Surveillance-Broadcast (ADS-B) and vehicle-to-vehicle communication systems, provide significant additional risk mitigation, but they fail to adequately mitigate collision risks from non-cooperative (non-transponder-equipped) airborne aircraft. The RAAVIN (Radar on Autonomous Aircraft to Verify ICAROUS Navigation) flight test being conducted by NASA and the Mid-Atlantic Aviation Partnership (MAAP) was designed to investigate the applicability and performance of a prototype, commercially available sUAS radar for detecting and tracking non-cooperative airborne traffic, both manned and unmanned. The radar selected for this research was a Frequency Modulated Continuous Wave (FMCW) radar with a 120 degree azimuth and 80 degree elevation field of view, operating at a 24.55 GHz center frequency with a 200 MHz bandwidth. The radar transmits 2 watts of power through a Metamaterial Electronically Scanning Array antenna in horizontal polarization. When the radar is transmitting, personnel must be at least 1 meter away from the active array to limit nonionizing radiation exposure. The radar measures 18.7 cm by 12.1 cm by 4.1 cm and weighs less than 820 grams, making it well suited for installation on small UASs.
The onboard SAA capability, known as ICAROUS (Independent Configurable Architecture for Reliable Operations of Unmanned Systems), developed by NASA to support sUAS operations, will provide autonomous guidance using the traffic tracks from the onboard radar. The RAAVIN set of studies will be conducted in three phases. The first phase included outdoor, ground-based radar evaluations performed at Virginia Tech's Kentland Farm testing range in Blacksburg, VA. The test was designed to measure how well the radar could detect and track a small UAS flying in the radar's field of view. The radar was used to monitor 5 test flights consisting of outbound, inbound, and crossing routes at different ranges and altitudes. The UAS flown during the ground test was the Inspire 2, a quadcopter weighing less than 4250 grams (10 pounds) at maximum payload. The radar was set up to scan and track targets over its full azimuthal field of view from 0 to 40 degrees in elevation. The radar was configured to eliminate tracks generated from any targets located beyond 2000 meters from the radar or moving at velocities under 1.45 meters per second. For subsequent phases of the study, the radar will be integrated with a sUAS platform to evaluate its in-flight performance for SAA applications, from detecting and tracking sUAS to manned general aviation (GA) aircraft. Preliminary data analysis from the first outdoor ground tests showed the radar performed well at tracking the vehicle as it flew outbound, repeatedly maintaining a track out to 1000 meters (maximum 1387 meters) until the vehicle slowed to a stop to reverse direction and fly inbound. As the Inspire flew inbound from beyond 800 meters, a reacquisition time delay was consistently observed between when the Inspire exceeded a speed of 1.45 meters per second and when the radar indicated an inbound target was present and maintained its track.
The time delay varied from 6 seconds to over 37 seconds for the inbound flights examined, and typically resulted in about a 200 meter closure distance before the Inspire track was maintained. The radar performed well at both acquiring and tracking the vehicle as it flew crossing routes out past 400 meters across the azimuthal field of view. The radar and ICAROUS software will be integrated and flown on a BFD-1400-SE8-E UAS during the next phase of the RAAVIN project. The main goal at the conclusion of this effort is to determine whether this radar technology can reliably support minimum requirements for SAA applications on sUAS. In particular, the study will measure the range of vehicle detections, lateral and vertical angular errors, false and missed/late detections, and the estimated distance at the closest point of approach after an avoidance maneuver is executed. This last metric is directly impacted by sensor performance and indicates its suitability for the task.
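The track-gating thresholds and the delay-to-closure relationship described above can be sketched as a minimal filter. This is an illustrative sketch only; the function names and structure are hypothetical and not taken from the RAAVIN radar software, though the 2000 meter and 1.45 m/s thresholds and the roughly 200 meter closure figure come from the abstract.

```python
# Illustrative sketch of the track-gating rules described above:
# a track is kept only if the target is within 2000 m of the radar
# and moving at 1.45 m/s or faster. Names are hypothetical.
MAX_RANGE_M = 2000.0
MIN_SPEED_MPS = 1.45

def keep_track(range_m: float, speed_mps: float) -> bool:
    """Return True if a radar track passes the configured gates."""
    return range_m <= MAX_RANGE_M and speed_mps >= MIN_SPEED_MPS

# A reacquisition delay translates into closure distance at a given
# closing speed: e.g., a ~200 m closure corresponds to roughly 20 s
# of delay at a 10 m/s closing speed.
def closure_distance_m(delay_s: float, closing_speed_mps: float) -> float:
    return delay_s * closing_speed_mps
```

Under this gating, a hovering or slowly accelerating inbound vehicle falls below the velocity threshold, which is consistent with the reacquisition delays reported above.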

    Recommendations for Sense and Avoid Policy Compliance

    Since unmanned aircraft do not have a human on board, they need to have a sense and avoid capability that provides an "equivalent level of safety" (ELOS) to manned aircraft. The question then becomes: is sense and avoid ELOS for unmanned aircraft adequate to satisfy the requirements of 14 CFR 91.113? Access 5 has proposed a definition of sense and avoid, but the question remains as to whether any sense and avoid system can comply with 14 CFR 91.113 as currently written. The Access 5 definition of sense and avoid ELOS allows for the development of a sense and avoid system for unmanned aircraft that would comply with 14 CFR 91.113. Compliance is based on sensing and avoiding other traffic at an equivalent level of safety for collision avoidance as manned aircraft. No changes to Part 91 are necessary, with the possible exception of changing "see" to "sense," or obtaining an interpretation from the FAA General Counsel that "sense" is equivalent to "see."

    Below Horizon Aircraft Detection Using Deep Learning for Vision-Based Sense and Avoid

    Commercial operation of unmanned aerial vehicles (UAVs) would benefit from an onboard ability to sense and avoid (SAA) potential mid-air collision threats. In this paper we present a new approach for detection of aircraft below the horizon. We address some of the challenges faced by existing vision-based SAA methods, such as detecting stationary aircraft (which have no relative motion to the background), rejecting moving ground vehicles, and simultaneously detecting multiple aircraft. We propose a multi-stage, vision-based aircraft detection system which utilises deep learning to produce candidate aircraft that we track over time. We evaluate the performance of our proposed system on real flight data, where we demonstrate detection ranges comparable to the state of the art with the additional capability of detecting stationary aircraft, rejecting moving ground vehicles, and tracking multiple aircraft.
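The detect-then-track structure described in this abstract, where a per-frame detector proposes candidates and a temporal stage confirms those that persist, can be sketched in miniature. Everything here is hypothetical: the tracker, the confirmation threshold, and the matching tolerance are illustrative stand-ins, not the paper's method.

```python
# Illustrative detect-then-track sketch: per-frame candidate detections
# (e.g., from a deep-learning detector) are associated to tracks by
# nearest position, and a track is confirmed once a candidate has
# persisted for CONFIRM_FRAMES frames. All names/thresholds hypothetical.
CONFIRM_FRAMES = 3   # frames a candidate must persist to be confirmed
MATCH_TOL_PX = 10.0  # pixel tolerance for frame-to-frame association

def update_tracks(history: dict, frame_candidates: list) -> list:
    """history maps track id -> (x, y, hits); returns confirmed ids."""
    confirmed = []
    for (x, y) in frame_candidates:
        matched = None
        for tid, (tx, ty, _hits) in history.items():
            if abs(x - tx) <= MATCH_TOL_PX and abs(y - ty) <= MATCH_TOL_PX:
                matched = tid
                break
        if matched is None:               # start a new tentative track
            matched = len(history)
            history[matched] = (x, y, 0)
        _tx, _ty, hits = history[matched]
        history[matched] = (x, y, hits + 1)
        if hits + 1 >= CONFIRM_FRAMES:
            confirmed.append(matched)
    return confirmed
```

Persistence-based confirmation of this kind is one common way such systems reject transient false positives, such as moving ground vehicles that leave the detector's field of interest.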

    Multirotor UAS Sense and Avoid with Sensor Fusion

    In this thesis, the key concepts of independent autonomous Unmanned Aircraft Systems (UAS) are explored, including obstacle detection, dynamic obstacle state estimation, and avoidance strategy. This area is explored in pursuit of determining the viability of UAS Sense and Avoid (SAA) in static and dynamic operational environments. This exploration is driven by dynamic simulation and post-processing of real-world data. A sensor suite comprised of a 3D Light Detection and Ranging (LIDAR) sensor, a visual camera, and a 9 Degree of Freedom (DOF) Inertial Measurement Unit (IMU) was found to be beneficial to autonomous UAS SAA in urban environments. Promising results are attributed to the broadening of available information about a dynamic or fixed obstacle via pixel-level LIDAR point cloud fusion and the combination of inertial measurements and LIDAR point clouds for localization purposes. However, a significant amount of development is still required to optimize a data fusion method and SAA guidance method.

    Superpixel-based statistical anomaly detection for sense and avoid


    A Learning-Based Guidance Selection Mechanism for a Formally Verified Sense and Avoid Algorithm

    This paper describes a learning-based strategy for selecting conflict avoidance maneuvers for autonomous unmanned aircraft systems. The selected maneuvers are provided by a formally verified algorithm and are guaranteed to solve any impending conflict under general assumptions about aircraft dynamics. The decision-making logic that selects the appropriate maneuvers is encoded in a stochastic policy encapsulated as a neural network. The network's parameters are optimized to maximize a reward function. The reward function penalizes loss of separation with other aircraft while rewarding resolutions that result in minimum excursions from the nominal flight plan. This paper provides a description of the technique and presents preliminary simulation results.
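The shape of the reward described above, a large penalty for loss of separation combined with a cost proportional to excursion from the nominal plan, can be sketched as follows. The threshold and weights here are invented for illustration and are not the paper's values.

```python
# Hypothetical sketch of the reward structure described above:
# penalize loss of separation, reward small excursions from the
# nominal flight plan. Thresholds and weights are illustrative only.
SEPARATION_MIN_M = 500.0   # assumed loss-of-separation threshold
LOSS_PENALTY = -100.0      # large penalty when separation is violated
DEVIATION_WEIGHT = 0.01    # per-meter cost of deviating from the plan

def reward(separation_m: float, deviation_m: float) -> float:
    """Reward for one resolution: deviation cost plus separation penalty."""
    r = -DEVIATION_WEIGHT * deviation_m
    if separation_m < SEPARATION_MIN_M:
        r += LOSS_PENALTY
    return r
```

With a reward of this shape, a policy optimizer is pushed first to avoid separation violations (the dominant term) and then, among safe resolutions, to prefer the one with the smallest excursion.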

    Sense and Avoid for Small Unmanned Aircraft Systems

    The ability for small Unmanned Aircraft Systems (sUAS) to safely operate beyond line of sight is of great interest to consumers, businesses, and scientific research. In this work, we investigate Sense and Avoid (SAA) algorithms for sUAS encounters using three 4k cameras at separation distances between 200 m and 2000 m. Video was recorded of different sUAS platforms designed to appear similar to expected air traffic, under varying weather conditions and flight encounter scenarios. Both university partners and NASA developed the SAA methods presented in this report.

    Intelligent UAS Sense-and-Avoid Utilizing Global Constraints

    Sense-and-avoid (SAA) is a critical research topic for enabling the operation of Unmanned Aircraft Systems (UAS) in civilian airspace. SAA involves two planning-related problems: 1) plan recognition to predict the future trajectory of nearby aircraft, and 2) path planning to avoid conflicts with nearby aircraft that pose a threat. We have designed and built components of a novel intelligent sense-and-avoid (iSAA) reasoning framework that takes into account information about aircraft type, transponder code, communications, local routes, airports, airspace, terrain, and weather to more accurately predict near- and medium-term trajectories of nearby aircraft. By using this additional information, both the onboard control software and the ground-based UAS operator can make more informed, intelligent decisions to effectively predict and avoid conflicts and maintain separation. While this capability benefits all categories of UASs operating under both Instrument Flight Rules (IFR) and Visual Flight Rules (VFR), it is absolutely essential for allowing smaller UASs to operate VFR at low altitude in uncontrolled airspace for operations such as survey work, wildlife tracking, aerial photography, utilities inspection, crop dusting, and package delivery.