
    The Effects of Remotely Piloted Aircraft Command and Control Latency during Within-Visual-Range Air-To-Air Combat

    The type of military missions conducted by remotely piloted aircraft continues to expand into all facets of operations, including air-to-air combat. While future within-visual-range air-to-air combat will likely be flown by artificial intelligence, remotely piloted aircraft will probably see such combat first. The purpose of this study was to quantify the effect of latency on one-versus-one, within-visual-range air-to-air combat success during both high-speed and low-speed engagements. The research employed a repeated-measures experimental design to test the various hypotheses associated with command and control latency. Participants experienced in air-to-air combat were subjected to various latency inputs during one-versus-one simulated combat using a virtual-reality simulator and were scored on the combat success of each engagement. This research was pursued in coordination with the Air Force Research Laboratory and the United States Air Force Warfare Center. The dependent variable, combat score, was derived through post-simulation analysis and scored for each engagement. The independent variables were the input control latency (time) and the starting velocity of the engagement (high-speed or low-speed). The input latency comprised six delays (0.0, 0.25, 0.50, 0.75, 1.0, and 1.25 seconds) between pilot input and simulator response. Each latency was repeated for a high-speed and a low-speed engagement. A two-way repeated-measures analysis of variance was used to determine whether there was a statistically significant difference in means between the treatments on combat success and whether there was an interaction between latency and fight speed. The results indicated a statistically significant difference in combat success across the latency levels and engagement velocities. There was a significant interaction effect between latency and engagement speed, indicating that the outcome depended on both variables.
As latency increased, combat success decreased significantly, from .539 with no latency to .133 at 1.25 seconds of latency during high-speed combat. During low-speed combat, combat success decreased from .659 with no latency to .189 at 1.25 seconds of latency. The largest incremental decrease occurred between 1.00 and 1.25 seconds of latency at high speed and between 0.75 and 1.00 seconds at low speed. The overall decrease in combat success during high-speed engagements was less than during low-speed engagements. The results of this study quantified the decrease in combat success during within-visual-range air-to-air combat and concluded that, when latency is encountered, a high-speed (two-circle) engagement is desired to minimize adverse latency effects. The research informs aircraft and communication designers of the decrease in expected combat success caused by latency. This simulation configuration can be utilized for future research leading to methods and tactics to decrease the effects of latency.
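
The latency-by-speed comparison above can be illustrated with a small aggregation sketch: per-engagement scores are grouped into latency-by-speed cells, and the latency penalty is compared across speeds as a difference-of-differences interaction check. This is a minimal Python sketch only; the trial values below are illustrative, not the study's data, and the full analysis would use a two-way repeated-measures ANOVA:

```python
from statistics import mean
from collections import defaultdict

# Hypothetical per-engagement combat scores, keyed by (latency_s, speed).
# Values are illustrative, loosely patterned on the reported cell means.
trials = [
    (0.00, "high", 0.55), (0.00, "high", 0.53),
    (1.25, "high", 0.14), (1.25, "high", 0.13),
    (0.00, "low", 0.66), (0.00, "low", 0.66),
    (1.25, "low", 0.19), (1.25, "low", 0.19),
]

# Group scores into latency-by-speed cells and take cell means.
cells = defaultdict(list)
for latency, speed, score in trials:
    cells[(latency, speed)].append(score)
cell_means = {cell: mean(scores) for cell, scores in cells.items()}

# Latency penalty within each speed: drop in mean score from 0.0 s to 1.25 s.
drop = {
    speed: cell_means[(0.00, speed)] - cell_means[(1.25, speed)]
    for speed in ("high", "low")
}

# Interaction check: does the latency penalty differ by engagement speed?
interaction = drop["low"] - drop["high"]
```

With these illustrative numbers the low-speed penalty exceeds the high-speed penalty, mirroring the study's finding that a high-speed engagement limits the adverse effect of latency.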

    Safety Management System Implementation Planning


    Small Unmanned Aircraft Systems Flight Training Programs Through the Lens of a Traditional Flight Training University

    According to 14 CFR Part 107, a remote pilot certificate (RPC) is issued strictly on the basis of the Federal Aviation Administration's (FAA) knowledge test and a background security check. This leaves commercial sUAS operators on their own to determine the training and proficiency required to fly safely and effectively. Regardless of the learning method, the question becomes: what knowledge and maneuvers should be learned, and to what proficiency level? This paper explores specific knowledge and skills that should be mandatory for all commercial sUAS pilots and recommends a probationary period to gain experience before receiving an unrestricted RPC.

    Remote Pilot Situational Awareness with Augmented Reality Glasses: An Observational Field Study

    With the use of small unmanned aerial systems (sUAS) proliferating throughout industry and public safety, it is imperative to ensure safety of flight. The Federal Aviation Administration (FAA) published regulations for commercial use of sUAS in 2016, including the requirement to maintain visual line-of-sight with the aircraft at all times. However, due to the nature of sUAS ground control stations (GCS), remote pilots time-share between observing the aircraft and interacting with the display on the GCS. Time-sharing between the aircraft and GCS is similar to the cross-check a pilot uses when flying a manned aircraft. While manned aircraft designers have invested in ergonomics and an understanding of the cognitive process to optimize situational awareness, this has not been a design requirement for sUAS. As a result, the unmanned operator must change head orientation, refocus the eyes, and remove the aircraft from peripheral vision during the cross-check between the GCS and the aircraft. This, coupled with the limited field of view of the camera displayed on the sUAS GCS, leads to loss of situational awareness through task saturation and misprioritization. Mixed reality, virtual reality, and augmented reality visual devices are being adopted in the gaming and technical worlds. Applying these devices to the sUAS GCS could mitigate some of this degradation of situational awareness. Specifically, augmented reality devices, in which a synthetic display is overlaid on the real world, allow the remote pilot to observe the aircraft, manipulate the camera, and interact with the GCS without changing head position. This participant observational study evaluated the difference between the remote pilot's cross-check while flying with a typical GCS display and while flying with an augmented reality headset in a field setting.
The results indicate a significant difference in the pilot's cross-check when using augmented reality glasses: pilots maintained the aircraft in their field of view 56.7% of the time with the glasses, compared to 20.5% of the time without them.

    Standardization of Human-Computer-Interface for Geo-Fencing in Small Unmanned Aircraft Systems

    The use of small unmanned aircraft systems (sUAS) has increased significantly in the past year. Geographic fencing (geo-fencing) is software built into most medium-cost consumer sUAS. This software is typically used to limit the altitude above the launch point, limit the flight distance from the transmitting controller, and/or restrict flight inside a no-fly zone. While the concept of a geo-fence is simple, the human-computer interface (HCI) varies drastically among platforms, and even between software iterations on the same platform. This research examines the HCI of three popular consumer-level sUAS with regard to geo-fencing. The software procedures and human interfaces of the DJI Inspire-1, 3D Robotics IRIS+, and Yuneec Typhoon Q500+ were evaluated through review of relevant literature, software, and flight testing. This assessment yielded several recommendations for sUAS geo-fencing software.
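
The three geo-fence limits described above (altitude ceiling, range from the controller, and no-fly zones) reduce to a single position check. A minimal sketch in Python; the limit values, local coordinate scheme, and function names are illustrative assumptions, not any vendor's implementation:

```python
import math

# Illustrative geo-fence limits in a local tangent-plane frame, with the
# launch point (home) at the origin. Not taken from any vendor's software.
MAX_ALT_M = 120.0        # altitude ceiling above the launch point
MAX_RANGE_M = 500.0      # maximum distance from the transmitting controller
NO_FLY_ZONES = [         # (center_x, center_y, radius_m)
    (300.0, 100.0, 150.0),
]

def geofence_ok(x, y, alt, home=(0.0, 0.0)):
    """Return True only if a position satisfies all three geo-fence limits."""
    if alt > MAX_ALT_M:                                    # altitude limit
        return False
    if math.hypot(x - home[0], y - home[1]) > MAX_RANGE_M:  # range limit
        return False
    for cx, cy, r in NO_FLY_ZONES:                         # no-fly zones
        if math.hypot(x - cx, y - cy) <= r:
            return False
    return True
```

For example, a point 600 m from home fails the range check even at a legal altitude, and a point inside the circular no-fly zone fails regardless of range.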

    Integrating Virtual Reality into the Asynchronous Learning Environment

    Four presentations covering a spectrum of subject matter, from real to virtual and virtual to real, highlighting what the goal of each was and how it was integrated into an effective learning experience.

    Influencing Factors for Use of Unmanned Aerial Systems in Support of Aviation Accident and Emergency Response

    The purpose of this research paper was to examine the influencing factors associated with the use of unmanned aerial system (UAS) technology to support aviation accident and emergency response. The ability of first responders to react to an emergency depends on the quality, accuracy, timeliness, and usability of information. In aviation accidents such as the Asiana Airlines Flight 214 crash at San Francisco International Airport, the ability to sense and communicate the location of victims may reduce the potential for accidental passenger death. Furthermore, the ability to obtain information en route to an accident may also assist in reducing the overall response and coordination time of first responders (e.g., Aviation Rescue and Firefighting [ARFF]). By identifying and examining current and potential practices, capabilities, and technology (e.g., human-machine interface [HMI], human factors, tools, and capability modifiers), a more comprehensive model of the influencing factors is established to further support the growing body of knowledge (i.e., safety, human-computer interaction, human-robot systems, socio-economic systems, service and public sector systems, and technological forecasting). A series of recommendations regarding the technology and its application is provided to support future development or adaptation of regulations, policies, or future research. --from the article

    Small Unmanned Aircraft Systems Acoustic Analysis for Noninvasive Marine Mammal Response: An Exploratory Field Study

    As in countless other fields of human endeavor, small unmanned aircraft systems (sUAS) have the potential to benefit pinniped (Pinnipedia; e.g., Phocidae [seals], Otariidae [sea lions], and Odobenidae [walruses]) response efforts. The employment of sUAS could give responders a close-up look at animals in distress in order to determine their condition and develop a response strategy. However, unlike other subjects that are regularly inspected by sUAS (e.g., croplands and civil infrastructure), pinnipeds may respond to the distinctive sound generated by small, multirotor sUAS. This reaction may include retreating into the water en masse, which could put the target individual out of reach of the response team. To help prevent this outcome, this exploratory field study established sUAS acoustic profiles through quantitative and qualitative measures for multiple aircraft across a range of distances and altitudes. These data were collected in both a secluded rural environment and a coastal environment. The results indicate that sUAS sound pressure levels at least 20 dBA (re 20 µPa) below the ambient noise floor are required to completely mask the distinctive sound of the aircraft to human hearing. The results were used to create aircraft operational envelopes to potentially mitigate disturbance while optimizing visual information. To reflect the type of sUAS that would likely be available to small, non-profit marine mammal response groups working in remote locations, the aircraft studied were limited to compact models costing $3,000 or less.
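
The 20 dBA masking margin above can be turned into a rough stand-off distance. A minimal sketch, assuming simple free-field spherical spreading (about 6 dB of attenuation per doubling of distance); the reference levels, distances, and function names below are illustrative assumptions, not measurements from the study:

```python
import math

def spl_at_distance(spl_ref_dba, d_ref_m, d_m):
    """Spherical-spreading estimate of sound pressure level at distance d_m,
    given a reference level spl_ref_dba measured at d_ref_m."""
    return spl_ref_dba - 20.0 * math.log10(d_m / d_ref_m)

def min_masking_distance(spl_ref_dba, d_ref_m, ambient_dba, margin_db=20.0):
    """Smallest distance at which the aircraft is at least margin_db below
    the ambient noise floor: solve spl_ref - 20*log10(d/d_ref) = ambient - margin."""
    target = ambient_dba - margin_db
    return d_ref_m * 10.0 ** ((spl_ref_dba - target) / 20.0)

# Illustrative example: an aircraft measured at 70 dBA at 1 m, over a coastal
# site with a 65 dBA ambient noise floor.
standoff = min_masking_distance(70.0, 1.0, 65.0)
```

Under these assumed numbers the required stand-off is roughly 18 m; quieter ambient environments push the envelope farther out, which is why the study measured both rural and coastal noise floors.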

    Accuracy Assessment of the eBee Using RTK and PPK Corrections Methods as a Function of Distance to a GNSS Base Station

    The use of unmanned aircraft systems to collect data for photogrammetry models has grown significantly in recent years. The accuracy of a photogrammetric model can depend on image georeferencing, and the distance from a reference base station can affect the accuracy of the results. Positioning corrections rely on precise timing measurements of satellite signals; because the signals travel through the Earth's atmosphere, ionospheric and tropospheric delays introduce errors. The aim of this research was to examine the eBee X and its global navigation satellite system (GNSS) accuracy by comparing the real-time kinematic (RTK) and post-processed kinematic (PPK) methods at different base station distances in photogrammetry models. Three factors were compared: 1) RTK and PPK methods, 2) corrections sources (a local GNSS receiver via caster versus an NTRIP service), and 3) base station distances between 2.4 km and 42.0 km. The eBee X flights occurred in 2023 at three flying sites in southwest Arizona in the United States. The RMSE_XYZ values from eight check points at each of the three flying sites were measured against traditional GNSS survey methods. Through ANOVA testing, there were no statistical differences in RMSE_XYZ accuracy between the RTK and PPK methods, nor between using a local Reach RS2 GNSS receiver via caster and an NTRIP service for the eBee X; however, there was a statistical difference in RMSE_XYZ accuracy between base station distances of 2.4 km and 42.0 km, F(5, 33) = 11.99, p < .001. Specifically, errors at base station distances of less than 16.2 km were significantly lower than at larger distances up to 42.0 km. These data suggest a significant difference in total accuracy based on the distance from the GNSS base station providing corrections for the eBee X.
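
The RMSE_XYZ metric above combines horizontal and vertical check-point error into a single figure. A minimal sketch, assuming RMSE_XYZ is the root-mean-square of the 3D distances between model-derived and surveyed check-point coordinates (the conventional combination of the per-axis RMSEs); the coordinates below are illustrative, not the study's data:

```python
import math

def rmse_xyz(measured, surveyed):
    """Total 3D RMSE over check points: sqrt of the mean squared XYZ distance
    between model-derived coordinates and surveyed (ground-truth) coordinates."""
    squared_errors = [
        (mx - sx) ** 2 + (my - sy) ** 2 + (mz - sz) ** 2
        for (mx, my, mz), (sx, sy, sz) in zip(measured, surveyed)
    ]
    return math.sqrt(sum(squared_errors) / len(squared_errors))

# Illustrative check points in local metric coordinates (x, y, z).
model   = [(100.02, 200.01, 50.03), (150.00, 250.02, 51.00)]
survey  = [(100.00, 200.00, 50.00), (150.01, 250.00, 51.01)]
total_error = rmse_xyz(model, survey)
```

Because each per-point term sums the squared x, y, and z residuals, this is equivalent to combining RMSE_X, RMSE_Y, and RMSE_Z in quadrature.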

    Using Unmanned Aircraft Systems to Investigate the Detectability of Burmese Pythons in South Florida

    Burmese pythons are an invasive, non-native species of snake in southern Florida, and attempts at eradicating the snakes have yielded mixed results. The current rate of detection has been reported as 0.05%. The purpose of this research project was to determine whether a UAS equipped with a near-infrared (NIR) camera could be used to detect pythons at a higher rate than an RGB camera. The approach involved collecting 55 images from RGB and NIR cameras over python carcasses at flying heights of 3, 6, 9, 12, and 15 meters. A likelihood ratio, consisting of the true positive rate over the false positive rate, was calculated from 101 participant survey responses. Participants were able to detect pythons in NIR camera images with greater likelihood (M = 6.05, SD = 1.94) than in RGB camera images (M = 4.74, SD = 1.32), t(10) = 1.77, p = .048. The data suggest that survey participants could correctly detect pythons in images containing them at a 1.3x greater rate with the NIR sensor than with the RGB sensor. The true positive rate (TPR) showed the rate of correctly detecting a python when one was present in the image; the NIR camera images had higher TPRs than the RGB images. The largest difference between camera types was observed at the 15-meter flying height over an outstretched python: there was a 35% increase in participant detection accuracy using the NIR camera compared to the RGB camera. These results suggest that a UAS equipped with an NIR camera flying between 3 and 15 meters in a nadir orientation (90 degrees) can be used to detect pythons. Using a UAS equipped with an NIR camera over levees to search for exposed pythons may help biologists responsible for managing this invasive species determine whether a python is present.
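
The likelihood ratio above is simply the true positive rate divided by the false positive rate, both computed from participants' image judgments. A minimal sketch of that computation; the response counts and function names are illustrative assumptions, not the study's survey data:

```python
def rates(tp, fn, fp, tn):
    """True and false positive rates from counted survey responses.
    tp/fn: responses to images that did contain a python;
    fp/tn: responses to images that did not."""
    tpr = tp / (tp + fn)   # correctly reported a python when one was present
    fpr = fp / (fp + tn)   # reported a python when none was present
    return tpr, fpr

def likelihood_ratio(tpr, fpr):
    """Positive likelihood ratio: how strongly a 'python' response
    shifts the odds that a python is actually in the image."""
    return tpr / fpr

# Illustrative counts for the two sensor types (not the study's data).
tpr_nir, fpr_nir = rates(tp=60, fn=20, fp=10, tn=90)
tpr_rgb, fpr_rgb = rates(tp=45, fn=35, fp=10, tn=90)
lr_nir = likelihood_ratio(tpr_nir, fpr_nir)
lr_rgb = likelihood_ratio(tpr_rgb, fpr_rgb)
```

With equal false-alarm behavior, the sensor with the higher TPR yields the higher likelihood ratio, which is the comparison the study's NIR-versus-RGB result rests on.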