Mixed-reality for unmanned aerial vehicle operations in near Earth environments

Abstract

Future applications will bring unmanned aerial vehicles (UAVs) into near Earth environments such as urban areas, changing the way UAVs are currently operated. Of concern is that UAV accidents still occur at a much higher rate than the accident rate for commercial airliners. A number of these accidents can be attributed to the UAV pilot's low situation awareness (SA), a consequence of the limitations of UAV operating interfaces. The main limitation is the physical separation between the vehicle and the pilot, which eliminates all motion and exteroceptive sensory feedback to the pilot. These limitations, on top of the small field of view from the onboard camera, result in low SA, making near Earth operations difficult and dangerous. Autonomy has been proposed as a solution for near Earth tasks, but state-of-the-art artificial intelligence still requires very structured and well defined goals to allow safe autonomous operations. There is therefore a need to better train pilots to operate UAVs in near Earth environments and to augment their performance for increased safety and a minimization of accidents.

In this work, simulation software, motion platform technology, and UAV sensor suites were integrated to produce mixed-reality systems that address current limitations of UAV piloting interfaces. The definition of mixed reality is extended in this work to encompass not only visual aspects but also a motion aspect. A training and evaluation system for UAV operations in near Earth environments was developed. Flight simulator software was modified to recreate current UAV operating modalities (internal and external). The training and evaluation system was combined with Drexel's Sensor Integrated Systems Test Rig (SISTR) to allow simulated missions while incorporating real-world environmental effects and UAV sensor hardware.

To address the lack of motion feedback to the UAV pilot, a system was developed that integrates a motion simulator into UAV operations. The system is designed such that, during flight, the angular rate of the UAV is captured by an onboard inertial measurement unit (IMU) and relayed to the pilot controlling the vehicle from inside the motion simulator (see the first sketch below).

Efforts to further increase pilot SA led to the development of a mixed-reality chase view piloting interface. Chase view is similar to the view one would have when being towed behind the aircraft (see the second sketch below). It combines real-world onboard camera images with a virtual representation of the vehicle and the surrounding operating environment.

A series of UAV piloting experiments was performed using the training and evaluation systems described above. Subjects' behavioral performance while using the onboard camera view and the mixed-reality chase view interface during missions was analyzed. Subjects' cognitive workload during missions was also assessed, using subjective measures such as the NASA Task Load Index (see the third sketch below) and objective brain activity measurements from a functional near-infrared spectroscopy (fNIR) system. Behavioral analysis showed that the chase view interface improved pilot performance in near Earth flights and increased their situational awareness. fNIR analysis showed that a subject's cognitive workload was significantly lower while using the chase view interface.
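The first sketch shows one plausible structure for the angular-rate relay described above: the onboard side samples IMU body rates and streams them over UDP, and the ground side forwards each packet to the motion platform. This is a minimal sketch under assumed interfaces; the function names, addresses, and packet format (`read_gyro_rates`, `MotionPlatform.command_rates`, the host/port) are illustrative stand-ins, not the dissertation's actual implementation.

```python
# Hedged sketch of an IMU-to-motion-simulator angular-rate relay over UDP.
# All hardware interfaces here are hypothetical placeholders.
import socket
import struct

PACKET_FMT = "!3f"                  # roll, pitch, yaw rates (rad/s), network byte order
SIM_ADDR = ("192.168.1.50", 5005)   # assumed motion-simulator host and port

def relay_onboard(read_gyro_rates):
    """Onboard loop: sample the IMU and stream body-frame angular rates."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        p, q, r = read_gyro_rates()  # hypothetical IMU driver callable
        sock.sendto(struct.pack(PACKET_FMT, p, q, r), SIM_ADDR)

def receive_and_drive(platform, port=5005):
    """Ground loop: unpack each packet and cue the pilot with vehicle motion."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    while True:
        data, _ = sock.recvfrom(struct.calcsize(PACKET_FMT))
        p, q, r = struct.unpack(PACKET_FMT, data)
        platform.command_rates(p, q, r)  # hypothetical motion-platform API
```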
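The second sketch illustrates the geometric idea behind chase view: a virtual camera is placed at a fixed offset behind and above the vehicle and oriented back toward it, so the virtual scene rendered from that pose can be composited with the live onboard camera imagery. The offset values and the scipy-based pose math are assumptions for illustration only.

```python
# Hedged sketch: placing a virtual chase camera behind and above the vehicle.
import numpy as np
from scipy.spatial.transform import Rotation

# Assumed offset in the body frame (x forward, z up), in metres.
CHASE_OFFSET_BODY = np.array([-4.0, 0.0, 1.5])

def chase_camera_pose(vehicle_pos, vehicle_quat_xyzw):
    """Return the chase camera position and its unit viewing direction."""
    R = Rotation.from_quat(vehicle_quat_xyzw)            # body -> world rotation
    cam_pos = vehicle_pos + R.apply(CHASE_OFFSET_BODY)   # offset rotated into world frame
    look_dir = vehicle_pos - cam_pos                     # aim back at the vehicle
    return cam_pos, look_dir / np.linalg.norm(look_dir)

# Usage with an identity attitude: camera sits 4 m behind and 1.5 m above.
pos, look = chase_camera_pose(np.array([10.0, 5.0, 30.0]),
                              np.array([0.0, 0.0, 0.0, 1.0]))
```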
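The third sketch shows the standard NASA Task Load Index computation mentioned above: six subscale ratings (0-100) are weighted by tallies from 15 pairwise comparisons, so the weights sum to 15 and the overall score is a weighted mean. The ratings and weights below are made-up illustrative values, not data from the study.

```python
# Hedged sketch of standard NASA-TLX weighted workload scoring.
SUBSCALES = ("mental", "physical", "temporal", "performance", "effort", "frustration")

def tlx_score(ratings, weights):
    """Overall workload: weighted mean of the six subscale ratings."""
    assert sum(weights[s] for s in SUBSCALES) == 15, "weights come from 15 pairwise comparisons"
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15.0

# Illustrative (made-up) values for one subject and one mission:
ratings = {"mental": 70, "physical": 20, "temporal": 55,
           "performance": 40, "effort": 60, "frustration": 35}
weights = {"mental": 5, "physical": 1, "temporal": 3,
           "performance": 2, "effort": 3, "frustration": 1}
print(tlx_score(ratings, weights))  # -> 55.33 (approximately)
```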
Real-world flight tests were conducted in a near Earth environment with buildings and obstacles to evaluate the chase view interface with real-world data. The interface performed very well with real-world, real-time data in close-range scenarios.

The mixed-reality approaches presented follow studies on human factors performance and cognitive loading. The resulting designs serve as test beds for studying UAV pilot performance, creating training programs, and developing tools to augment UAV operations and minimize UAV accidents during operations in near Earth environments.

Ph.D., Mechanical Engineering -- Drexel University, 201
