
    Effects of Visual Interaction Methods on Simulated Unmanned Aircraft Operator Situational Awareness

    The limited field of view of the static egocentric visual displays employed in unmanned aircraft control introduces the soda straw effect, which significantly affects operators' ability to capture and maintain situational awareness because peripheral visual data are not depicted. Insufficient operator situational awareness increases the potential for error and oversight during operation of unmanned aircraft, leading to accidents and mishaps that cost United States taxpayers between $4 million and $54 million per year. The purpose of this quantitative experimental, completely randomized design study was to examine and compare dynamic eyepoint control with static visual interaction in a simulated stationary egocentric environment to determine which, if either, resulted in higher situational awareness. The theoretical framework for the study established the premise that the amount of visual information available can affect an operator's situational awareness and that increasing visual information through dynamic eyepoint manipulation may result in higher situational awareness than static visualization. Four experimental dynamic visual interaction methods (analog joystick, head tracker, uninterrupted hat/point-of-view switch, and incremental hat/point-of-view switch) were examined and compared to a single static method (the control treatment). The five methods were used in experimental testing with 150 participants to determine whether use of a dynamic eyepoint significantly increased a user's situational awareness within a stationary egocentric environment, which would indicate that employing dynamic control could reduce the occurrence or consequences of the soda straw effect. The primary difference between the four dynamic visual interaction methods was how each manipulated the pitch and yaw of the simulated eyepoint. Identifying that dynamic visual interaction increases user situational awareness (SA) may lead to further refinement of human-machine interface (HMI), teleoperation, and unmanned aircraft control principles, along with the pursuit and performance of related research.
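    The abstract does not describe implementation details, but the distinction it draws between the interaction methods (continuous-rate joystick slewing, direct head-tracked orientation, and hat/point-of-view stepping versus a fixed static view) can be illustrated with a minimal sketch. The Python code below is hypothetical; the rate limits, angular ranges, and step sizes are assumed values, not parameters from the study.

```python
import math

# Assumed look limits for the simulated eyepoint (illustrative only).
PITCH_LIMIT = math.radians(60)
YAW_LIMIT = math.radians(90)

def clamp(value, limit):
    return max(-limit, min(limit, value))

def joystick_update(pitch, yaw, stick_x, stick_y, dt, rate=math.radians(45)):
    """Analog joystick: the eyepoint slews continuously at a rate
    proportional to stick deflection (stick axes in [-1, 1])."""
    return (clamp(pitch + stick_y * rate * dt, PITCH_LIMIT),
            clamp(yaw + stick_x * rate * dt, YAW_LIMIT))

def head_tracker_update(head_pitch, head_yaw):
    """Head tracker: the eyepoint follows the measured head orientation."""
    return clamp(head_pitch, PITCH_LIMIT), clamp(head_yaw, YAW_LIMIT)

def uninterrupted_hat_update(pitch, yaw, hat_x, hat_y, dt, rate=math.radians(30)):
    """Uninterrupted hat/POV switch: the eyepoint slews at a constant
    rate for as long as the switch is held (hat axes in {-1, 0, 1})."""
    return (clamp(pitch + hat_y * rate * dt, PITCH_LIMIT),
            clamp(yaw + hat_x * rate * dt, YAW_LIMIT))

def incremental_hat_update(pitch, yaw, hat_x, hat_y, step=math.radians(5)):
    """Incremental hat/POV switch: each press steps the eyepoint by a
    fixed angular increment."""
    return (clamp(pitch + hat_y * step, PITCH_LIMIT),
            clamp(yaw + hat_x * step, YAW_LIMIT))

def static_view():
    """Control treatment: fixed forward eyepoint, no operator adjustment."""
    return 0.0, 0.0
```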

    A Remote Test Pilot Control Station for Unmanned Research Aircraft

    First-person-view ground control stations are an alternative that overcomes the drawbacks of an external remote pilot operating with direct visual line of sight during flight-testing of unmanned aircraft systems. In this paper, a remote test pilot control station with first-person view for advanced flight-testing is presented. The remote test pilot control station is developed for the German Aerospace Center's ALAADy (Automated Low Altitude Air Delivery) demonstrator aircraft, a gyroplane with a maximum take-off mass of 450 kg. The paper focuses on the system design of the remote test pilot control station, which has to overcome three major challenges: fault tolerance and reliability of the system, the pilot's situational and spatial awareness, and latency. The remote test pilot control station is evaluated by pilot-in-the-loop simulations within a dedicated simulation environment. Objective performance criteria as well as subjective pilot ratings based on the Cooper-Harper rating scale are used to assess the control station for the ALAADy demonstrator in direct mode and flight-controller-assisted mode. The simulation results show that pilots with experience in manned gyroplanes can consistently control the ALAADy demonstrator with the remote test pilot control station under ideal windless conditions. However, in more challenging crosswind conditions, pilot-induced oscillations can be observed in direct mode.
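    For context on the subjective assessment mentioned above, the Cooper-Harper scale assigns handling-qualities ratings from 1 to 10, which are conventionally grouped into three levels. The sketch below encodes that conventional grouping in Python; it is background on the scale itself, not the paper's analysis or its aggregation of pilot ratings.

```python
def cooper_harper_level(rating: int) -> str:
    """Map a Cooper-Harper rating (1-10) to the conventional
    handling-qualities level grouping."""
    if not 1 <= rating <= 10:
        raise ValueError("Cooper-Harper ratings range from 1 to 10")
    if rating <= 3:
        return "Level 1: satisfactory without improvement"
    if rating <= 6:
        return "Level 2: deficiencies warrant improvement"
    if rating <= 9:
        return "Level 3: deficiencies require improvement"
    return "Rating 10: control will be lost, improvement mandatory"
```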