9 research outputs found

    Recognizing Pilot State: Enabling Tailored In-Flight Assistance Through Machine Learning

    Motivation: Moving towards the highly controversial single-pilot cockpit, more and more automation capabilities are being added to today’s airliners [1]. However, to operate safely without a pilot monitoring, avionics systems in future cockpits will have to intelligently assist the remaining pilot. One critical enabler of proper assistance is a reliable classification of the pilot’s state, both in normal conditions and, more critically, in abnormal situations such as an equipment failure. Only with a good assessment of the pilot’s state can the cockpit adapt to the pilot’s current needs, e.g. alert the pilot, adapt displays, take over tasks, or monitor procedures [2]

    Towards Pilot-Aware Cockpits



    The effect of flight phase on electrodermal activity and gaze behavior: A simulator study

    Current advances in airplane cockpit design and layout are often driven by a need to improve the pilot's awareness of the aircraft's state. This involves improving the flow of information from aircraft to pilot. However, providing the aircraft with information on the pilot's state remains an open challenge. This work takes a first step towards determining the pilot's state based on biosensor data. We conducted a simulator study to record participants' electrodermal activity and gaze behavior, indicating pilot state changes during three distinct flight phases in an instrument failure scenario. The results show a significant difference in these psychophysiological measures between a phase of regular flight, the incident phase, and a phase with an additional troubleshooting task after the failure. The differences in the observed measures suggest great potential for a pilot-aware cockpit that can provide assistance based on the sensed pilot state.
    ISSN: 0003-687
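    The abstract above compares psychophysiological measures across three within-subject flight phases. As a minimal sketch of how such a repeated-measures comparison can be run, the following implements a Friedman test (a nonparametric analog of repeated-measures ANOVA) on illustrative electrodermal-activity values; the data and the choice of test are assumptions for illustration, not the study's actual analysis.

    ```python
    # Hedged sketch: Friedman test over three within-subject conditions.
    # Values are illustrative microsiemens readings, not the study's data.

    def friedman_chi_square(samples):
        """samples: list of k equal-length sequences, one per condition.
        Returns the Friedman chi-square statistic (assumes no ties within a subject)."""
        k = len(samples)            # number of conditions (here: 3 flight phases)
        n = len(samples[0])         # number of participants
        rank_sums = [0.0] * k
        for i in range(n):
            # Rank this participant's values across the k conditions (1 = smallest).
            row = [samples[j][i] for j in range(k)]
            order = sorted(range(k), key=lambda j: row[j])
            for rank, j in enumerate(order, start=1):
                rank_sums[j] += rank
        return 12.0 / (n * k * (k + 1)) * sum(r * r for r in rank_sums) - 3 * n * (k + 1)

    regular      = [2.1, 1.9, 2.3, 2.0, 1.8, 2.2, 2.0, 1.9]  # regular flight
    incident     = [2.9, 2.6, 3.1, 2.8, 2.5, 3.0, 2.7, 2.6]  # instrument failure
    troubleshoot = [3.4, 3.1, 3.6, 3.2, 3.0, 3.5, 3.3, 3.1]  # failure + extra task

    chi2 = friedman_chi_square([regular, incident, troubleshoot])
    print(chi2)  # compare against the chi-square critical value for df = k - 1 = 2
    ```

    With df = 2, a statistic above 5.99 indicates a significant phase effect at the 5% level, mirroring the kind of phase difference the abstract reports.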

    Evaluating and Comparing Airspace Structure Visualisation and Perception on Digital Aeronautical Charts

    Given the challenge of visualising 3D space on a 2D map, maps used by pilots for in-flight navigation should be designed especially carefully. Based on existing aeronautical charts, this paper studies the visualisation, interaction, and interpretation of airspace structures together with aviation infrastructure and the base map. We first developed a three-tiered evaluation grid for a cartographic analysis of existing aeronautical charts. Subsequently, we evaluated four countries’ maps based on our evaluation grid. To validate our analysis, we conducted a user study with 27 pilots, the users of aeronautical charts. The results of our cartographic analysis show that aeronautical charts produced by different countries all fulfil the pilots' need to orient themselves. According to our evaluation, the Swiss aeronautical chart scored slightly more favourably for effective map-reading than the other evaluated charts. These findings were confirmed by the results of the user study. The major contribution of this work is the evaluation grid for the cartographic analysis. With its different layers and adaptable main and sub-topics, it can be used to compare and improve the design not only of aeronautical charts but of a broad spectrum of thematic maps.

    FlyBrate: Evaluating Vibrotactile Cues for Simulated Flight

    Contemporary aircraft cockpits rely mostly on audiovisual information propagation, which can overwhelm particularly novice pilots. The introduction of tactile feedback, as a less taxed modality, can improve usability in this case. As part of a within-subject simulator study, 22 participants were asked to fly a visual-flight-rules scenario along a predefined route and identify objects in the outside world that serve as waypoints. Participants flew two similar scenarios with and without a tactile belt that indicates the route. Results show that with the belt, participants perform better in identifying objects, give higher usability and user experience ratings, and report a lower perceived cognitive workload, while showing no improvement in spatial awareness. Moreover, 86% of the participants state that they prefer flying with the tactile belt. These results suggest that a tactile belt provides pilots with an unobtrusive mode of assistance for tasks that require orientation using cues from the outside world.
    ISSN: 1044-7318
    ISSN: 1532-759
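    One plausible way such a belt could encode route direction (a sketch under assumptions, not the paper's actual implementation) is to map the bearing to the next waypoint, relative to the aircraft's current heading, onto one of several equally spaced belt motors:

    ```python
    # Hedged sketch: select which belt motor vibrates to indicate the route.
    # Motor 0 sits at the front of the belt; motors are numbered clockwise.
    # Function name and motor layout are assumptions for illustration.

    def belt_motor(heading_deg, bearing_to_waypoint_deg, n_motors=8):
        # Bearing to the waypoint relative to the nose, in [0, 360).
        relative = (bearing_to_waypoint_deg - heading_deg) % 360.0
        sector = 360.0 / n_motors
        # Shift by half a sector so motor 0 is centred on "dead ahead".
        return int(((relative + sector / 2) % 360.0) // sector)

    print(belt_motor(0, 0))     # waypoint dead ahead -> front motor 0
    print(belt_motor(0, 90))    # waypoint to the right -> motor 2
    print(belt_motor(350, 10))  # 20 degrees right of the nose -> still motor 0
    ```

    Centring each sector on its motor avoids the cue flickering between two motors when the waypoint is almost exactly ahead.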

    Real time eye gaze tracking for human machine interaction in the cockpit

    The aeronautics industry has pioneered safety features, from digital checklists to moving maps that improve pilot situational awareness and support safe ground movements. Today, pilots deal with increasingly complex cockpit environments and air traffic densification. Here we present an intelligent vision system that allows real-time human–machine interaction in the cockpit to reduce the pilot’s workload. The challenges for such a vision system include extreme changes in background light intensity, a large field of view, and variable working distances. Adapted hardware and the use of state-of-the-art computer vision techniques and machine learning algorithms for eye gaze detection allow a smooth and accurate real-time feedback system. The current system has been over-specified to explore optimized solutions for different use cases. The algorithmic pipeline for eye gaze tracking was developed and iteratively optimized to obtain the speed and accuracy required for the aviation use cases. The pipeline, a combination of data-driven and analytical approaches, runs in real time at 60 fps with a latency of about 32 ms. The eye gaze estimation error was evaluated in terms of the point-of-regard distance error with respect to the 3D point location. An average error of less than 1.1 cm was achieved over 28 gaze points representing the cockpit instruments, placed about 80-110 cm from the participants’ eyes. The angular gaze deviation is below 1° for the panels for which accurate eye gaze was required by the use cases.
    ISSN: 0277-786
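    The reported distance and angular errors are related by simple trigonometry: the angular deviation is the arctangent of the point-of-regard error over the viewing distance. A quick check (using only the figures quoted in the abstract) confirms that a 1.1 cm error at 80-110 cm is consistent with the sub-1° claim:

    ```python
    import math

    # Convert a point-of-regard distance error into an angular gaze deviation.
    def angular_error_deg(por_error_cm, viewing_distance_cm):
        return math.degrees(math.atan2(por_error_cm, viewing_distance_cm))

    # At the near and far ends of the reported 80-110 cm working distance:
    print(angular_error_deg(1.1, 80.0))   # ~0.79 degrees
    print(angular_error_deg(1.1, 110.0))  # ~0.57 degrees
    ```

    Both values fall below 1°, matching the angular deviation the abstract reports for the accuracy-critical panels.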