
    Which Ocular Dominance Should Be Considered for Monocular Augmented Reality Devices?

    A monocular augmented reality device allows the user to see information that is superimposed on the environment. As it does not stimulate both eyes in the same way, it creates a phenomenon known as binocular rivalry. The question therefore arises as to whether monocular information should be displayed to a particular eye and whether an ocular dominance test can determine which one. This paper contributes to a better understanding of ocular dominance by comparing nine tests. Our results suggest that ocular dominance can be divided into sighting and sensorial dominance. However, different sensorial dominance tests give different results, suggesting that it is composed of distinct components that are assessed by different tests. There is a need for a comprehensive test that considers all of these components, in order to identify the eye to which monocular information should be directed when using monocular augmented reality devices.

    Challenges and considerations for in-flight monitoring of pilots and crews

    Human functional state assessment research has employed neurological, physiological and behavioral monitoring for several decades, but few real-world applications have emerged in safety systems. For instance, physiological monitoring of flight crews is done experimentally but is generally not available for normal operations, despite safety incentives. This presentation will address critical challenges in the research and development of monitoring solutions, and how they can be overcome. We will consider three applications: health monitoring in exploration-class space mission crews; vigilance monitoring in civilian commercial airline crews; and pilot state assessment in military flight training.

    Avionics Touch Screen in Turbulence: Simulation for Design (Part 2: Results)

    The ubiquity of consumer-market touch screens has driven the avionics industry to launch in-depth evaluations of touch screens for cockpit integration. This paper is a follow-up to our ISAP 2015 paper, where a methodology for turbulence simulation design was discussed. One of the challenges was to verify touch screen compatibility with in-flight use under turbulent conditions ranging from light to severe. The avionics industry recognized early on the need to mitigate this usability risk, and the results of our evaluations enabled us to define recommendations for our HMI designs. Using our validated turbulence profiles, basic touch screen interaction performance was analyzed, and this paper focuses on the results gathered using our turbulence simulator.
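    The kind of performance analysis mentioned above amounts to aggregating logged touch trials by turbulence level. The sketch below is a minimal, hypothetical Python illustration: the record layout, the metrics (hit rate, touch offset) and the values are assumptions, not the paper's actual dataset or evaluation protocol.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical touch log: (turbulence_level, target_hit, touch_offset_mm).
# Values are illustrative only.
touch_log = [
    ("light", True, 1.8), ("light", True, 2.1), ("light", False, 6.4),
    ("moderate", True, 3.0), ("moderate", False, 7.9), ("moderate", True, 3.5),
    ("severe", False, 11.2), ("severe", True, 5.8), ("severe", False, 9.7),
]

def summarize(log):
    """Compute hit rate and mean touch offset for each turbulence level."""
    by_level = defaultdict(list)
    for level, hit, offset in log:
        by_level[level].append((hit, offset))
    return {
        level: {
            "hit_rate": mean(hit for hit, _ in rows),
            "mean_offset_mm": mean(offset for _, offset in rows),
        }
        for level, rows in by_level.items()
    }

for level, stats in summarize(touch_log).items():
    print(f"{level:>8}: hit rate {stats['hit_rate']:.0%}, "
          f"mean offset {stats['mean_offset_mm']:.1f} mm")
```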

    Avionics Touch Screen in Turbulence: Simulation for Design

    As touch screens are ubiquitous in the consumer market, Thales has launched in-depth evaluations of their introduction into the cockpit. One of the challenges is to verify their compatibility with in-flight use under turbulence conditions ranging from light and moderate to severe. In-flight accelerometer data collections were performed to provide a baseline for choosing between possible simulation solutions. Thales recognized early on the need for such a tool, as it would enable us to define recommendations for our HMI designs. The objectives were, first, to validate specific complex touch gestures exploiting the full potential of touch interactions for novel cockpit Human Machine Interfaces and, second, to investigate the various physical anchoring solutions capable of facilitating touch screen interactions in turbulent aeronautical environments. Given the 6-axis accelerometer profiles that were collected, a number of candidate simulation platforms were selected. They were reviewed in terms of performance and cost. Our final candidate is a hexapod structure capable of reproducing those profiles with acceptable validity. This paper presents the work that enabled us to validate such a hexapod as a viable simulator for our tests, as well as the development of an avionics platform for touch interactions under light to severe turbulence. Pilots were asked to evaluate six simulated profiles designed to mimic the in-flight references. Tests were performed to validate the best profiles for each level of turbulence. The selected profiles were then used to evaluate our touch screen proposals in light, moderate and severe turbulent conditions. Preliminary results are presented.
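    As an illustration of how a simulated hexapod profile could be compared against an in-flight 6-axis accelerometer reference, the sketch below computes a per-axis RMS comparison in Python. This is a deliberately simplified assumption: the paper's actual validation relied on pilot evaluations of the simulated profiles, and the function names, sampling rate and values here are hypothetical.

```python
import numpy as np

def rms_per_axis(samples: np.ndarray) -> np.ndarray:
    """Root-mean-square value for each of the 6 axes (3 linear + 3 angular).
    `samples` has shape (n_samples, 6)."""
    return np.sqrt(np.mean(samples ** 2, axis=0))

def relative_rms_error(reference: np.ndarray, simulated: np.ndarray) -> float:
    """Mean relative difference between the per-axis RMS of an in-flight
    reference profile and a simulated hexapod profile (lower is closer)."""
    ref_rms = rms_per_axis(reference)
    sim_rms = rms_per_axis(simulated)
    return float(np.mean(np.abs(sim_rms - ref_rms) / np.maximum(ref_rms, 1e-9)))

# Illustrative use with synthetic 100 Hz data (one minute per profile);
# the per-axis scales below are made up, not measured turbulence levels.
rng = np.random.default_rng(0)
reference = rng.normal(scale=[1.2, 0.8, 2.5, 0.10, 0.10, 0.05], size=(6000, 6))
simulated = rng.normal(scale=[1.1, 0.9, 2.3, 0.10, 0.10, 0.05], size=(6000, 6))
print(f"relative RMS error: {relative_rms_error(reference, simulated):.1%}")
```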

    A Cognitive Engineering Approach for Showing Feasibility Margins on an In-Flight Planning

    The purpose of the ASAP (Anticipation Support for Aeronautical Planning) project was to design an anticipation support for civilian pilots. In this context, we undertook a cognitive engineering approach. Interviews with pilots and in-situ task analyses were performed. Helping pilots anticipate better can consist of showing them the room for maneuver for every task to be performed. First, an activity model serves as a basis for describing these tasks during a specific phase of flight (descent/approach). It confirms the need for a visual representation of temporal feasibility margins. Based on the constraints of the flight plan (e.g. speed, altitude…), our algorithm dynamically computes the local extreme values of every main flight variable. The tasks performed within this so-defined multivariate tunnel are guaranteed to meet the flight path requirements. The design process and the outline of our algorithm are presented in this paper. Directions for evaluating such an algorithm are discussed.
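    To make the idea of a temporal feasibility margin concrete, the sketch below computes, for a single descent task, the time window during which the descent can still be started while meeting an altitude constraint at a waypoint. This is a simplified Python illustration under assumed kinematics (constant ground speed, constant descent rate); it is not the ASAP algorithm, and all names and numbers are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class DescentTask:
    distance_nm: float       # distance remaining to the constrained waypoint (NM)
    altitude_loss_ft: float  # altitude to lose before reaching the waypoint (ft)
    max_speed_kt: float      # maximum allowed ground speed (kt)
    max_descent_fpm: float   # maximum acceptable descent rate (ft/min)

def feasibility_margin(task: DescentTask) -> tuple[float, float]:
    """Return (earliest_start, latest_start) in minutes from now for beginning
    the descent. A negative latest_start means the altitude constraint can no
    longer be met even at the maximum descent rate."""
    # Conservative time to the waypoint: assume the fastest allowed speed.
    time_to_wpt_min = task.distance_nm / task.max_speed_kt * 60.0
    # Time needed to lose the required altitude at the maximum descent rate.
    descent_duration_min = task.altitude_loss_ft / task.max_descent_fpm
    return 0.0, time_to_wpt_min - descent_duration_min

# Illustrative use: 80 NM to go, 20,000 ft to lose, up to 320 kt, 2,500 ft/min.
task = DescentTask(80.0, 20_000.0, 320.0, 2_500.0)
print(feasibility_margin(task))  # -> (0.0, 7.0): descent must begin within 7 min
```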

    Cooperative Perception in Multisensor Environment: the Case of the Tigre Helicopter

    How do Tigre pilots build a coherent situation awareness (SA) of the night world through their multiple sensors (IR and I2)? These sensors create numerous opportunities for pilots to misunderstand each other because of differences in field of view (FOV), wavelength spectrum and point of view. After a brief review of the literature on how operators build an SA of the world, we present the field project developed to analyze the impact of sensor commonality and diversity. Realistic situations are recorded to observe how Tigre pilots create a mental model of the situation and develop a collaborative strategy. Crew co-construction of meaning is considered through verbal and Human Machine Interface (HMI) mediated exchanges.
