
    Crew Resource Management for Automated Teammates (CRM-A)

    Crew Resource Management (CRM) is the application of human factors knowledge and skills to ensure that teams make effective use of all available resources. This includes ensuring that pilots solicit the opinions of other teammates and utilize their unique capabilities. CRM was originally developed 40 years ago in response to a number of airline accidents in which the crew was found to be at fault; the goal was to improve teamwork among airline cockpit crews. The notion of "team" was later expanded to include cabin crew and ground resources, and CRM has since been adopted by other industries, most notably medicine. Automation research now faces issues similar to those aviation faced 40 years ago: how to create a more robust system by making full use of both the automation and its human operators. With advances in machine intelligence, processing speed, and cheap, plentiful memory, automation has advanced to the point that it can and should be treated as a teammate in order to take full advantage of its capabilities and contributions to the system. This area of research is known as Human-Autonomy Teaming (HAT). Research on HAT has identified reusable patterns that can be applied in a wide range of applications, including features such as bi-directional communication and working agreements. This paper explores the synergies between CRM and HAT. We believe that HAT research has much to learn from CRM, and that there are benefits to expanding CRM to cover automation.

    Assessing the effect of sound complexity on the audiotactile cross-modal dynamic capture task.

    Neurophysiological and behavioural evidence now shows that audiotactile interactions are more pronounced for complex auditory stimuli than for pure tones. In the present study, we examined the effect of varying the complexity of auditory stimuli (i.e., noise vs. pure tone) on participants' performance in the audiotactile cross-modal dynamic capture task. Participants discriminated the direction of a target apparent motion stream (tactile or auditory) while trying to ignore the direction of a distractor stream presented simultaneously in the other sensory modality (auditory or tactile). On each trial, the distractor stream could be either spatiotemporally congruent or incongruent with respect to the target stream. The results showed that sound complexity modulated performance: noise distractors decreased the accuracy of tactile direction judgements, while judgements of the direction of noise bursts were facilitated relative to pure tones. Although auditory direction judgements were overall more accurate for noise targets than for pure-tone targets, sound complexity failed to modulate the tactile capture of auditory targets. These results provide the first demonstration of enhanced audiotactile interactions involving complex (vs. pure tone) auditory stimuli in the peripersonal space around the hands; previously, these effects had only been reported in the space around the head.