
Effect of attention on multimodal cue integration

Abstract

Humans gather information about their environment from multiple sensory channels. Cues from separate sensory modalities (e.g. vision and haptics) appear to be combined in a statistically optimal way according to a maximum-likelihood estimator [1]. Ernst and Banks showed that for bimodal perceptual estimates, the weight attributed to one sensory channel changes when its relative reliability is modified by increasing the noise associated with its signal. Because increasing the attentional load of a given sensory channel is likely to change its reliability, we assume that such a modification would also alter the weights of the different cues in multimodal perceptual estimates. Here we examine this hypothesis using a dual-task paradigm. Subjects’ main task is to estimate the size of a raised bar using vision alone, haptics alone, or both modalities combined. Their performance in the main task alone is compared to their performance when an additional visual ‘distractor’ task is carried out simultaneously with the main task (dual-task paradigm). We found that vision-based estimates are more affected by a visual ‘distractor’ than haptics-based estimates are. Our findings substantiate that attention influences the weighting of the different sensory channels in multimodal perceptual estimates: when attention is drawn away from the visual modality, the haptic estimates are weighted more heavily in visual-haptic size discrimination. In further experiments, we will examine the influence of a haptic ‘distractor’ task. We would expect a haptic ‘distractor’ to interfere to a greater extent with the haptics-based estimates in the main task, while the vision-based estimates should be less affected. We will then examine whether cue integration remains statistically optimal under these conditions.
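For reference, the maximum-likelihood rule of [1] combines the unimodal size estimates as a reliability-weighted average. The following is a minimal sketch in the conventional notation (the estimates \hat{S}_V, \hat{S}_H and noise variances \sigma_V^2, \sigma_H^2 are the standard symbols, not defined in this abstract):

\[
\hat{S}_{VH} = w_V \hat{S}_V + w_H \hat{S}_H,
\qquad
w_V = \frac{\sigma_H^2}{\sigma_V^2 + \sigma_H^2},
\qquad
w_H = \frac{\sigma_V^2}{\sigma_V^2 + \sigma_H^2}
\]

Under this rule, a visual ‘distractor’ that increases \sigma_V^2 lowers w_V, so the combined estimate shifts toward the haptic cue; this is precisely the reweighting the dual-task manipulation is meant to probe.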
