Perceived Self Motion in Virtual Acoustic Space Facilitated by Passive Whole-Body Movement

Abstract

Presented at the 14th International Conference on Auditory Display (ICAD2008), June 24-27, 2008, Paris, France.

When moving sound sources are displayed to a listener in a manner consistent with the listener's own motion through an environment populated by stationary sound sources, listeners may instead perceive the sources as moving relative to a fixed listening position, rather than experiencing their own self motion (i.e., a change in their listening position). The likelihood that auditory cues produce such self motion (also known as auditory-induced vection) can be greatly facilitated by coordinated passive movement of the listener's whole body, achieved by positioning listeners on a multi-axis motion platform controlled in synchrony with a spatial auditory display. In this study, the temporal synchrony between passive whole-body motion and auditory spatial information was investigated using a multimodal time-order judgment task. For the spatial trajectories taken by the sound sources presented here, the observed interaction between passive whole-body motion and sound source motion clearly depended upon the peak velocity reached by the moving sound sources. The results suggest that sensory integration of auditory motion cues with whole-body movement cues can occur over an increasing range of intermodal delays as virtual sound sources move more slowly through the space near the listener's position. Furthermore, for the coordinated motion presented in the current study, asynchrony was relatively easy for listeners to tolerate when the peak in whole-body motion occurred earlier in time than the peak in virtual sound source velocity, but quickly became intolerable when the peak in whole-body motion occurred after the sound sources reached their peak velocities.
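The temporal manipulation described above can be pictured as two velocity profiles, one for the platform and one for the virtual sound source, offset by a controllable delay between their peaks. The sketch below, in Python, is purely illustrative and is not taken from the study: the function names, the bell-shaped velocity profile, and the parameter values (1 m/s peak velocity, 4 s trial duration, 100 Hz sampling) are all assumptions made only to show how a platform-versus-source peak offset (SOA) might be parameterized for a single time-order judgment trial.

"""Illustrative sketch (not from the paper): building one time-order
judgment trial in which the passive whole-body (platform) velocity
profile is offset from the virtual sound-source velocity profile by a
chosen stimulus-onset asynchrony (SOA). All names and values here are
assumptions for illustration only."""

import numpy as np


def velocity_profile(t, t_peak, peak_velocity, width):
    """Smooth bell-shaped velocity profile (m/s) peaking at t_peak."""
    return peak_velocity * np.exp(-0.5 * ((t - t_peak) / width) ** 2)


def make_trial(soa_s, peak_velocity_mps=1.0, duration_s=4.0, fs_hz=100.0):
    """Return a time base plus platform and sound-source velocity traces.

    soa_s > 0 -> the platform velocity peak occurs before the
                 sound-source velocity peak (the ordering the abstract
                 reports as easier for listeners to tolerate).
    soa_s < 0 -> the platform peak occurs after the source peak.
    """
    t = np.arange(0.0, duration_s, 1.0 / fs_hz)
    source_peak_time = duration_s / 2.0
    platform_peak_time = source_peak_time - soa_s

    source_v = velocity_profile(t, source_peak_time, peak_velocity_mps, 0.4)
    platform_v = velocity_profile(t, platform_peak_time, peak_velocity_mps, 0.4)
    return t, platform_v, source_v


if __name__ == "__main__":
    # Example: platform motion peak leads the sound-source peak by 250 ms.
    t, platform_v, source_v = make_trial(soa_s=0.25)
    print("platform peak at %.2f s, source peak at %.2f s"
          % (t[platform_v.argmax()], t[source_v.argmax()]))

In an experiment of the kind the abstract describes, such an SOA parameter would be varied across trials (and the source's peak velocity varied across conditions) while listeners report which modality appeared to move first; the sketch only shows how the stimulus timing could be expressed, not how the study actually generated its stimuli.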
