
    Distortion of auditory space during visually induced self-motion in depth

    Perception of self-motion is based on the integration of multiple sensory inputs, in particular from the vestibular and visual systems. Our previous study demonstrated that vestibular linear acceleration information distorted auditory space perception [Teramoto et al., 2012, PLoS ONE, 7(6): e39402]. However, it is unclear whether this phenomenon is contingent on vestibular signals or whether it can be caused by inputs from other sensory modalities involved in self-motion perception. Here, we investigated whether visual linear self-motion information can also alter auditory space perception. Large-field visual motion was presented to induce self-motion perception with constant accelerations (Experiment 1) and a constant velocity (Experiment 2) either in a forward or backward direction. During participants' experience of self-motion, a short noise burst was delivered from one of the loudspeakers aligned parallel to the motion direction along a wall to the left of the listener. Participants indicated from which direction the sound was presented, forward or backward, relative to their coronal (i.e., frontal) plane. Results showed that the sound position aligned with the subjective coronal plane was significantly displaced in the direction of self-motion, especially in the backward self-motion condition as compared with a no-motion condition. These results suggest that self-motion information, irrespective of its origin, is crucial for auditory space perception.
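    To make the forward/backward judgement measure concrete, the sketch below fits a logistic psychometric function to hypothetical response proportions and reads the subjective coronal plane off as the point of subjective equality (PSE). This is not the study's analysis code; the speaker positions, proportions, and condition labels are invented for illustration only.

# Minimal sketch (not the authors' analysis code): estimating the subjective
# coronal plane from forward/backward judgements with a logistic psychometric
# function. All data values below are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, pse, slope):
    # Probability of a "forward" response as a function of speaker position x (cm).
    # pse is the position judged forward and backward equally often,
    # i.e. the subjective coronal plane.
    return 1.0 / (1.0 + np.exp(-slope * (x - pse)))

# Hypothetical speaker positions along the wall (cm; 0 = objective coronal plane)
positions = np.array([-30, -20, -10, 0, 10, 20, 30], dtype=float)

# Hypothetical proportions of "forward" responses in two conditions
p_forward_static  = np.array([0.02, 0.10, 0.30, 0.50, 0.72, 0.90, 0.97])
p_forward_vection = np.array([0.05, 0.20, 0.45, 0.68, 0.85, 0.95, 0.99])

popt_static, _  = curve_fit(logistic, positions, p_forward_static,  p0=[0.0, 0.2])
popt_vection, _ = curve_fit(logistic, positions, p_forward_vection, p0=[0.0, 0.2])
pse_static, pse_vection = popt_static[0], popt_vection[0]

# A PSE shift between conditions indicates that the subjective coronal plane,
# and hence auditory space, was displaced during the induced self-motion.
print(f"PSE, no-motion condition: {pse_static:+.1f} cm")
print(f"PSE, vection condition:   {pse_vection:+.1f} cm")
print(f"PSE displacement:         {pse_vection - pse_static:+.1f} cm")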

    Sound Localization in the Coexistence of Visually Induced Self-Motion and Vestibular Information

    During movement, the position of a sound object relative to an observer continuously changes. Nevertheless, the sound source position can be localized. Thus, self-motion information can possibly be used to perceive a stable sound space. We investigated the effect of self-motion perception induced by visual stimuli (i.e., vection) and/or vestibular information on sound localization. To induce vection, random dots moving laterally on a wide screen were presented. For presentation of vestibular stimuli, a three-degrees-of-freedom (3-DOF) motion platform that tilts to the right or left was employed. Sound stimuli were presented from behind the screen while the observer perceived self-motion induced by the visual stimuli and/or the platform. The observer's task was to point to the position of the sound image on the screen. Experimental results showed that the perceived sound position shifted in the direction opposite to the perceived self-motion induced by the visual information, regardless of the direction of the vestibular information. Moreover, this tendency was observed only on the side of the median sagittal plane corresponding to the direction of the visual motion. Thus, auditory spatial perception may be changed by self-motion arising from the coexistence of visually induced self-motion and vestibular information.
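    As an illustration of how such a localization shift can be quantified, the sketch below computes the mean pointing error separately for the two sides of the median sagittal plane. The azimuths, pointing responses, vection direction, and the simple sign-based geometry are assumptions made for this example, not the study's data or analysis.

# Minimal sketch (hypothetical data, not the study's): mean sound-localization
# shift on each side of the median sagittal plane during lateral vection.
import numpy as np

# Hypothetical true and pointed azimuths in degrees (negative = left of midline)
true_az    = np.array([-40, -20, -10,  10,  20,  40], dtype=float)
pointed_az = np.array([-44, -23, -12,   9,  19,  39], dtype=float)

# Assumed direction of perceived self-motion (+1 = rightward); the visual dots
# would then move leftward, so the "visual-motion side" is the left side.
vection_dir = +1

shift = pointed_az - true_az          # negative values = shift to the left
visual_side = np.sign(true_az) == -vection_dir  # side matching the dot motion

print("mean shift, visual-motion side:", shift[visual_side].mean())
print("mean shift, opposite side:     ", shift[~visual_side].mean())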

    The Effect of Reaction Force Feedback on Object-Insert Work in Virtual Reality Environment

    It has been reported that teleoperation efficiency using stereoscopic video images is inferior to working efficiency with the naked eye in real environments. One cause is the insufficient amount of information available to operators. A large improvement in working efficiency has been shown when tactile feedback is given to operators in addition to visual information. However, it is not clear how reaction-force information contributes to this improvement. In this report, we investigated the effect of the amount of reaction-force feedback under sufficient and insufficient visual-information conditions. Subjects were instructed to insert cylinders into holes in blocks in a virtual reality environment. The results showed that force feedback shortened work completion time and reduced the number of errors. However, there were no significant differences in either work completion time or number of errors among the different amounts of reaction force. This result suggests that the tactile cue is effective in improving task performance in teleoperation.
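    As an illustration of the kind of test behind the "no significant difference among feedback amounts" statement, the sketch below runs a one-way ANOVA on hypothetical completion times for three reaction-force levels. The group names, sample sizes, and values are invented, not the study's dataset.

# Minimal sketch (illustrative data only): comparing work completion times
# across different amounts of reaction-force feedback with a one-way ANOVA.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical completion times (s) per trial for three force-feedback gains
times_low    = rng.normal(loc=12.0, scale=2.0, size=20)
times_medium = rng.normal(loc=11.8, scale=2.0, size=20)
times_high   = rng.normal(loc=11.9, scale=2.0, size=20)

f_stat, p_value = stats.f_oneway(times_low, times_medium, times_high)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
# A p-value above .05 here would be consistent with the report that the amount
# of force feedback did not significantly change completion time, even though
# having force feedback at all improved performance.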