4,005 research outputs found

    Pain modulation by illusory body rotation: A new way to disclose the interaction between the vestibular system and pain processing

    Background Clinical and experimental evidence advocates a structural and functional link between the vestibular and other sensory systems. For instance, visuo‐vestibular and vestibular–somatosensory interactions have been widely reported. However, whether visual inputs carrying vestibular information can modulate pain is not yet clear. Recent evidence using natural vestibular stimulation or moving visual stimuli points to an unspecific effect of distraction. Methods Using immersive virtual reality (VR), we created a new way to prompt the vestibular system through the vision of static visual cues and studied its possible interaction with pain. Twenty‐four healthy participants were visually immersed in a virtual room that could appear with five different degrees of rotation about the sagittal axis: towards the right, towards the left, or with no rotation. Participants' heat pain thresholds and subjective reports of perceived body rotation, sense of presence and attention were measured. Results 'Being' in a tilted room induced the sensation of body rotation in our participants, even though they always remained in an upright position. We also found that rotating the visual scenario can modulate participants' pain thresholds, producing a significant increase when a left tilt is displayed. In addition, a positive correlation between perceived body-midline rotation and pain threshold was found when the virtual room was tilted 15 degrees toward the left. Importantly, all VR conditions were found to be equally distracting. Conclusions Vestibular information present in static visual cues can modulate experimentally induced acute pain in a side-dependent manner, bypassing supramodal attentional mechanisms. These findings may help refine pain-management approaches based on multimodal stimulation.
Significance This study explored how the visualization of static environments in immersive virtual reality can modulate pain thresholds through activation of the vestibular system. Immersion in rotated virtual environments led to the illusory sensation of body rotation, and this sensation was related to a modulation of pain perception. Possible analgesic effects due to distraction could be ruled out. These results expand our current knowledge of how the visual, vestibular and somatosensory (pain) systems interact. These findings may influence future pain-treatment strategies based on multisensory stimulation.
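The correlation reported in this abstract (perceived body-midline rotation vs. heat-pain threshold) is a standard Pearson analysis. A minimal sketch of how such an analysis is computed, with entirely invented sample values (the study's actual data are not given here):

```python
# Hypothetical sketch of the correlation analysis described in the abstract:
# each participant's perceived body-midline rotation (deg) is correlated
# with their heat-pain threshold (deg C) using Pearson's r.
# All numbers below are invented for illustration only.
from scipy.stats import pearsonr

perceived_rotation = [1.0, 2.0, 2.5, 3.5, 4.5, 5.0, 6.0, 7.0]            # deg, invented
pain_threshold     = [43.8, 44.1, 44.3, 44.6, 45.0, 45.3, 45.9, 46.2]    # deg C, invented

r, p = pearsonr(perceived_rotation, pain_threshold)
print(f"r = {r:.2f}, p = {p:.4f}")
```

A positive r with a small p-value would correspond to the side-dependent modulation the authors describe for the 15-degree left tilt condition.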

    The contribution of closed loop tracking control of motion platform on laterally induced postural instability of the drivers at SAAM dynamic simulator

    This paper examines the effect of closed-loop motion-platform control, compared to a static condition, on postural instability in driving simulators. The postural instability of the participants (N=18; 15 male and 3 female) was measured as the lateral displacement of the body centre of pressure (YCP) just before and after each driving session via a balance platform. After the experiments were completed, the two-tailed Mann-Whitney U test was applied to the objective data for the post-exposure cases only. The analysis revealed that YCP for the dynamic case was significantly lower than for the static condition (U(18), p < 0.0001). It can be concluded that closed-loop tracking control of the simulator's hexapod platform (the dynamic platform condition) significantly decreased lateral postural stability compared to the static operating condition. However, the two-tailed Mann-Whitney U test showed no significant difference between the two conditions in terms of psychophysical perception.
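The post-exposure comparison described above is a two-sample, two-tailed Mann-Whitney U test on centre-of-pressure displacements. A hedged sketch of that test, with invented YCP values (the paper's actual measurements are not reproduced here):

```python
# Hypothetical sketch of the post-exposure analysis described in the abstract:
# lateral centre-of-pressure displacements (YCP) under the static and dynamic
# platform conditions are compared with a two-tailed Mann-Whitney U test.
# The sample values are invented for illustration only.
from scipy.stats import mannwhitneyu

ycp_static  = [4.1, 3.8, 5.0, 4.4, 3.9, 4.7, 4.2, 4.9, 3.6]   # mm, invented
ycp_dynamic = [7.9, 8.4, 6.8, 9.1, 7.2, 8.8, 7.5, 9.3, 8.0]   # mm, invented

u_stat, p_value = mannwhitneyu(ycp_dynamic, ycp_static, alternative="two-sided")
print(f"U = {u_stat}, p = {p_value:.4f}")
```

The Mann-Whitney U test is the appropriate nonparametric choice here because postural-sway data are typically not normally distributed and the sample is small.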

    The effects of substitute multisensory feedback on task performance and the sense of presence in a virtual reality environment

    Objective and subjective measures of performance in virtual reality environments increase as more sensory cues are delivered and as simulation fidelity increases. Some cues (colour or sound) are easier to present than others (object weight, vestibular cues), so substitute cues can be used to enhance the informational content of a simulation at the expense of simulation fidelity. This study evaluates how substituting cues in one modality with alternative cues in another modality affects subjective and objective performance measures in a highly immersive virtual reality environment. Participants performed a wheel change in a virtual reality (VR) environment. Auditory, haptic and visual cues signalling critical events in the simulation were manipulated in a factorial design. Subjective ratings were recorded via questionnaires. The time taken to complete the task was used as an objective performance measure. The results show that participants performed best, and felt an increased sense of immersion and involvement (collectively referred to as 'presence'), when substitute multimodal sensory feedback was provided. Significant main effects of audio and tactile cues on task performance and on participants' subjective ratings were found. A significant negative relationship was found between the objective (overall completion times) and subjective (ratings of presence) performance measures. We conclude that increasing informational content, even if it disrupts fidelity, enhances performance and the user's overall experience. On this basis we advocate the use of substitute cues in VR environments as an efficient method to enhance performance and user experience.

    Motion sickness evaluation and comparison for a static driving simulator and a dynamic driving simulator

    This paper deals with driving simulation and, in particular, with the important issue of motion sickness. It proposes a methodology for evaluating objective illness-rating metrics, deduced from the motion sickness dose value, together with questionnaires, for both a static and a dynamic simulator. Accelerations of the vestibular cues (head movements) of the subjects were recorded with and without motion-platform activation. To compare user experiences in both cases, the head-dynamics-related illness ratings were computed from the recorded accelerations and the motion sickness dose values. For the subjective analysis, principal component analysis (PCA) was used to determine the conflict between the subjective assessments in the static and dynamic conditions; PCA showed a consistent difference between the answers given in the sickness questionnaire for the static platform case and those for the dynamic platform case. The two-tailed Mann-Whitney U test was used to assess the significance of the differences between the self-reports for the individual questions: nausea (p = 0.019 < 0.05) and dizziness (p = 0.018 < 0.05) decreased significantly from the static case to the dynamic case, and eye strain (p = 0.047 < 0.05) and tiredness (p = 0.047 < 0.05) were likewise significantly reduced. For the perception-fidelity analysis, the Pearson correlation with a 95% confidence interval was used to study the correlations of each question with the x, y and z illness-rating components (IRx, IRy, IRz) and the compound illness rating IRtot.
    The results showed that longitudinal head dynamics were the main element inducing discomfort for the static platform, whereas vertical head movements were the main factor provoking discomfort for the dynamic platform. Also, for the dynamic platform, lateral vestibular-level dynamics were the major element causing a feeling of fear.
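The motion sickness dose value (MSDV) that the illness ratings above are derived from is, per ISO 2631-1, the square root of the time integral of squared frequency-weighted acceleration. A simplified sketch of that computation (the W_f frequency-weighting filter is omitted for brevity, and the head-acceleration signal is invented for illustration):

```python
# Hedged sketch of an MSDV computation in the spirit of ISO 2631-1:
# MSDV = sqrt( integral of a_w(t)^2 dt ), where a_w is the frequency-weighted
# acceleration. The weighting filter is omitted here, and the acceleration
# trace is a synthetic sinusoid, so this is illustrative only.
import math

def msdv(accel, dt):
    """Motion sickness dose value (m/s^1.5) from acceleration samples (m/s^2)."""
    return math.sqrt(sum(a * a for a in accel) * dt)

dt = 0.01  # 100 Hz sampling, assumed
# 60 s of 0.5 Hz lateral head acceleration, 0.3 m/s^2 amplitude (invented)
accel_x = [0.3 * math.sin(0.5 * math.tau * i * dt) for i in range(6000)]
print(f"MSDV_x = {msdv(accel_x, dt):.3f} m/s^1.5")
```

In the study's framework such per-axis dose values would feed the component illness ratings IRx, IRy and IRz, which are then combined into IRtot.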

    I Am The Passenger: How Visual Motion Cues Can Influence Sickness For In-Car VR

    This paper explores the use of VR Head Mounted Displays (HMDs) in-car and in-motion for the first time. Immersive HMDs are becoming everyday consumer items and, as they offer new possibilities for entertainment and productivity, people will want to use them during travel in, for example, autonomous cars. However, their use is confounded by motion sickness, caused in part by the restricted visual perception of motion conflicting with physically perceived vehicle motion (accelerations and rotations detected by the vestibular system). Whilst VR HMDs restrict visual perception of motion, they could also render it virtually, potentially alleviating sensory conflict. To study this problem, we conducted the first on-road, in-motion study to systematically investigate the effects of various visual presentations of the real-world motion of a car on the sickness and immersion of passengers wearing VR HMDs. We established new baselines for in-car VR motion sickness and found that there is no single best presentation with respect to balancing sickness and immersion. Instead, user preferences suggest that different solutions are required for differently susceptible users to make in-car VR usable. This work provides formative insights for VR designers and an entry point for further research into enabling the use of VR HMDs, and the rich experiences they offer, when travelling.

    The contributions of visual flow and locomotor cues to walked distance estimation in a virtual environment

    Traversed distance perception involves estimating the extent of self-motion as one travels from one position in space to another. As such, it is a multimodal experience in which information from both visual flow and locomotor cues (i.e. proprioceptive, efference-copy and vestibular cues) jointly specifies the magnitude of self-motion. While recent evidence has demonstrated the extent to which each of these cues can be used independently to estimate traversed distance, relatively little is known about how they are integrated when simultaneously present. Evaluating multimodal cue integration in the context of dynamic locomotor behaviour is important for understanding both self-motion perception and perceptual-motor coupling in real and virtual environments.

    Can Galvanic Vestibular Stimulation Reduce Simulator Adaptation Syndrome?

    Electrical stimulation of the vestibular sensory system during virtual-environment simulations has been proposed as a method to reduce the incidence of simulator adaptation syndrome (SAS). However, there is limited empirical evidence to support this hypothesis. Providing vestibular stimulation is especially important in driving simulators because an absence of vestibular cues may alter driver behaviour and reduce vehicle control. This study examined the application of galvanic vestibular stimulation (GVS) as a technique to reduce symptoms of SAS and improve vehicular control in a fixed-base driving simulator. Nineteen participants drove through two visually distinct virtual environments (high and low visual cues). Each of these worlds was experienced with and without GVS. Post-drive scores on the Simulator Sickness Questionnaire (SSQ) were used to evaluate the effect of GVS on SAS. In addition, three driving variables were measured to examine driving performance: steering variability, lane departures, and average vehicular speed. Applying GVS while driving resulted in significant decreases in total SSQ and disorientation symptoms. Greater vehicular control was also observed (as shown by reduced steering variability) when GVS was used in combination with visual cues along the simulated edge of the road. These results suggest that GVS may be used in fixed-base driving simulators to create vestibular motion cues and reduce SAS.

    Angular relation of axes in perceptual space

    The geometry of perceptual space needs to be known to model spatial orientation constancy or to create virtual environments. To examine one main aspect of this geometry, the angular relations between the three spatial axes were measured. The experiments consisted of a perceptual task in which subjects were asked to set their apparent vertical and horizontal planes independently. The visual background provided no other stimuli to serve as optical direction cues. The task was performed in a number of different body-tilt positions, with pitch and roll varied in steps of 30 deg. The results clearly show a distortion of the orthogonality of perceptual space for non-upright body positions. Large interindividual differences were found, and deviations from orthogonality of up to 25 deg were detected in both the pitch and the roll directions. Implications of this non-orthogonality for further studies of spatial perception and for the construction of virtual environments for human interaction are also discussed.

    MetaSpace II: Object and full-body tracking for interaction and navigation in social VR

    MetaSpace II (MS2) is a social Virtual Reality (VR) system in which multiple users can not only see and hear but also interact with each other, grasp and manipulate objects, walk around in space, and receive tactile feedback. MS2 allows walking in physical space by tracking each user's skeleton in real time, and it lets users feel objects by employing passive haptics, i.e., when users touch or manipulate an object in the virtual world, they simultaneously touch or manipulate a corresponding object in the physical world. To enable these elements in VR, MS2 creates a correspondence in spatial layout and object placement by building the virtual world on top of a 3D scan of the real world. Through this association between the real and virtual worlds, users are able to walk freely while wearing a head-mounted device, avoid obstacles like walls and furniture, and interact with people and objects. Most current VR environments are designed for a single-user experience in which interactions with virtual objects are mediated by hand-held input devices or hand gestures, and users are only shown a representation of their hands floating in front of the camera as seen from a first-person perspective. We believe that representing each user as a full-body avatar, controlled by the natural movements of the person in the real world (see Figure 1d), can greatly enhance believability and a user's sense of immersion in VR. Comment: 10 pages, 9 figures. Video: http://living.media.mit.edu/projects/metaspace-ii

    Perception Of Visual Speed While Moving

    During self-motion, the world normally appears stationary. In part, this may be due to reductions in visual motion signals during self-motion. In 8 experiments, the authors used magnitude estimation to characterize changes in visual speed perception resulting from biomechanical self-motion alone (treadmill walking), physical translation alone (passive transport), and both biomechanical self-motion and physical translation together (walking). Their results show that each factor alone produces a subtractive reduction in perceived visual speed, but that the subtraction is greatest with both factors together, approximating the sum of the two separately. The similarity of results for biomechanical and passive self-motion supports H. B. Barlow's (1990) inhibition theory of sensory correlation as a mechanism for implementing H. Wallach's (1987) compensation for self-motion.