    Where am I in virtual reality?

    It is currently not well understood whether people experience themselves to be located in one or more specific part(s) of their body. Virtual reality (VR) is increasingly used as a tool to study aspects of bodily perception and self-consciousness, due to its strong experimental control and the ease with which it can manipulate multi-sensory aspects of bodily experience. To investigate where people self-locate in their body within virtual reality, we asked participants wearing a VR headset to point directly at themselves with a virtual pointer. In previous work employing a physical pointer, participants mainly located themselves in the upper face and upper torso. In this study, using a VR headset, participants mainly located themselves in the upper face. In an additional body-template task, where participants pointed at themselves on a picture of a simple body outline, participants pointed most often to the upper torso, followed by the (upper) face. These results raise the question of whether head-mounted virtual reality might alter where people locate themselves, making them more “head-centred”.

    A Communication Task in HMD Virtual Environments: Speaker and Listener Movement Improves Communication

    In this paper we present an experiment that investigates the influence of animated real-time self-avatars in immersive virtual environments on a communication task. We further investigate the influence of 1st- and 3rd-person perspectives and of tracking the speaker and the listener. We find that people perform best in our communication task when both the speaker and the listener have an animated self-avatar and when the speaker is in the 3rd-person perspective. The more people move, the better they perform in the communication task. These results suggest that when two people in a virtual environment are animated, they do use gestures to communicate.

    Circular, linear, and curvilinear vection in a large-screen virtual environment with floor projection

    Vection is defined as the compelling sensation of illusory self-motion elicited by a moving sensory, usually visual, stimulus. This paper presents introspective, user-discomfort, and perceived-speed data for the experience of linear, circular, and curvilinear vection in a large-screen, immersive, virtual environment. As a first step, we evaluated the effectiveness of a floor projection on the perception of vection for four trajectories: linear forward, linear backward, circular left, and circular right. The floor projection, which considerably extended the field of view, was found to significantly improve the introspective measures of linear, but not circular, vection experienced in a photo-realistic three-dimensional town. In a second study, we investigated the differences between 12 different motion trajectories in the illusion of self-motion. In this study we found that linear translations to the left and right are perceived as the least convincing, while linear downward motion is perceived as the most convincing of the linear trajectories. Second, we found that while linear forward vection is not perceived as very convincing, curvilinear forward vection is reported to be as convincing as circular vection. In a third and final experiment, we investigated the perceived speed for all trajectories and used simulator sickness questionnaires to compute a discomfort factor associated with each type of trajectory. Based on our experimental results, we offer suggestions for increasing the sense of self-motion in simulators and VE applications: specifically, to increase the number of curvilinear trajectories (as opposed to linear ones) and, if possible, to add floor projection in order to improve the illusory sense of self-motion.

    Body size perception in stroke patients with paresis

    Recent studies have suggested that people’s intent and ability to act can also influence their perception of their bodies’ peripersonal space. Vice versa, one could assume that the inability to reach toward and grasp an object might have an impact on the subject’s perception of reaching distance. Here we tested this prediction by investigating body size and action capability perception in neurological patients suffering from arm paresis after stroke, comparing 32 right-brain-damaged patients (13 with left-sided arm paresis without additional spatial neglect, 10 with left-sided arm paresis and additional spatial neglect, and 9 with neither arm paresis nor neglect) and 27 healthy controls. Nineteen of the right-hemisphere stroke patients could be re-examined about five months after the initial injury. Arm length was estimated with three different methodological approaches: explicit visual, explicit tactile/proprioceptive, and implicit reaching. The results fulfilled the working hypothesis: patients with arm paresis indeed perceived their bodies differently. We found a transient overestimation of the length of the contralesional, paretic arm after stroke. Body size and action capability perception for the extremities thus indeed seem to be tightly linked in humans. Copyright: © 2021 Shahvaroughi-Farahani et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

    Joint and individual walking in an immersive collaborative virtual environment

    The aim of this experiment was to determine to what extent humans optimize their walking behavior under different conditions while navigating a virtual maze. In two conditions, participants walked either individually or jointly, connected by a physical stretcher they carried together. The results showed that the extra effort arising from the task-required cooperation was split evenly within the group, even though the sensory feedback about the physical and social environment differed significantly between the leader (who, e.g., was not able to see the follower) and the follower (who, e.g., was able to see the leader). These results might indicate the emergence of a joint body: a phenomenon in which two individual action-perception loops are tuned toward each other in order to optimize a common goal.

    Egocentric distance judgments in a large screen display immersive virtual environment

    People underestimate egocentric distances in head-mounted display virtual environments compared to estimates made in the real world. Our work investigates whether distances are still compressed in a large-screen-display immersive virtual environment, where participants are able to see their own body surrounded by the virtual environment. We conducted our experiment both in the real world, using a real room, and in the large-screen-display immersive virtual environment, using a 3D model of the same room. Our results showed a significant underestimation of verbal reports of egocentric distances in the large-screen-display immersive virtual environment, while the distance judgments in the real world were closer to veridical. Moreover, we observed a significant effect of distance in both environments. In the real world, closer distances were slightly underestimated, while farther distances were slightly overestimated. In contrast to the real world, in the virtual environment participants overestimated closer distances (up to 2.5 m) and underestimated distances farther than 3 m. A possible reason for this effect of distance in the virtual environment may be that participants perceived stereo cues differently when the target was projected on the floor versus on the front of the large screen.