
    Walking through a virtual environment improves perceived size within and beyond the walked space

    Distances tend to be underperceived in virtual environments (VEs) by up to 50%, whereas they tend to be perceived accurately in the real world. Previous work has shown that allowing participants to interact with the VE while receiving continual visual feedback can reduce this underperception. Judgments of virtual object size have been used to measure whether this improvement is due to the rescaling of perceived space, but there is disagreement within the literature as to whether judgments of object size benefit from interaction with feedback. This study contributes to that discussion by employing a more natural measure of object size. We also examined whether any improvement in virtual distance perception was limited to the space used for interaction (1–5 m) or extended beyond it (7–11 m). The results indicated that object size judgments do benefit from interaction with the VE, and that this benefit extends to distances beyond the explored space.
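
    As a rough illustration of how underperception of this kind is usually quantified, the Python sketch below computes the ratio of judged to actual distance; the values are hypothetical and are not data from this study.

        # Hypothetical judged vs. actual egocentric distances in metres;
        # a ratio of 1.0 is veridical, 0.5 corresponds to 50% underperception.
        actual = [1.0, 3.0, 5.0, 7.0, 9.0, 11.0]
        judged = [0.6, 1.7, 2.9, 4.0, 5.2, 6.3]

        ratios = [j / a for j, a in zip(judged, actual)]
        print(f"mean judged/actual ratio: {sum(ratios) / len(ratios):.2f}")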

    Viewing medium affects arm motor performance in 3D virtual environments

    Background: 2D and 3D virtual reality platforms are used for designing individualized training environments for post-stroke rehabilitation. Virtual environments (VEs) are viewed using media such as head-mounted displays (HMDs) and large screen projection systems (SPS), which can influence the quality of perception of the environment. We examined whether there were differences in arm pointing kinematics when subjects with and without stroke viewed a 3D VE through two different media: HMD and SPS.
    Methods: Two groups of subjects participated (healthy control, n = 10, aged 53.6 ± 17.2 yrs; stroke, n = 20, 66.2 ± 11.3 yrs). Arm motor impairment and spasticity were assessed in the stroke group, which was divided into mild (n = 10) and moderate-to-severe (n = 10) sub-groups based on Fugl-Meyer scores. Subjects pointed (8 times each) to 6 randomly presented targets located at two heights in the ipsilateral, middle and contralateral arm workspaces. Movements were repeated in the same VE viewed using the HMD (Kaiser XL50) and the SPS. Movement kinematics were recorded using an Optotrak system (Certus, 6 markers, 100 Hz). Upper limb motor performance (precision, velocity, trajectory straightness) and movement pattern (elbow and shoulder ranges, trunk displacement) outcomes were analyzed using repeated measures ANOVAs.
    Results: For all groups, there were no differences between the two media in endpoint trajectory straightness, shoulder flexion and shoulder horizontal adduction ranges, or sagittal trunk displacement. All subjects, however, made larger errors in the vertical direction using the HMD compared to the SPS. Healthy subjects also made larger errors in the sagittal direction, made slower movements overall, and used less range of elbow extension for the lower central target using the HMD compared to the SPS. The mild and moderate-to-severe sub-groups made larger RMS errors with the HMD. The only advantage of the HMD was that movements in the moderate-to-severe stroke sub-group were 22% faster than with the SPS.
    Conclusions: Despite the similarity in the majority of the movement kinematics, differences in movement speed and larger errors were observed for movements made using the HMD. The SPS may be a more comfortable and effective option for viewing VEs for upper limb rehabilitation post-stroke. This has implications for the use of VR applications to enhance upper limb recovery.
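
    The abstract does not give the exact formulas behind the kinematic outcomes; the sketch below shows one common way two of them (RMS endpoint error and a path-length straightness index) can be computed from motion-capture data, with synthetic samples standing in for the Optotrak recordings.

        import numpy as np

        def rms_endpoint_error(endpoints, target):
            # Root-mean-square distance between pointing endpoints and the target (metres).
            d = np.linalg.norm(np.asarray(endpoints) - np.asarray(target), axis=1)
            return np.sqrt(np.mean(d ** 2))

        def straightness_index(trajectory):
            # Path length divided by the straight-line (chord) distance; 1.0 is perfectly straight.
            traj = np.asarray(trajectory)
            path = np.sum(np.linalg.norm(np.diff(traj, axis=0), axis=1))
            return path / np.linalg.norm(traj[-1] - traj[0])

        # Synthetic 100 Hz endpoint trajectory (x, y, z in metres), for illustration only.
        t = np.linspace(0.0, 1.0, 100)[:, None]
        trajectory = t * np.array([0.30, 0.10, 0.20]) + 0.002 * np.random.randn(100, 3)
        print(straightness_index(trajectory))

        # RMS error of three hypothetical endpoints against a target at (0.30, 0.10, 0.20).
        print(rms_endpoint_error([[0.31, 0.09, 0.21], [0.28, 0.12, 0.19], [0.30, 0.10, 0.23]],
                                 [0.30, 0.10, 0.20]))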

    Comparison of Two Methods for Improving Distance Perception in Virtual Reality

    Distance is commonly underperceived in virtual environments (VEs) compared to real environments. Past work suggests that displaying a replica VE based on the real surrounding environment leads to more accurate judgments of distance, but that work has lacked the control conditions necessary to firmly draw this conclusion. Other research indicates that walking through a VE with visual feedback improves judgments of distance and size. This study evaluated and compared those two methods for improving perceived distance in VEs. All participants experienced a replica VE based on the real lab. In one condition, participants visually previewed the real lab prior to experiencing the replica VE, and in another condition they did not. Participants performed blind-walking judgments of distance and judgments of size in the replica VE before and after walking interaction. Distance judgments were more accurate in the preview condition than in the no-preview condition, but size judgments were unaffected by visual preview. Distance judgments and size judgments increased after walking interaction, and the improvement was larger for distance than for size judgments. After walking interaction, distance judgments did not differ based on visual preview, and walking interaction led to a larger improvement in judged distance than did visual preview. These data suggest that walking interaction may be more effective than visual preview as a method for improving perceived space in a VE.

    Tele-operation and Human Robots Interactions


    Recalibration of Perceived Distance in Virtual Environments Occurs Rapidly and Transfers Asymmetrically Across Scale

    Distance in immersive virtual reality is commonly underperceived relative to intended distance, causing virtual environments to appear smaller than they actually are. However, a brief period of interaction by walking through the virtual environment with visual feedback can cause dramatic improvement in perceived distance. The goal of the current project was to determine how quickly improvement occurs as a result of walking interaction (Experiment 1) and whether improvement is specific to the distances experienced during interaction or transfers across scales of space (Experiment 2). The results show that five interaction trials resulted in a large improvement in perceived distance, and that subsequent walking interactions showed continued but diminished improvement. Furthermore, interaction with near objects (1–2 m) improved distance perception for near but not far (4–5 m) objects, whereas interaction with far objects broadly improved distance perception for both near and far objects. These results have practical implications for ameliorating distance underperception in immersive virtual reality, as well as theoretical implications for distinguishing between theories of how walking interaction influences perceived distance.

    More than just perception-action recalibration: walking through a virtual environment causes rescaling of perceived space.

    Egocentric distances in virtual environments are commonly underperceived by up to 50% of the intended distance. However, a brief period of interaction in which participants walk through the virtual environment while receiving visual feedback can dramatically improve distance judgments. Two experiments were designed to explore whether the increase in postinteraction distance judgments is due to perception–action recalibration or to a rescaling of perceived space. Perception–action recalibration as a result of walking interaction should only affect action-specific distance judgments, whereas rescaling of perceived space should affect all distance judgments based on the rescaled percept. Participants made blind-walking distance judgments and verbal size judgments in response to objects in a virtual environment before and after interacting with the environment through either walking (Experiment 1) or reaching (Experiment 2). Size judgments were used to infer perceived distance under the assumption of size–distance invariance, and these served as an implicit measure of perceived distance. Preinteraction walking and size-based distance judgments indicated an underperception of egocentric distance, whereas postinteraction walking and size-based distance judgments both increased as a result of the walking interaction, indicating that walking through the virtual environment with continuous visual feedback caused a rescaling of the perceived space. However, interaction with the virtual environment through reaching had no effect on either type of distance judgment, indicating that physical translation through the virtual environment may be necessary for a rescaling of perceived space. Furthermore, the size-based and walking distance judgments were highly correlated, even across changes in perceived distance, providing support for the size–distance invariance hypothesis.
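
    The size-based inference rests on the standard size–distance invariance relation, in which perceived size equals perceived distance times the tangent of the object's visual angle; because the visual angle is fixed by the display, a size judgment implies a perceived distance. A minimal Python sketch of that inference follows; the numbers are illustrative and not the experiment's data.

        import math

        def inferred_distance(judged_size, actual_size, actual_distance):
            # Under size-distance invariance, perceived size S' = D' * tan(theta),
            # so a size judgment implies a perceived distance D' = S' / tan(theta).
            visual_angle = math.atan(actual_size / actual_distance)  # fixed by the display
            return judged_size / math.tan(visual_angle)

        # A 0.5 m object at 4 m judged to be 0.3 m wide implies a perceived distance of about 2.4 m.
        print(inferred_distance(judged_size=0.3, actual_size=0.5, actual_distance=4.0))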

    Rescaling of perceived space transfers across virtual environments.

    Research over the past 20 years has consistently shown that egocentric distance is underperceived in virtual environments (VEs) compared with real environments. In 2 experiments, judgments of object distance (Experiment 1) and object size (Experiment 2) improved after a brief period of walking through the VE with continuous visual feedback. Whereas improvement of blind-walking distance judgments could be attributable to recalibration of walking, improvement in perceived size is considered evidence for rescaling of perceived space, whereby perceived size and distance increased after walking interaction. Furthermore, improvements in judged distance and size transferred to a new VE. Distance judgments, but not size judgments, continued to improve after additional walking interaction in the new VE. These results have theoretical implications regarding the effects of walking interaction on perceived space, and practical implications regarding methods of improving perceived distance in VEs.

    Perceived Space in the HTC Vive

    Underperception of egocentric distance in virtual reality has been a persistent concern for almost 20 years. Modern head-mounted displays (HMDs) appear to have begun to ameliorate this underperception. The current study examined several aspects of perceived space in the HTC Vive. Blind-walking distance judgments, verbal distance judgments, and size judgments were measured in two distinct virtual environments (VEs), a high-quality replica of a real classroom and an empty grass field, as well as in the real classroom upon which the classroom VE was modeled. A brief walking interaction was also examined as an intervention for improving anticipated underperception in the VEs. Results from the Vive were compared to existing data from two older HMDs (nVisor SX111 and ST50). Blind-walking judgments were more accurate in the Vive than with the older displays, and did not differ substantially from the real world or across VEs. Size judgments were more accurate in the classroom VE than in the grass VE, and in the Vive compared to the older displays. Verbal judgments were significantly smaller in the classroom VE compared to the real classroom and did not significantly differ across VEs. Blind-walking and size judgments were more accurate after walking interaction, but verbal judgments were unaffected. The results indicate that underperception of distance in the HTC Vive is less than in older displays but has not yet been completely resolved. With the more accurate space perception afforded by modern HMDs, alternative methods for improving judgments of perceived space, such as walking interaction, may no longer be necessary.

    Cognitive Demands of Semi-Natural Virtual Locomotion

    There is currently no fully natural, general-purpose locomotion interface. Instead, interfaces such as gamepads or treadmills are required to explore large virtual environments (VEs). Furthermore, sensory feedback that would normally be used in real-world movement is often restricted in virtual reality (VR) due to constraints such as a reduced field of view (FOV). Accommodating these limitations with the locomotion interfaces afforded by most VR systems may induce cognitive demands on the user that are unrelated to the primary task to be performed in the VE. Users of VR systems often have many competing task demands, and additional cognitive demands during locomotion must compete for finite resources. Two studies were previously reported investigating the working memory demands imposed by semi-natural locomotion interfaces (Study 1) and by reduced sensory feedback (Study 2). This paper expands on the previously reported results and adds discussion linking the two studies. The results indicated that locomotion with a less natural interface increases spatial working memory demands, and that locomotion with a lower FOV increases general attentional demands. These findings are discussed in terms of their practical implications for the selection of locomotion interfaces when designing VEs.
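
    The abstract does not name the specific working-memory or attention measures used; one common way such demands are expressed is as a dual-task cost, sketched below with hypothetical scores (the function and values are illustrative assumptions, not the studies' method).

        def dual_task_cost(single_task_score, dual_task_score):
            # Proportional drop in secondary-task performance when it is performed
            # while locomoting, relative to performing it alone.
            return (single_task_score - dual_task_score) / single_task_score

        # e.g., a hypothetical spatial span of 6.0 alone vs. 4.5 while steering with a gamepad.
        print(dual_task_cost(6.0, 4.5))  # 0.25 -> 25% dual-task cost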