
    Evaluation of Reorientation Techniques and Distractors for Walking in Large Virtual Environments

    Virtual Environments (VEs) that use a real-walking locomotion interface have typically been restricted in size to the area of the tracked lab space. Techniques proposed to lift this size constraint, enabling real walking in VEs that are larger than the tracked lab space, all require reorientation techniques (ROTs) in the worst-case situation: when a user is about to walk out of the tracked space. We propose a new ROT using visual and auditory distractors (objects in the VE that the user focuses on while the VE rotates) and compare our method to current ROTs through three user studies. ROTs using distractors were preferred and ranked as more natural by users. Users were also less aware of the rotating VE when ROTs with distractors were used. Our findings also suggest that improving visual realism and adding sound increased a user's feeling of presence.
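The trigger-and-rotate idea in the abstract above can be sketched in a few lines. This is an illustrative assumption, not the paper's implementation: the function names, the square tracked space centred at the origin, and the thresholds are all hypothetical.

```python
EDGE_THRESHOLD = 0.5   # metres from the tracked-space boundary (assumed)
ROTATION_RATE = 10.0   # degrees of VE yaw applied per second (assumed)

def distance_to_boundary(pos, half_extent):
    """Distance from the user's position to the nearest wall of a
    square tracked space centred at the origin."""
    return half_extent - max(abs(pos[0]), abs(pos[1]))

def reorientation_step(user_pos, ve_yaw, half_extent, dt, distractor_active):
    """One update tick: spawn a distractor when the user nears the
    boundary, then rotate the VE while their attention is on it."""
    if distance_to_boundary(user_pos, half_extent) < EDGE_THRESHOLD:
        distractor_active = True  # e.g. show a butterfly or play a sound
    if distractor_active:
        # Rotate the whole VE under the user; stopping once the desired
        # heading correction is reached is omitted for brevity.
        ve_yaw = (ve_yaw + ROTATION_RATE * dt) % 360.0
    return ve_yaw, distractor_active
```

The key property the studies test is that the rotation happens while the user tracks the distractor, which is what makes it less noticeable.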

    Optimizing Natural Walking Usage in VR using Redirected Teleportation

    Virtual Reality (VR) has come a long way since its inception, and with recent advancements in technology, high-end VR headsets are now commercially available. Although these headsets offer full motion tracking capabilities, locomotion in VR is yet to be fully solved due to space constraints, potential VR sickness, and problems with retaining immersion. Teleportation is the most popular locomotion technique in VR, as it allows users to safely navigate beyond the confines of the available positional tracking space without inducing VR sickness. It has been argued that teleportation does not facilitate natural walking input, which is considered to provide higher presence, because teleportation is faster, requires little physical effort, and uses little of the available tracking space. When a user walks to the edge of the tracking space, they must switch to teleportation. When navigating in the same direction, the available walking space does not increase, which forces users to remain stationary and continue using teleportation. We present redirected teleportation, a novel locomotion method that increases tracking space usage and natural walking input by subtle reorientation and repositioning of the user. We first analyzed the positional tendencies of users as they played popular games implementing teleportation and found the utilization of the tracking space to be limited. We then compared redirected teleportation with regular teleportation using a navigation task in three different environments. Analysis of our data shows that although redirected teleportation takes more time, users used significantly fewer teleports and more natural walking input while using more of the available tracking space. The increase in time is largely due to users walking more, which takes more time than using teleportation. Our results provide evidence that redirected teleportation may be a viable approach to increase the usage of natural walking input while decreasing the dependency on teleportation.
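The repositioning half of redirected teleportation can be sketched as re-registering the virtual world at each teleport so the user's physical position ends up with maximum walkable space ahead. This is a hedged, simplified sketch under assumptions (square tracked space centred at the origin, axis-aligned unit travel direction); the function names are hypothetical.

```python
def walkable_ahead(phys_pos, direction, half_extent):
    """Metres of tracked space in front of the user along an
    axis-aligned unit direction (simplified 1-D projection)."""
    coord = phys_pos[0] * direction[0] + phys_pos[1] * direction[1]
    return half_extent - coord

def redirected_offset(phys_pos, direction, half_extent):
    """Virtual-world offset that re-maps the user's physical position
    to the rear edge of the tracked space, so that walking in the
    travel direction can use the full space."""
    desired = (-direction[0] * half_extent, -direction[1] * half_extent)
    return (desired[0] - phys_pos[0], desired[1] - phys_pos[1])
```

Applying the offset at the moment of teleport hides the re-registration inside the scene transition, which is what keeps it subtle.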

    MetaSpace II: Object and full-body tracking for interaction and navigation in social VR

    MetaSpace II (MS2) is a social Virtual Reality (VR) system where multiple users can not only see and hear but also interact with each other, grasp and manipulate objects, walk around in space, and get tactile feedback. MS2 allows walking in physical space by tracking each user's skeleton in real-time, and allows users to feel by employing passive haptics, i.e., when users touch or manipulate an object in the virtual world, they simultaneously also touch or manipulate a corresponding object in the physical world. To enable these elements in VR, MS2 creates a correspondence in spatial layout and object placement by building the virtual world on top of a 3D scan of the real world. Through the association between the real and virtual world, users are able to walk freely while wearing a head-mounted device, avoid obstacles like walls and furniture, and interact with people and objects. Most current virtual reality (VR) environments are designed for a single-user experience where interactions with virtual objects are mediated by hand-held input devices or hand gestures. Additionally, users are only shown a representation of their hands in VR, floating in front of the camera as seen from a first-person perspective. We believe that representing each user as a full-body avatar that is controlled by natural movements of the person in the real world (see Figure 1d) can greatly enhance believability and a user's sense of immersion in VR.
    10 pages, 9 figures. Video: http://living.media.mit.edu/projects/metaspace-ii

    Real walking in virtual environments for factory planning and evaluation

    Nowadays, buildings and production facilities are designed using specialized design software, and building information modeling tools help to evaluate the resulting virtual mock-up. However, with current, primarily desktop-based tools it is hard to evaluate human factors of such a design, for instance spatial constraints for workforces. This paper presents a new tool for factory planning and evaluation based on virtual reality that allows designers, planning experts, and workforces to walk naturally and freely within a virtual factory. Therefore, designs can be checked as if they were real before anything is built.
    ISSN: 2212-827

    Natural Walking in Virtual Reality: A Review


    Using Locomotion Models for Estimating Walking Targets in Immersive Virtual Environments


    The use of embedded context-sensitive attractors for clinical walking test guidance in virtual reality

    Virtual reality is increasingly used in rehabilitation and can provide additional motivation when working towards therapeutic goals. However, a particular problem for patients is their ability to plan routes in unfamiliar environments. The aim of this study was therefore to explore how visual cues, namely embedded context-sensitive attractors, can guide attention and walking direction in VR for clinical walking interventions. The study used a butterfly as the embedded context-sensitive attractor to guide participant locomotion around the clinical figure-of-eight walk test, limiting the need for verbal instructions. We investigated the effect of varying the number of attractors on figure-of-eight path following, and whether there are any negative impacts on perceived autonomy or workload. A total of 24 participants took part in the study and completed six attractor conditions in a counterbalanced order. They also experienced a control VE (no attractors) at the beginning and end of the protocol. Each VE condition lasted 1 minute and manipulated the number of attractors (singular or multiple) alongside the placement of turning markers (virtual trees) used to represent the cones used in clinical settings for the figure-of-eight walk test. Results suggested that embedded context-sensitive attractors can be used to guide walking direction along a figure-of-eight path in VR without impacting perceived autonomy or workload. However, there appears to be a saturation point regarding the effectiveness of attractors: too few objects in a VE may reduce feelings of intrinsic motivation, and too many may reduce the effectiveness of attractors for guiding individuals along a figure-of-eight path. We conclude by indicating future research directions for attractors and their use as a guide for walking direction.
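The attractor-guidance idea above can be sketched as keeping the attractor a fixed lead ahead of the user's closest point on the target path. The lemniscate parametrisation, the brute-force search, and the lead value are all illustrative assumptions rather than the study's actual design.

```python
import math

def figure_eight(t, a=2.0):
    """Point on a figure-of-eight path (lemniscate of Gerono)."""
    return (a * math.sin(t), a * math.sin(t) * math.cos(t))

def closest_parameter(user_pos, samples=360):
    """Brute-force search for the path parameter nearest the user."""
    best_t, best_d = 0.0, float("inf")
    for i in range(samples):
        t = 2.0 * math.pi * i / samples
        x, y = figure_eight(t)
        d = (x - user_pos[0]) ** 2 + (y - user_pos[1]) ** 2
        if d < best_d:
            best_t, best_d = t, d
    return best_t

def attractor_position(user_pos, lead=0.4):
    """Place the attractor (e.g. the butterfly) slightly ahead of the
    user along the path, so following it traces the figure of eight."""
    return figure_eight(closest_parameter(user_pos) + lead)
```

As the user advances, the closest parameter increases and the attractor stays ahead, pulling the walking direction around the path without explicit instructions.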

    Navigating Immersive and Interactive VR Environments With Connected 360° Panoramas

    Emerging research is expanding the use of 360-degree spherical panoramas of real-world environments in 360 VR experiences beyond video and image viewing. However, most of these experiences are strictly guided, with few opportunities for interaction or exploration. There is a desire to develop experiences with cohesive virtual environments created with 360 VR that allow for choice in navigation, versus scripted experiences with limited interaction. Unlike standard VR with the freedom of synthetic graphics, there are challenges in designing appropriate user interfaces (UIs) for 360 VR navigation within the limitations of fixed assets. To tackle this gap, we designed RealNodes, a software system that presents an interactive and explorable 360 VR environment. We also developed four visual guidance UIs for 360 VR navigation. The results of a pilot study showed that the choice of UI had a significant effect on task completion times, with one of our methods, Arrow, performing best. Arrow also exhibited positive but non-significant trends in average measures of preference, user engagement, and simulator sickness. RealNodes, the UI designs, and the pilot study results contribute preliminary information to inspire future investigation of how to design effective explorable scenarios in 360 VR and visual guidance metaphors for navigation in applications using 360 VR environments.