MetaSpace II: Object and full-body tracking for interaction and navigation in social VR
MetaSpace II (MS2) is a social Virtual Reality (VR) system where multiple
users can not only see and hear but also interact with each other, grasp and
manipulate objects, walk around in space, and get tactile feedback. MS2 allows
walking in physical space by tracking each user's skeleton in real-time and
allows users to feel by employing passive haptics i.e., when users touch or
manipulate an object in the virtual world, they simultaneously also touch or
manipulate a corresponding object in the physical world. To enable these
elements in VR, MS2 creates a correspondence in spatial layout and object
placement by building the virtual world on top of a 3D scan of the real world.
Through the association between the real and virtual world, users are able to
walk freely while wearing a head-mounted device, avoid obstacles like walls and
furniture, and interact with people and objects. Most current virtual reality
(VR) environments are designed for a single user experience where interactions
with virtual objects are mediated by hand-held input devices or hand gestures.
Additionally, users are only shown a representation of their hands in VR
floating in front of the camera as seen from a first person perspective. We
believe, representing each user as a full-body avatar that is controlled by
natural movements of the person in the real world (see Figure 1d), can greatly
enhance believability and a user's sense of immersion in VR.
Comment: 10 pages, 9 figures. Video:
http://living.media.mit.edu/projects/metaspace-ii
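The correspondence between real and virtual object placement that passive haptics depends on can be sketched as a simple alignment check. This is a hypothetical illustration; the function name, tolerance, and coordinate convention are assumptions, not details from the paper:

```python
import math

def passive_haptics_aligned(virtual_pos, physical_pos, tolerance_m=0.02):
    """Return True if a virtual object sits close enough to its physical
    counterpart (e.g. located via the 3D scan) for touch feedback to feel
    consistent. Positions are (x, y, z) in metres; the 2 cm tolerance is
    an assumed value for illustration only."""
    return math.dist(virtual_pos, physical_pos) <= tolerance_m
```

In a system like MS2, a check of this kind would run once per scanned object when the virtual world is built on top of the scan, flagging any prop whose virtual stand-in has drifted from its physical location.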
Real walking in virtual environments for factory planning and evaluation
Nowadays, buildings and production facilities are designed using specialized design software, and building information modeling tools help to evaluate the resulting virtual mock-up. However, with current, primarily desktop-based tools it is hard to evaluate human factors of such a design, for instance spatial constraints for the workforce. This paper presents a new tool for factory planning and evaluation based on virtual reality that allows designers, planning experts, and workers to walk naturally and freely within a virtual factory. Designs can therefore be checked as if they were real before anything is built. ISSN: 2212-827
LoCoMoTe – a framework for classification of natural locomotion in VR by task, technique and modality
Virtual reality (VR) research has provided overviews of locomotion techniques, how they work, their strengths, and their overall user experience. Considerable research has investigated new methodologies, particularly machine learning, to develop redirection algorithms. To best support the development of redirection algorithms through machine learning, we must understand how best to replicate human navigation and behaviour in VR, which can be supported by the accumulation of results produced through live-user experiments. However, it can be difficult to identify, select, and compare relevant research without a pre-existing framework in an ever-growing research field. Therefore, this work aimed to facilitate the ongoing structuring and comparison of the VR-based natural walking literature by providing a standardised framework for researchers to utilise. We applied thematic analysis to study methodology descriptions from 140 VR-based papers that contained live-user experiments. From this analysis, we developed the LoCoMoTe framework with three themes: navigational decisions, technique implementation, and modalities. The LoCoMoTe framework provides a standardised approach to structuring and comparing experimental conditions. The framework should be continually updated to categorise and systematise knowledge, aid in identifying research gaps, and support discussion.
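As a hypothetical illustration of how a framework like this could be used to structure and compare studies, a paper's experimental conditions might be tagged under the three themes. The concrete tag values below are invented for the example, not drawn from LoCoMoTe itself:

```python
from dataclasses import dataclass, field

@dataclass
class LocomotionStudy:
    """A study tagged under the three LoCoMoTe themes; the tag sets
    hold free-form strings for illustration."""
    title: str
    navigational_decisions: set = field(default_factory=set)
    technique_implementation: set = field(default_factory=set)
    modalities: set = field(default_factory=set)

def shared_conditions(a: LocomotionStudy, b: LocomotionStudy) -> dict:
    """Compare two studies theme by theme to surface overlapping
    experimental conditions."""
    return {
        "navigational_decisions": a.navigational_decisions & b.navigational_decisions,
        "technique_implementation": a.technique_implementation & b.technique_implementation,
        "modalities": a.modalities & b.modalities,
    }
```

Structuring records this way makes the comparison step mechanical: two studies are directly comparable wherever their per-theme tag sets intersect.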
Inattentional Blindness for Redirected Walking Using Dynamic Foveated Rendering
Redirected walking is a Virtual Reality (VR) locomotion technique which
enables users to navigate virtual environments (VEs) that are spatially larger
than the available physical tracked space. In this work we present a novel
technique for redirected walking in VR based on the psychological phenomenon of
inattentional blindness. Based on the user's visual fixation points we divide
the user's view into zones. Spatially-varying rotations are applied according
to the zone's importance and are rendered using foveated rendering. Our
technique is real-time and applicable to small and large physical spaces.
Furthermore, the proposed technique does not require the use of stimulated
saccades but rather takes advantage of naturally occurring saccades and blinks
for a complete refresh of the framebuffer. We performed extensive testing and
present the analysis of three user studies conducted for the evaluation.
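The zone-based idea can be sketched as an eccentricity-dependent rotation gain: the region around the fixation point is left untouched, while the periphery, where inattentional blindness masks the manipulation, is rotated more strongly. This is a minimal sketch; the zone radii, gain range, and linear ramp are assumptions, not the paper's actual parameters:

```python
def zone_rotation_gain(view_dir_deg, fixation_deg,
                       foveal_radius=10.0, peripheral_radius=30.0,
                       min_gain=1.0, max_gain=1.2):
    """Return a rotation gain for a view direction based on its angular
    distance (eccentricity) from the user's fixation point. Assumed
    values: a 10-degree untouched foveal zone, full redirection beyond
    30 degrees, and a linear ramp in between."""
    eccentricity = abs(view_dir_deg - fixation_deg)
    if eccentricity <= foveal_radius:
        return min_gain   # foveal zone: no added rotation
    if eccentricity >= peripheral_radius:
        return max_gain   # far periphery: maximum redirection
    # linear ramp between the foveal and peripheral zones
    t = (eccentricity - foveal_radius) / (peripheral_radius - foveal_radius)
    return min_gain + t * (max_gain - min_gain)
```

In a foveated renderer, a gain of this kind would be evaluated per zone each frame, with naturally occurring saccades and blinks providing the moments at which the accumulated manipulation can be reset.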
Locomotion in virtual reality in full space environments
Virtual Reality is a technology that allows the user
to explore and interact with a virtual environment in
real time as if they were there. It is used in various
fields such as entertainment, education, and medicine
due to its immersion and ability to represent reality.
Still, there are problems such as virtual simulation
sickness and lack of realism that make this technology
less appealing. Locomotion in virtual environments is
one of the main factors responsible for an immersive and
enjoyable virtual reality experience. Several methods
of locomotion have been proposed, however, these
have flaws that end up negatively influencing the
experience. This study compares natural locomotion in
complete spaces with joystick locomotion and natural
locomotion in impossible spaces through three tests
in order to identify the best locomotion method in
terms of immersion, realism, usability, spatial knowledge
acquisition and level of virtual simulation sickness. The
results show that natural locomotion is the method
that most positively influences the experience when
compared to the other locomotion methods.
ARC: Alignment-based Redirection Controller for Redirected Walking in Complex Environments
We present a novel redirected walking controller based on alignment that
allows the user to explore large and complex virtual environments, while
minimizing the number of collisions with obstacles in the physical environment.
Our alignment-based redirection controller, ARC, steers the user such that
their proximity to obstacles in the physical environment matches the proximity
to obstacles in the virtual environment as closely as possible. To quantify a
controller's performance in complex environments, we introduce a new metric,
Complexity Ratio (CR), to measure the relative environment complexity and
characterize the difference in navigational complexity between the physical and
virtual environments. Through extensive simulation-based experiments, we show
that ARC significantly outperforms current state-of-the-art controllers in its
ability to steer the user on a collision-free path. We also show through
quantitative and qualitative measures of performance that our controller is
robust in complex environments with many obstacles. Our method is applicable to
arbitrary environments and operates without any user input or parameter
tweaking, aside from the layout of the environments. We have implemented our
algorithm on the Oculus Quest head-mounted display and evaluated its
performance in environments with varying complexity. Our project website is
available at https://gamma.umd.edu/arc/
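The alignment idea can be illustrated with a toy controller that compares obstacle proximity in the two environments and steers to reduce the mismatch. This is a sketch under invented assumptions (the left/right sampling, proportional gain, and clamp are not ARC's published formulation):

```python
def alignment_redirection(phys_left, phys_right, virt_left, virt_right,
                          gain=5.0, max_rot_deg_s=15.0):
    """Toy alignment-style controller (not the published ARC algorithm):
    compare distances to the nearest obstacle on the user's left and
    right in the physical and virtual rooms, then return a rotation
    rate (deg/s) that steers toward matching proximities, clamped to
    an assumed perceptually comfortable limit."""
    # Positive mismatch: relatively more free space on the physical left
    # than on the virtual left, so steer the physical path leftward.
    mismatch = (phys_left - virt_left) - (phys_right - virt_right)
    return max(-max_rot_deg_s, min(max_rot_deg_s, gain * mismatch))
```

When the two rooms already agree, the mismatch is zero and no redirection is applied; the clamp keeps the injected rotation below a noticeability threshold regardless of how large the disagreement grows.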
Multimodality in VR: A Survey
Virtual reality has the potential to change the way we create and consume content in our everyday life. Entertainment, training, design and manufacturing, communication, and advertising are all applications that already benefit from this new medium reaching consumer level. VR is inherently different from traditional media: it offers a more immersive experience, and has the ability to elicit a sense of presence through the place and plausibility illusions. It also gives the user unprecedented capabilities to explore their environment, in contrast with traditional media. In VR, as in the real world, users integrate the multimodal sensory information they receive to create a unified perception of the virtual world. Therefore, the sensory cues that are available in a virtual environment can be leveraged to enhance the final experience. This may include increasing realism or the sense of presence; predicting or guiding the attention of the user through the experience; or increasing their performance if the experience involves the completion of certain tasks. In this state-of-the-art report, we survey the body of work addressing multimodality in virtual reality, and its role and benefits in the final user experience. The works reviewed here thus encompass several fields of research, including computer graphics, human-computer interaction, and psychology and perception. Additionally, we give an overview of different applications that leverage multimodal input in areas such as medicine, training and education, and entertainment; we include works in which the integration of multiple sensory information yields significant improvements, demonstrating how multimodality can play a fundamental role in the way VR systems are designed, and VR experiences created and consumed.
Multimodality in VR: A survey
Virtual reality (VR) is rapidly growing, with the potential to change the way we create and consume content. In VR, users integrate the multimodal sensory information they receive to create a unified perception of the virtual world. In this survey, we review the body of work addressing multimodality in VR, and its role and benefits in user experience, together with different applications that leverage multimodality in many disciplines. These works thus encompass several fields of research, and demonstrate that multimodality plays a fundamental role in VR: enhancing the experience, improving overall performance, and yielding unprecedented abilities in skill and knowledge transfer.