MetaSpace II: Object and full-body tracking for interaction and navigation in social VR
MetaSpace II (MS2) is a social Virtual Reality (VR) system where multiple
users can not only see and hear but also interact with each other, grasp and
manipulate objects, walk around in space, and get tactile feedback. MS2 allows
walking in physical space by tracking each user's skeleton in real-time and
allows users to feel by employing passive haptics, i.e., when users touch or
manipulate an object in the virtual world, they simultaneously also touch or
manipulate a corresponding object in the physical world. To enable these
elements in VR, MS2 creates a correspondence in spatial layout and object
placement by building the virtual world on top of a 3D scan of the real world.
Through the association between the real and virtual world, users are able to
walk freely while wearing a head-mounted device, avoid obstacles like walls and
furniture, and interact with people and objects. Most current virtual reality
(VR) environments are designed for a single user experience where interactions
with virtual objects are mediated by hand-held input devices or hand gestures.
Additionally, users are only shown a representation of their hands in VR
floating in front of the camera as seen from a first-person perspective. We
believe that representing each user as a full-body avatar controlled by the
natural movements of the person in the real world (see Figure 1d) can greatly
enhance believability and a user's sense of immersion in VR.
Comment: 10 pages, 9 figures. Video: http://living.media.mit.edu/projects/metaspace-ii
Trends in virtual reality technologies for the learning patient
NextMed convened the Medicine Meets Virtual Reality 22 (MMVR 22) conference in 2016. Since 1992, the conference has brought together a diverse group of researchers to share creative solutions for the evolving challenge of integrating virtual reality tools into medical education. Virtual reality (VR) and its enabling technologies use hardware and software to simulate environments and encounters where users can interact and learn. The MMVR 22 symposium proceedings contain projects that support a variety of learners: medical students, practitioners, soldiers, and patients. This report considers the trends in virtual reality technologies for patients navigating their medical and healthcare learning. The learning patient seeks more than intervention; they seek prevention. From virtual humans and environments to motion sensors and haptic devices, patients are surrounded by increasingly rich and transformative data-driven tools. Applied data enables VR applications to simulate experience, predict health outcomes, and motivate new behavior. The MMVR 22 proceedings present investigations into the usability of wearable devices, the efficacy of avatar inclusion, and the viability of multi-player gaming. With an increasing need for individualized and scalable programming, only committed open source efforts will align instructional designers, technology integrators, trainers, and clinicians.
Haptic Feedback Relocation from the Fingertips to the Wrist for Two-Finger Manipulation in Virtual Reality
Relocation of haptic feedback from the fingertips to the wrist has been
considered as a way to enable haptic interaction with mixed reality virtual
environments while leaving the fingers free for other tasks. We present a pair
of wrist-worn tactile haptic devices and a virtual environment to study how
various mappings between fingers and tactors affect task performance. The
haptic feedback rendered to the wrist reflects the interaction forces occurring
between a virtual object and virtual avatars controlled by the index finger and
thumb. We performed a user study comparing four different finger-to-tactor
haptic feedback mappings and one no-feedback condition as a control. We
evaluated users' ability to perform a simple pick-and-place task via the
metrics of task completion time, path length of the fingers and virtual cube,
and magnitudes of normal and shear forces at the fingertips. We found that
multiple mappings were effective, and there was a greater impact when visual
cues were limited. We discuss the limitations of our approach and describe next
steps toward multi-degree-of-freedom haptic rendering for wrist-worn devices to
improve task performance in virtual environments.
Comment: 6 pages, 9 figures, 1 table, submitted and accepted to the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2022
Communicating with feeling
Communication between users in shared editors takes place in a deprived environment: distributed users find it difficult to communicate. While many solutions to the resulting problems have been suggested, this paper presents a novel one. It describes one possible use of haptics as a channel for communication between users. Users' telepointers are treated as haptic avatars, affording interactions such as haptically pushing and pulling each other. The use of homing forces to locate other users is also discussed, as is a proximity sensation based on viscosity. Evaluation of this system is currently underway.
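The homing force and viscosity sensation described in this abstract can be sketched as a simple force model: a spring term pulling one telepointer toward another, plus a viscous drag opposing the pointer's own velocity. The gain constants `k` and `b` below are hypothetical tuning values, not figures from the paper:

```c
/* 2D telepointer position, velocity, or force. */
typedef struct { double x, y; } Vec2;

/* Homing force pulling one user's telepointer toward another's,
 * plus viscous drag proportional to the pointer's velocity.
 * k (spring gain) and b (viscosity) are hypothetical tuning values. */
Vec2 haptic_force(Vec2 self, Vec2 other, Vec2 velocity,
                  double k, double b) {
    Vec2 f = { k * (other.x - self.x) - b * velocity.x,
               k * (other.y - self.y) - b * velocity.y };
    return f;
}
```

Raising `b` as two pointers approach each other would give the proximity-as-viscosity sensation the abstract mentions.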
Enabling collaboration in virtual reality navigators
In this paper we characterize a feature superset for Collaborative
Virtual Reality Environments (CVRE), and derive a component
framework to transform stand-alone VR navigators into full-fledged
multithreaded collaborative environments. The contributions of our
approach rely on a cost-effective and extensible technique for
loading software components into separate POSIX threads for
rendering, user interaction and network communications, and adding a
top layer for managing session collaboration. The framework recasts
a VR navigator under a distributed peer-to-peer topology for scene
and object sharing, using callback hooks for broadcasting remote
events and multicamera perspective sharing with avatar interaction.
We validate the framework by applying it to our own ALICE VR
Navigator. Experimental results show that our approach has good
performance in the collaborative inspection of complex models.
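The per-concern threading the framework describes — separate POSIX threads for rendering, user interaction and network communications — could be sketched as follows. The component entry points here are hypothetical stand-ins for the framework's loaded software components:

```c
#include <pthread.h>
#include <stddef.h>

/* Hypothetical entry points; in the framework these would be the loaded
 * software components for rendering, user interaction and networking. */
static void *render_loop(void *arg)   { (void)arg; return NULL; }
static void *interact_loop(void *arg) { (void)arg; return NULL; }
static void *network_loop(void *arg)  { (void)arg; return NULL; }

/* Spawn one POSIX thread per concern, wait for all of them, and
 * return the number of threads that started successfully. */
int spawn_components(void) {
    void *(*entry[3])(void *) = { render_loop, interact_loop, network_loop };
    pthread_t t[3];
    int started = 0;
    for (int i = 0; i < 3; i++)
        if (pthread_create(&t[i], NULL, entry[i], NULL) == 0)
            started++;
    for (int i = 0; i < started; i++)
        pthread_join(t[i], NULL);
    return started;
}
```

Isolating each concern in its own thread is what lets a stand-alone navigator keep rendering while the session-collaboration layer handles remote events over the network.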
From ‘hands up’ to ‘hands on’: harnessing the kinaesthetic potential of educational gaming
Traditional approaches to distance learning and the student learning journey have focused on closing the gap between the experience of off-campus students and their on-campus peers. While many initiatives have sought to embed a sense of community, create virtual learning environments and even build collaborative spaces for team-based assessment and presentations, they are limited by technological innovation in terms of the types of learning styles they support and develop. Mainstream gaming developments – such as the Xbox Kinect and Nintendo Wii – have a strong element of kinaesthetic learning, from early attempts to simulate impact, recoil, velocity and other environmental factors to more sophisticated movement-based games which create a sense of almost total immersion and allow untethered (in a technical sense) interaction with the games' objects, characters and other players. Likewise, gamification of learning has become a critical focus for the engagement of learners and for its commercialisation, especially through products such as the Wii Fit.
As this technology matures, there are strong opportunities for universities to use gaming consoles to embed kinaesthetic learning in the student experience – a learning style that has been largely neglected in the distance education sector. This paper explores the potential impact of these technologies and broadly imagines the possibilities for future innovation in higher education.
Beyond cute: exploring user types and design opportunities of virtual reality pet games
Virtual pet games, such as handheld games like Tamagotchi or video games like Petz, provide players with artificial pet companions or entertaining pet-raising simulations. Prior research has found that virtual pets have the potential to promote learning, collaboration, and empathy among users. While virtual reality (VR) has become an increasingly popular game medium, little is known about users' expectations regarding game avatars, gameplay, and environments for VR-enabled pet games. We surveyed 780 respondents in an online survey and interviewed 30 participants to understand users' motivation, preferences, and game behavior in pet games played across various media, and their expectations for VR pet games. Based on our findings, we generated three user types that reflect users' preferences and gameplay styles in VR pet games. We use these types to highlight key design opportunities and recommendations for VR pet games.
Analysis domain model for shared virtual environments
The field of shared virtual environments, which also encompasses online games and social 3D environments, has a system landscape consisting of multiple solutions that share great functional overlap. However, there is little system interoperability between the different solutions. A shared virtual environment has an associated problem domain that is highly complex, raising difficult challenges for the development process, starting with the architectural design of the underlying system. This paper has two main contributions. The first is a broad domain analysis of shared virtual environments, which enables developers to understand the whole rather than only its parts. The second is a reference domain model for discussing and describing solutions: the Analysis Domain Model.