MetaSpace II: Object and full-body tracking for interaction and navigation in social VR
MetaSpace II (MS2) is a social Virtual Reality (VR) system where multiple
users can not only see and hear but also interact with each other, grasp and
manipulate objects, walk around in space, and get tactile feedback. MS2 allows
walking in physical space by tracking each user's skeleton in real-time and
allows users to feel by employing passive haptics, i.e., when users touch or
manipulate an object in the virtual world, they simultaneously also touch or
manipulate a corresponding object in the physical world. To enable these
elements in VR, MS2 creates a correspondence in spatial layout and object
placement by building the virtual world on top of a 3D scan of the real world.
Through the association between the real and virtual world, users are able to
walk freely while wearing a head-mounted device, avoid obstacles like walls and
furniture, and interact with people and objects. Most current virtual reality
(VR) environments are designed for a single user experience where interactions
with virtual objects are mediated by hand-held input devices or hand gestures.
Additionally, users are typically shown only a representation of their hands in VR,
floating in view from a first-person perspective. We believe that representing
each user as a full-body avatar that is controlled by the natural movements of
the person in the real world (see Figure 1d) can greatly enhance believability
and a user's sense of immersion in VR.
Comment: 10 pages, 9 figures. Video: http://living.media.mit.edu/projects/metaspace-ii
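The real-to-virtual correspondence described above amounts to applying a calibration transform to every tracked position so that virtual objects overlay their physical counterparts. A minimal pure-Python sketch, assuming a simple yaw-rotation-plus-offset calibration model (the function and variable names are hypothetical; the abstract does not describe MS2's actual calibration procedure):

```python
import math

def make_calibration(theta, offset):
    """Return a function mapping tracker-space (x, y, z) points into
    scan/virtual-space coordinates via a yaw rotation plus translation.
    The single-yaw-plus-offset model is an illustrative assumption."""
    c, s = math.cos(theta), math.sin(theta)
    def to_virtual(p):
        x, y, z = p
        # Rotate about the vertical (y) axis, then translate.
        return (c * x - s * z + offset[0],
                y + offset[1],
                s * x + c * z + offset[2])
    return to_virtual

# Example: align the tracker origin with a corner of the scanned room.
to_virtual = make_calibration(math.pi / 2, (1.0, 0.0, 2.0))
hand_tracker = (0.5, 1.2, 0.0)   # hypothetical tracked joint position
hand_virtual = to_virtual(hand_tracker)
```

In practice the same transform would be applied every frame to all skeleton joints, so that touching a scanned physical object also touches its virtual counterpart.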
Levitating Particle Displays with Interactive Voxels
Levitating objects can be used as the primitives in a new type of display. We present levitating particle displays and show how research into object levitation is enabling a new way of presenting and interacting with information. We identify novel properties of levitating particle displays and give examples of the interaction techniques and applications they allow. We then discuss design challenges for these displays, potential solutions, and promising areas for future research.
Designing 3D scenarios and interaction tasks for immersive environments
Today, immersive technologies such as virtual and mixed reality are among the
most attractive research fields. Virtual Reality (VR) has huge potential
in scientific and educational domains by providing users with real-time
interaction and manipulation. The key concept in immersive technologies is to provide a
high level of immersive sensation to the user, which is one of the main challenges in
this field. Wearable technologies play a key role in enhancing the immersive sensation
and the degree of embodiment in virtual and mixed reality interaction tasks.
This project report presents an application study where the user interacts with
virtual objects, such as grabbing objects and opening or closing doors and drawers, while
wearing a sensory cyberglove developed in our lab (Cyberglove-HT). Furthermore, it presents
the development of a methodology that provides inertial measurement unit (IMU)-based
gesture recognition.
The interaction tasks and 3D immersive scenarios were designed in Unity 3D.
Additionally, we developed inertial sensor-based gesture recognition by employing
a Long Short-Term Memory (LSTM) network. In order to distinguish the effect of
wearable technologies on the user experience in immersive environments, we made an
experimental study comparing the Cyberglove-HT to standard VR controllers (HTC
Vive Controller). The quantitative and subjective results indicate that we were able
to enhance the immersive sensation and self-embodiment with the Cyberglove-HT. A
publication resulted from this work [1], which has been developed in the framework
of the R&D project Human Tracking and Perception in Dynamic Immersive Rooms
(HTPDI).
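As a rough illustration of the recognition approach, the forward pass of a single LSTM cell over a sequence of IMU samples can be sketched in pure Python. The dimensions, weight layout, and function names below are illustrative assumptions, not the Cyberglove-HT implementation (which would typically use a deep-learning framework):

```python
import math

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

def lstm_step(x, h, c, W, b):
    """One LSTM cell step over the concatenated [h, x] vector.
    W holds one weight row per gate unit, ordered as input, forget,
    cell-candidate, and output gates; b holds the matching biases."""
    z = h + x                        # concatenated recurrent state + IMU sample
    n = len(h)
    gates = [sum(w * v for w, v in zip(row, z)) + bias
             for row, bias in zip(W, b)]
    i = [sigmoid(g) for g in gates[0:n]]           # input gate
    f = [sigmoid(g) for g in gates[n:2 * n]]       # forget gate
    g_ = [math.tanh(g) for g in gates[2 * n:3 * n]]  # cell candidate
    o = [sigmoid(g) for g in gates[3 * n:4 * n]]   # output gate
    c_new = [fj * cj + ij * gj for fj, cj, ij, gj in zip(f, c, i, g_)]
    h_new = [oj * math.tanh(cj) for oj, cj in zip(o, c_new)]
    return h_new, c_new

def encode_sequence(sequence, W, b, hidden=2):
    """Run the cell over a gesture's IMU samples; the final hidden state
    would then feed a dense + softmax layer that maps to gesture labels."""
    h = [0.0] * hidden
    c = [0.0] * hidden
    for sample in sequence:
        h, c = lstm_step(sample, h, c, W, b)
    return h
```

A trained network of this shape (with learned weights rather than the hand-built ones here) is what turns a window of accelerometer/gyroscope readings into a gesture class.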
Too Hot to Handle: An Evaluation of the Effect of Thermal Visual Representation on User Grasping Interaction in Virtual Reality
The influence of interaction fidelity and rendering quality on perceived user experience has been largely explored in Virtual Reality (VR). However, differences in interaction choices triggered by these rendering cues have not yet been explored. We present a study analysing the effect of thermal visual cues and contextual information on 50 participants' approach to grasping and moving a virtual mug. The study comprises 3 different temperature cues (baseline empty, hot, and cold) and 4 contextual representations, all embedded in a VR scenario. We evaluate 2 different hand representations (abstract and human) to assess grasp metrics. Results show that temperature cues influenced grasp location: the mug handle was predominantly grasped, with a smaller grasp aperture, in the hot condition, while the body and top were preferred in the baseline and cold conditions.
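For context, grasp aperture is commonly measured as the Euclidean distance between the thumb and index fingertips of the tracked hand. A minimal sketch assuming that common definition (the paper's exact metric is not specified in the abstract):

```python
import math

def grasp_aperture(thumb_tip, index_tip):
    """Euclidean distance between thumb and index fingertips,
    a common proxy for grasp aperture (an assumed definition here)."""
    return math.dist(thumb_tip, index_tip)

def mean_aperture(samples):
    """Average aperture over a trial's (thumb_tip, index_tip) samples."""
    return sum(grasp_aperture(t, i) for t, i in samples) / len(samples)
```

Per-condition comparisons (e.g. hot vs. cold) would then aggregate this metric across participants and trials.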