MetaSpace II: Object and full-body tracking for interaction and navigation in social VR
MetaSpace II (MS2) is a social Virtual Reality (VR) system where multiple
users can not only see and hear but also interact with each other, grasp and
manipulate objects, walk around in space, and get tactile feedback. MS2 allows
walking in physical space by tracking each user's skeleton in real-time and
allows users to feel by employing passive haptics, i.e., when users touch or
manipulate an object in the virtual world, they simultaneously also touch or
manipulate a corresponding object in the physical world. To enable these
elements in VR, MS2 creates a correspondence in spatial layout and object
placement by building the virtual world on top of a 3D scan of the real world.
Through the association between the real and virtual world, users are able to
walk freely while wearing a head-mounted device, avoid obstacles like walls and
furniture, and interact with people and objects. Most current VR environments
are designed for a single-user experience in which interactions with virtual
objects are mediated by hand-held input devices or hand gestures.
Additionally, users are typically shown only a representation of their hands in
VR, floating in front of the camera as seen from a first-person perspective. We
believe that representing each user as a full-body avatar controlled by the
natural movements of the person in the real world (see Figure 1d) can greatly
enhance believability and a user's sense of immersion in VR.
Comment: 10 pages, 9 figures. Video: http://living.media.mit.edu/projects/metaspace-ii
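The real-to-virtual correspondence described above can be sketched as a single rigid registration between the tracked physical space and the virtual world built from its 3D scan. The function names and the calibration transform below are illustrative assumptions, not details from the paper:

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def to_virtual(T_phys_to_virt, p_physical):
    """Map a tracked physical point (e.g. a prop or a skeleton joint) into
    virtual-world coordinates so its virtual counterpart overlays it."""
    p = np.append(p_physical, 1.0)          # homogeneous coordinates
    return (T_phys_to_virt @ p)[:3]

# Illustrative calibration: identity rotation, virtual room shifted 2 m along x.
T = make_transform(np.eye(3), np.array([2.0, 0.0, 0.0]))
virt_point = to_virtual(T, np.array([0.5, 1.0, 0.3]))  # maps to (2.5, 1.0, 0.3)
```

With such a registration in place, touching the physical object and touching its virtual counterpart coincide, which is exactly what passive haptics requires.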
Advancing proxy-based haptic feedback in virtual reality
This thesis advances haptic feedback for Virtual Reality (VR). Our work is guided by Sutherland's 1965 vision of the ultimate display, which calls for VR systems to control the existence of matter. To push toward this vision, we build upon proxy-based haptic feedback, a technique characterized by the use of passive tangible props. The goal of this thesis is to tackle the central drawback of this approach, namely its inflexibility, which still prevents it from fulfilling the vision of the ultimate display. Guided by four research questions, we first showcase the applicability of proxy-based VR haptics by employing the technique for data exploration. We then extend the VR system's control over users' haptic impressions in three steps. First, we contribute the class of Dynamic Passive Haptic Feedback (DPHF) alongside two novel concepts for conveying kinesthetic properties, such as virtual weight and shape, through weight-shifting and drag-changing proxies. Conceptually orthogonal to this, we study how visual-haptic illusions can be leveraged to unnoticeably redirect the user's hand when reaching toward props. Here, we contribute a novel perception-inspired algorithm for Body Warping-based Hand Redirection (HR), an open-source framework for HR, and psychophysical insights. The thesis concludes by proving that the combination of DPHF and HR can outperform the individual techniques in terms of the achievable flexibility of proxy-based haptic feedback.
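As a hedged illustration of the weight-shifting idea behind DPHF: moving an internal mass along a handheld proxy changes the torque felt at the grip, and thus the perceived heaviness. The simple lever-arm model and function names below are assumptions for illustration, not the thesis's actual hardware model:

```python
G = 9.81  # gravitational acceleration, m/s^2

def grip_torque(mass_kg, offset_m, g=G):
    """Torque (N*m) about the grip produced by a mass at distance offset_m."""
    return mass_kg * g * offset_m

def mass_position_for_torque(target_torque, mass_kg, g=G):
    """Invert the model: where to shift the mass to render a desired torque."""
    return target_torque / (mass_kg * g)

# A 200 g mass shifted 10 cm from the grip renders ~0.196 N*m of torque;
# shifting it further makes the same proxy feel heavier without adding mass.
tau = grip_torque(0.2, 0.1)
pos = mass_position_for_torque(tau, 0.2)  # recovers 0.1 m
```

The appeal of this scheme is that one unchanged physical prop can render a continuum of perceived weights, which is the flexibility the thesis pursues.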
ARC: Alignment-based Redirection Controller for Redirected Walking in Complex Environments
We present a novel redirected walking controller based on alignment that
allows the user to explore large and complex virtual environments, while
minimizing the number of collisions with obstacles in the physical environment.
Our alignment-based redirection controller, ARC, steers the user such that
their proximity to obstacles in the physical environment matches the proximity
to obstacles in the virtual environment as closely as possible. To quantify a
controller's performance in complex environments, we introduce a new metric,
Complexity Ratio (CR), to measure the relative environment complexity and
characterize the difference in navigational complexity between the physical and
virtual environments. Through extensive simulation-based experiments, we show
that ARC significantly outperforms current state-of-the-art controllers in its
ability to steer the user on a collision-free path. We also show through
quantitative and qualitative measures of performance that our controller is
robust in complex environments with many obstacles. Our method is applicable to
arbitrary environments and operates without any user input or parameter
tweaking, aside from the layout of the environments. We have implemented our
algorithm on the Oculus Quest head-mounted display and evaluated its
performance in environments with varying complexity. Our project website is
available at https://gamma.umd.edu/arc/
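The alignment objective can be sketched as minimizing the mismatch between the user's proximity to the nearest physical obstacle and the nearest virtual obstacle. The toy model below treats obstacles as points for brevity; it illustrates the alignment idea only and is not ARC's actual controller:

```python
import numpy as np

def nearest_dist(pos, obstacles):
    """Distance from pos to the closest obstacle (modeled as points here)."""
    return min(np.linalg.norm(pos - o) for o in obstacles)

def alignment_error(p_phys, phys_obstacles, p_virt, virt_obstacles):
    """Proximity mismatch; an alignment-based controller steers the user
    (via redirection gains) so this value is driven toward zero."""
    return nearest_dist(p_phys, phys_obstacles) - nearest_dist(p_virt, virt_obstacles)

# User is 1 m from a physical wall but 3 m from the nearest virtual obstacle:
err = alignment_error(
    np.array([0.0, 0.0]), [np.array([1.0, 0.0])],
    np.array([0.0, 0.0]), [np.array([3.0, 0.0])],
)  # negative error: physically closer to danger than the virtual scene suggests
```

When the error is large and negative, the controller should redirect the user away from the physical obstacle; when it is near zero, the two environments are aligned and little redirection is needed.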
Evaluating Remapped Physical Reach for Hand Interactions with Passive Haptics in Virtual Reality
Virtual reality applications often use motion tracking to incorporate physical hand movements into interaction techniques for selecting and manipulating virtual objects. To increase realism and allow direct hand interaction, real-world physical objects can be aligned with virtual objects to provide tactile feedback and physical grasping. However, unless a physical space is custom-configured to match a specific virtual reality experience, the ability to perfectly match physical and virtual objects is limited. Our research addresses this challenge by studying methods that allow one physical object to be mapped to multiple virtual objects that can exist at different virtual locations in an egocentric reference frame. We study two such techniques: one that introduces a static translational offset between the virtual and physical hand before a hand reach, and one that dynamically interpolates the position of the virtual hand during a reaching motion. We conducted two controlled experiments to assess how the two techniques affect reaching effectiveness, comfort, and the ability to adapt to the remapping when reaching for objects with different types of mismatch between physical and virtual locations. In addition, we present a case study demonstrating how the hand remapping techniques could be used in an immersive game to support realistic hand interaction while optimizing usability. With our results, we discuss future considerations for implementing passive haptics with remapping techniques and provide guidelines for effective implementation.
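The two remapping techniques can be sketched as follows. The function names and the linear interpolation schedule are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def static_offset_hand(p_phys_hand, offset):
    """Static technique: a constant translational offset between the virtual
    and physical hand, applied before (and throughout) the reach."""
    return p_phys_hand + offset

def interpolated_hand(p_phys_hand, p_start, p_phys_obj, p_virt_obj):
    """Dynamic technique: blend in the physical-to-virtual mismatch in
    proportion to reach progress, so the virtual hand arrives at the virtual
    object exactly when the physical hand reaches the physical prop."""
    total = np.linalg.norm(p_phys_obj - p_start)
    progress = np.clip(np.linalg.norm(p_phys_hand - p_start) / total, 0.0, 1.0)
    return p_phys_hand + progress * (p_virt_obj - p_phys_obj)

# Halfway through a reach toward a prop at (1,0,0) standing in for a virtual
# object at (1,0.2,0), half the mismatch has been blended in:
mid = interpolated_hand(
    np.array([0.5, 0.0, 0.0]), np.array([0.0, 0.0, 0.0]),
    np.array([1.0, 0.0, 0.0]), np.array([1.0, 0.2, 0.0]),
)  # (0.5, 0.1, 0.0)
```

The dynamic variant spreads the visual-physical discrepancy over the whole motion, which is why it can remain unnoticed where a static jump might not.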
Move or Push? Studying Pseudo-Haptic Perceptions Obtained with Motion or Force Input
Pseudo-haptic techniques are an interesting alternative for generating haptic
perceptions: they manipulate haptic perception by appropriately altering
primarily visual feedback in response to body movements. However, the use of
pseudo-haptic techniques with a motion-input system can sometimes be limited.
This paper investigates a novel approach for extending the potential of
pseudo-haptic techniques in virtual reality (VR). The proposed approach
utilizes the reaction force from force input as a substitute haptic cue for
pseudo-haptic perception. The paper introduces a manipulation method in which
the vertical acceleration of the virtual hand is controlled by how far a force
sensor is pushed in. Such force-input manipulation of a virtual body can not
only present pseudo-haptics within less physical space and be used by a wider
range of users, including people with physical disabilities, but can also
present a reaction force proportional to the user's input. We hypothesized
that this haptic force cue would contribute to the pseudo-haptic perception.
The paper therefore investigates force-input pseudo-haptic perception in
comparison with motion-input pseudo-haptics, comparing the two manipulations
in terms of the achievable range and resolution of pseudo-haptic weight. The
experimental results suggest that force-input manipulation extends the range
of perceptible pseudo-weight by 80% compared to motion-input manipulation. On
the other hand, motion-input manipulation offers one more distinguishable
weight level and is easier to operate than force-input manipulation.
Comment: This paper is under review for IEEE Transactions on Visualization
and Computer Graphics
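The force-input manipulation described above can be sketched as a per-frame integration in which the push-in force on the sensor drives the virtual hand's vertical acceleration. The gain and frame rate below are illustrative values, not parameters from the paper:

```python
def step(force_n, vel, pos, gain=0.5, dt=1.0 / 90.0):
    """Advance the virtual hand one frame: acceleration is proportional to
    the push-in force, then integrated into velocity and vertical position."""
    acc = gain * force_n          # m/s^2, proportional to the sensed force
    vel = vel + acc * dt
    pos = pos + vel * dt
    return vel, pos

# One second (90 frames) of pressing the sensor at a constant 2 N lifts the
# virtual hand roughly half a meter under these illustrative settings.
vel, pos = 0.0, 0.0
for _ in range(90):
    vel, pos = step(2.0, vel, pos)
```

Because the control signal is force rather than motion, the same input device also returns a reaction force proportional to the user's effort, which is the haptic cue the paper hypothesizes to support pseudo-haptic weight perception.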