2,198 research outputs found

    MetaSpace II: Object and full-body tracking for interaction and navigation in social VR

    MetaSpace II (MS2) is a social Virtual Reality (VR) system where multiple users can not only see and hear but also interact with each other, grasp and manipulate objects, walk around in space, and receive tactile feedback. MS2 allows walking in physical space by tracking each user's skeleton in real time, and allows users to feel virtual objects by employing passive haptics, i.e., when users touch or manipulate an object in the virtual world, they simultaneously touch or manipulate a corresponding object in the physical world. To enable these elements in VR, MS2 creates a correspondence in spatial layout and object placement by building the virtual world on top of a 3D scan of the real world. Through this association between the real and virtual worlds, users are able to walk freely while wearing a head-mounted device, avoid obstacles such as walls and furniture, and interact with people and objects. Most current VR environments are designed for a single-user experience in which interactions with virtual objects are mediated by hand-held input devices or hand gestures, and users are shown only a representation of their hands floating in front of the camera from a first-person perspective. We believe that representing each user as a full-body avatar controlled by the person's natural movements in the real world (see Figure 1d) can greatly enhance believability and a user's sense of immersion in VR. Comment: 10 pages, 9 figures. Video: http://living.media.mit.edu/projects/metaspace-ii
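
    The correspondence described above (a virtual world built on top of a 3D scan of the real world, driven by real-time skeleton tracking) amounts to mapping tracked positions from the room's tracking frame into the virtual scene frame. The sketch below is a minimal illustration of that idea, not the MetaSpace II implementation; the calibration values and function names are assumptions.

```python
# Illustrative sketch (not the MetaSpace II codebase): map tracked positions
# from the physical room frame into the virtual scene frame so that touching
# a real object coincides with touching its virtual counterpart.
import numpy as np

# Assumed calibration: rotation + translation aligning the room's tracking
# frame with the frame of the 3D scan the virtual world was built on.
R_room_to_scene = np.eye(3)                  # placeholder rotation
t_room_to_scene = np.array([0.0, 0.0, 0.0])  # placeholder translation (meters)

def room_to_scene(p_room):
    """Transform a tracked point (e.g. a skeleton joint) into scene coordinates."""
    return R_room_to_scene @ np.asarray(p_room) + t_room_to_scene

def update_avatar(joint_positions_room):
    """Drive a full-body avatar by transforming every tracked joint the same way."""
    return {name: room_to_scene(p) for name, p in joint_positions_room.items()}

if __name__ == "__main__":
    joints = {"head": [0.1, 1.7, 0.4], "right_hand": [0.5, 1.1, 0.6]}
    print(update_avatar(joints))
```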

    Perceiving Mass in Mixed Reality through Pseudo-Haptic Rendering of Newton's Third Law

    In mixed reality, real objects can be used to interact with virtual objects. However, unlike in the real world, real objects encounter no opposing reaction force when pushing against virtual objects. The lack of reaction force during manipulation prevents users from perceiving the mass of virtual objects. Although this could be addressed by equipping real objects with force-feedback devices, such a solution remains complex and impractical. In this work, we present a technique to produce an illusion of mass without any active force-feedback mechanism. This is achieved by simulating the effects of this reaction force in a purely visual way. A first study demonstrates that our technique indeed allows users to differentiate light virtual objects from heavy virtual objects. In addition, it shows that the illusion is immediately effective, with no prior training. In a second study, we measure the smallest mass difference, the just-noticeable difference (JND), that can be perceived with this technique. The effectiveness and ease of implementation of our solution provide an opportunity to enhance mixed reality interaction at no additional cost.
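
    The abstract does not spell out the rendering rule, but pseudo-haptic mass illusions are commonly produced by attenuating the visual displacement of the pushed virtual object in proportion to its simulated mass (a control/display ratio). The sketch below illustrates that general idea under assumed parameters; it is not the authors' implementation.

```python
# Hypothetical pseudo-haptic sketch: attenuate the visual motion of a virtual
# object as a function of its simulated mass, so that pushing it with a real
# object "feels" heavier without any force feedback.
def pseudo_haptic_displacement(hand_displacement_m, virtual_mass_kg,
                               reference_mass_kg=1.0):
    """Return the displacement shown for the virtual object.

    A heavier virtual object moves less than the real hand/prop that pushes it,
    visually simulating the missing reaction force (Newton's third law).
    """
    cd_ratio = reference_mass_kg / max(virtual_mass_kg, 1e-6)  # control/display ratio
    return hand_displacement_m * min(cd_ratio, 1.0)

# Example: a 10 cm push moves a 4 kg virtual box only 2.5 cm on screen.
print(pseudo_haptic_displacement(0.10, 4.0))  # -> 0.025
```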

    Substitutional reality: using the physical environment to design virtual reality experiences

    Experiencing Virtual Reality in domestic and other uncontrolled settings is challenging due to the presence of physical objects and furniture that are not usually defined in the Virtual Environment. To address this challenge, we explore the concept of Substitutional Reality in the context of Virtual Reality: a class of Virtual Environments where every physical object surrounding a user is paired, with some degree of discrepancy, with a virtual counterpart. We present a model of potential substitutions and validate it in two user studies. In the first study, we investigated factors that affect participants' suspension of disbelief and ease of use. We systematically altered the virtual representation of a physical object and recorded responses from 20 participants. The second study investigated users' levels of engagement as the physical proxy for a virtual object was varied. From the results, we derive a set of guidelines for the design of future Substitutional Reality experiences.
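
    A core ingredient of Substitutional Reality is deciding which virtual counterpart to pair with each physical object while tolerating some discrepancy. The toy sketch below shows one way such a pairing could be scored by size mismatch; the objects, dimensions, and greedy assignment are invented for illustration and are not taken from the paper.

```python
# Toy sketch (not from the paper): pair each physical object with the virtual
# proxy whose size it best matches, tolerating some discrepancy as in
# Substitutional Reality.
from dataclasses import dataclass

@dataclass
class Obj:
    name: str
    width: float   # meters
    height: float
    depth: float

def mismatch(physical, virtual):
    # Sum of absolute size differences; lower means a more believable substitution.
    return (abs(physical.width - virtual.width)
            + abs(physical.height - virtual.height)
            + abs(physical.depth - virtual.depth))

def assign_proxies(physical_objects, virtual_objects):
    """Greedily map each physical object to the closest-sized virtual counterpart."""
    return {p.name: min(virtual_objects, key=lambda v: mismatch(p, v)).name
            for p in physical_objects}

physical = [Obj("office chair", 0.6, 1.0, 0.6), Obj("mug", 0.08, 0.10, 0.08)]
virtual = [Obj("throne", 0.7, 1.2, 0.7), Obj("potion bottle", 0.07, 0.15, 0.07)]
print(assign_proxies(physical, virtual))
```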

    Haptic feedback in teleoperation in Micro- and Nano-Worlds

    Robotic systems have been developed to handle very small objects, but their use remains complex and requires lengthy training. Simulators, such as molecular simulators, can provide access to large amounts of raw data, but only highly trained users can interpret the results of such systems. Haptic feedback in teleoperation, which provides force feedback to an operator, appears to be a promising solution for interacting with such systems, as it offers intuitiveness and flexibility. However, several issues arise when implementing teleoperation schemes at the micro- and nanoscale, owing to the complex force fields that must be transmitted to users and the scaling differences between the haptic device and the manipulated objects. Major advances in this technology have been made in recent years. This chapter reviews the main systems in this area and highlights how some fundamental issues in teleoperation for micro- and nano-scale applications have been addressed. The chapter considers three types of teleoperation: (1) direct (manipulation of real objects); (2) virtual (use of simulators); and (3) augmented (combining real robotic systems and simulators). Remaining issues that must be addressed for further advances in teleoperation for micro- and nano-worlds are also discussed, including: (1) understanding the phenomena that dictate the behavior of very small objects (< 500 micrometers); and (2) designing intuitive 3-D manipulation systems. Design guidelines for realizing an intuitive haptic feedback teleoperation system at the micro- and nanoscale are proposed.
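
    One of the fundamental issues named above, the scale gap between the haptic device and the manipulated objects, is typically handled in direct teleoperation by scaling positions down and forces up between the master (the haptic handle) and the slave (the micro/nano tool). The gains below are illustrative assumptions, not values from the chapter.

```python
# Illustrative macro-to-micro teleoperation coupling (not from the chapter):
# the operator's haptic-device motion is scaled down to drive the micro-tool,
# and the measured micro-scale forces are scaled up before being fed back.
ALPHA_POS = 1e-5   # 1 cm at the haptic handle -> 100 nm at the tool (assumed gain)
ALPHA_FORCE = 1e6  # 1 microNewton at the tool -> 1 N at the handle (assumed gain)

def master_to_slave_position(master_displacement_m):
    """Scale the operator's motion down to the micro/nano workspace."""
    return ALPHA_POS * master_displacement_m

def slave_to_master_force(measured_force_n):
    """Scale the measured interaction force up so the operator can feel it."""
    return ALPHA_FORCE * measured_force_n

# Example: a 2 cm hand motion commands a 200 nm tool motion, and a 3 microNewton
# adhesion force is rendered as 3 N at the haptic device.
print(master_to_slave_position(0.02))   # -> 2e-07 (meters)
print(slave_to_master_force(3e-6))      # -> 3.0 (newtons)
```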

    An aesthetics of touch: investigating the language of design relating to form

    How well can designers communicate qualities of touch? This paper presents evidence that they have some capability to do so, much of which appears to have been learned, but that at present they make limited use of such language. Interviews with graduate designer-makers suggest that they are aware of and value the importance of touch and materiality in their work, but lack a vocabulary that matches the detail of their explanations of other aspects, such as their intent or selection of materials. We believe that more attention should be paid to the verbal dialogue that happens in the design process, particularly as other researchers have shown that even making-based learning has a strong verbal element. However, verbal language alone does not appear to be adequate for a comprehensive language of touch. Graduate designer-makers' descriptive practices combined non-verbal manipulation with verbal accounts. We thus argue that haptic vocabularies do not simply describe material qualities, but rather are situated competences that physically demonstrate the presence of haptic qualities. Such competences are more important than verbal vocabularies in isolation. Design support for developing and extending haptic competences must take this wide range of considerations into account to comprehensively improve designers' capabilities.

    VR Haptics at Home: Repurposing Everyday Objects and Environment for Casual and On-Demand VR Haptic Experiences

    This paper introduces VR Haptics at Home, a method of repurposing everyday objects in the home to provide casual and on-demand haptic experiences. Current VR haptic devices are often expensive, complex, and unreliable, which limits the opportunities for rich haptic experiences outside research labs. In contrast, we envision that, by repurposing everyday objects as passive haptic props, we can create engaging VR experiences for casual use with minimal cost and setup. To explore and evaluate this idea, we conducted an in-the-wild study with eight participants, in which they used our proof-of-concept system to turn surrounding objects in their own homes, such as chairs, tables, and pillows, into haptic props. The study results show that our method can be adapted to different homes and environments, enabling more engaging VR experiences without a complex setup process. Based on our findings, we propose a possible design space to showcase the potential for future investigation. Comment: CHI 2023 Late-Breaking Work
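
    A minimal way to picture the setup described above is a registration step that records where a household object sits and which virtual prop should be co-located with it. The sketch below is a hypothetical illustration of that flow, not the authors' system; all labels, props, and poses are invented.

```python
# Hypothetical registration flow (not the paper's system): the user tags an
# everyday object in their room, and the VR scene spawns a virtual prop at the
# tagged pose so that touching the prop means touching the real object.
registered_props = {}  # object label -> (position, virtual prop name)

def register_prop(label, tracked_position, virtual_prop):
    """Record where a household object sits and which virtual prop it backs."""
    registered_props[label] = (tuple(tracked_position), virtual_prop)

def spawn_scene():
    """Return the virtual props to place, co-located with their real objects."""
    return [{"prop": prop, "position": pos}
            for pos, prop in registered_props.values()]

register_prop("sofa", [1.2, 0.0, -0.5], "spaceship seat")
register_prop("pillow", [0.4, 0.5, 0.8], "boulder")
print(spawn_scene())
```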
