2,105 research outputs found

    MetaSpace II: Object and full-body tracking for interaction and navigation in social VR

    MetaSpace II (MS2) is a social Virtual Reality (VR) system where multiple users can not only see and hear but also interact with each other, grasp and manipulate objects, walk around in space, and get tactile feedback. MS2 allows walking in physical space by tracking each user's skeleton in real time, and allows users to feel by employing passive haptics, i.e., when users touch or manipulate an object in the virtual world, they simultaneously touch or manipulate a corresponding object in the physical world. To enable these elements in VR, MS2 creates a correspondence in spatial layout and object placement by building the virtual world on top of a 3D scan of the real world. Through this association between the real and virtual worlds, users are able to walk freely while wearing a head-mounted device, avoid obstacles like walls and furniture, and interact with people and objects. Most current virtual reality (VR) environments are designed for a single-user experience where interactions with virtual objects are mediated by hand-held input devices or hand gestures. Additionally, users are only shown a representation of their hands in VR, floating in front of the camera as seen from a first-person perspective. We believe that representing each user as a full-body avatar that is controlled by natural movements of the person in the real world (see Figure 1d) can greatly enhance believability and a user's sense of immersion in VR.

    Comment: 10 pages, 9 figures. Video: http://living.media.mit.edu/projects/metaspace-ii
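    The passive-haptics correspondence described above can be sketched as a single rigid transform between room and scene coordinates, so that a tracked physical point maps directly onto its virtual counterpart. A minimal sketch (function names are illustrative, not from the paper):

```python
# Minimal sketch of a room-to-virtual calibration: a rigid transform
# (rotation about the vertical axis plus translation) applied to every
# tracked physical point, so touching a real prop coincides with
# touching its virtual twin. Names are hypothetical.
import numpy as np

def make_room_to_virtual(rotation_deg: float, offset):
    """Build a 4x4 homogeneous transform from room to virtual coordinates."""
    t = np.radians(rotation_deg)
    R = np.array([[np.cos(t), -np.sin(t), 0.0],
                  [np.sin(t),  np.cos(t), 0.0],
                  [0.0,        0.0,       1.0]])
    T = np.eye(4)
    T[:3, :3] = R       # rotation about the vertical (z) axis
    T[:3, 3] = offset   # translation between the two origins
    return T

def to_virtual(T, p):
    """Map a tracked physical point (x, y, z) into the virtual scene."""
    p_h = np.append(np.asarray(p, dtype=float), 1.0)  # homogeneous coords
    return (T @ p_h)[:3]
```

    Because the same transform is applied to the 3D scan and to the live skeleton data, physical and virtual contact stay aligned by construction.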

    Prop-Based Haptic Interaction with Co-location and Immersion: an Automotive Application

    Most research on 3D user interfaces aims at providing only a single sensory modality. One challenge is to integrate several sensory modalities into a seamless system while preserving each modality's immersion and performance factors. This paper concerns manipulation tasks and proposes a visuo-haptic system integrating immersive visualization, force and tactile feedback with co-location. An industrial application is presented.

    On the Collaboration of an Automatic Path-Planner and a Human User for Path-Finding in Virtual Industrial Scenes

    This paper describes a global interactive framework enabling an automatic path-planner and a user to collaborate in finding a path in cluttered virtual environments. First, a collaborative architecture including the user and the planner is described. Then, for real-time purposes, a motion planner divided into several steps is presented: a preliminary workspace discretization is performed without time limitations at the beginning of the simulation; then, using these pre-computed data, a second algorithm finds a collision-free path in real time. Once the path is found, a haptic artificial guidance along the path is provided to the user. The user can then influence the planner by departing from the path, which automatically triggers a new path search. Performance is measured on tests based on assembly simulation in CAD scenes.
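    The two-phase scheme described above can be sketched in miniature: an offline discretization of the workspace, an online search over the pre-computed grid, and a replan trigger when the user strays from the guided path. This is an illustrative sketch, not the paper's implementation (which operates in 3D CAD scenes with haptic guidance):

```python
# Illustrative two-phase planner on a 2D grid: discretize() is the
# offline step, find_path() the real-time query (breadth-first search),
# and needs_replan() the user-deviation trigger. All names are
# hypothetical simplifications of the approach described above.
from collections import deque

def discretize(obstacles, width, height):
    """Offline step: mark blocked cells of a grid workspace."""
    return [[(x, y) in obstacles for x in range(width)] for y in range(height)]

def find_path(blocked, start, goal):
    """Online step: BFS for a shortest collision-free path, or None."""
    h, w = len(blocked), len(blocked[0])
    prev = {start: None}
    q = deque([start])
    while q:
        x, y = q.popleft()
        if (x, y) == goal:                      # reconstruct the path
            path, node = [], goal
            while node is not None:
                path.append(node)
                node = prev[node]
            return path[::-1]
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < w and 0 <= ny < h and not blocked[ny][nx] \
                    and (nx, ny) not in prev:
                prev[(nx, ny)] = (x, y)
                q.append((nx, ny))
    return None

def needs_replan(path, user_pos, tolerance=1):
    """Replan when the user's Manhattan distance to the path exceeds tolerance."""
    ux, uy = user_pos
    return min(abs(ux - x) + abs(uy - y) for x, y in path) > tolerance
```

    Splitting the expensive discretization from the per-query search is what makes the online part cheap enough for interactive use.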

    Evaluation of Presence in Virtual Environments: Haptic Vest and User's Haptic Skills

    This paper presents the integration of a haptic vest with a multimodal virtual environment consisting of video, audio, and haptic feedback, with the main objective of determining how users who interact with the virtual environment benefit from the tactile and thermal stimuli provided by the haptic vest. Experiments are performed using a game application set in a train station after an explosion. Participants have to move inside the environment while receiving several stimuli through the vest, to check whether these stimuli improve presence or realism. This is done by comparing the experimental results with those from similar scenarios obtained without haptic feedback. The experiments are carried out by three groups of participants, classified on the basis of their experience with haptics and virtual reality devices. Some differences among the groups have been found, which can be related to the levels of realism and synchronization of all the elements in the multimodal environment needed to meet expectations and reach the maximum satisfaction level. According to the participants, the system should define two different levels of requirements to comply with the expectations of professional and conventional users.
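    The comparison described above, presence scores for the same scenario with and without the vest, summarized per experience group, can be sketched as follows. The group names and scores below are illustrative placeholders, not data from the paper:

```python
# Hypothetical sketch of a between-condition presence comparison:
# mean presence score with the haptic vest minus the mean without it,
# computed per participant group. All scores are made-up placeholders.
from statistics import mean

def presence_gain(scores_haptic, scores_baseline):
    """Mean presence improvement of the haptic condition over baseline."""
    return mean(scores_haptic) - mean(scores_baseline)

# (haptic-condition scores, baseline scores) per group -- illustrative only
groups = {
    "experienced":  ([6.1, 5.8, 6.4], [5.2, 5.0, 5.5]),
    "intermediate": ([5.5, 5.9, 5.7], [5.1, 5.3, 5.0]),
    "novice":       ([5.0, 5.4, 5.2], [4.9, 5.1, 5.0]),
}
gains = {g: presence_gain(h, b) for g, (h, b) in groups.items()}
```

    Per-group gains of this kind are one simple way to surface the group differences the study reports.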

    A Perspective Review on Integrating VR/AR with Haptics into STEM Education for Multi-Sensory Learning

    As a result of several governments closing educational facilities in reaction to the COVID-19 pandemic in 2020, almost 80% of the world’s students were not in school for several weeks. Schools and universities are thus increasing their efforts to leverage educational resources and provide possibilities for remote learning. A variety of educational programs, platforms, and technologies are now accessible to support student learning; while these tools are important for society, they are primarily concerned with the dissemination of theoretical material. There is a lack of support for hands-on laboratory work and practical experience. This is particularly important for all disciplines related to science, technology, engineering, and mathematics (STEM), where labs and pedagogical assets must be continuously enhanced in order to provide effective study programs. In this study, we describe a unique perspective on achieving multi-sensory learning through the integration of virtual and augmented reality (VR/AR) with haptic wearables in STEM education. We address the implications of a novel viewpoint on established pedagogical notions. We want to encourage worldwide efforts to make fully immersive, open, and remote laboratory learning a reality. This work was supported by the European Union through the Erasmus+ Program under Grant 2020-1-NO01-KA203-076540, project title Integrating Virtual and Augmented Reality with Wearable Technology into Engineering Education (AugmentedWearEdu), https://augmentedwearedu.uia.no/ [34] (accessed on 27 March 2022), and by the Top Research Centre Mechatronics (TRCM), University of Agder (UiA), Norway.
