11,522 research outputs found

    Haptic feedback in mixed-reality environment

    The training process in industries is assisted with computer solutions to reduce costs. Normally, computer systems created to simulate assembly or machine manipulation are implemented with traditional human-computer interfaces (keyboard, mouse, etc.). However, this usually leads to systems that are far from the real procedures, and thus not efficient in terms of training. Two techniques could improve this situation: mixed reality and haptic feedback. In this paper we investigate the integration of both within a single framework. We present the hardware used to design our training system. A feasibility study allowed us to establish a testing protocol. The results of these tests convinced us that such a system should not try to realistically simulate the interaction between real and virtual objects as if they were all real objects.

    Innovative mixed reality advanced manufacturing environment with haptic feedback

    Indiana University-Purdue University Indianapolis (IUPUI). In immersive eLearning environments, incorporating haptic feedback has been demonstrated to improve the software's pedagogical effectiveness. Due to this and recent advancements in virtual reality (VR) and mixed reality (MR) environments, more immersive, authentic, and viable pedagogical tools have been created. However, the advanced manufacturing industry has not fully embraced mixed reality training tools, and there is currently a need for effective haptic feedback techniques in advanced manufacturing environments. The MR-AVML, a proposed CNC milling machine training tool, is designed to include two forms of haptic feedback, thereby providing users with a natural and intuitive experience. This experience is achieved by tasking users with running a virtual machine seen through the Microsoft HoloLens and interacting with a physical representation of the machine controller. A pedagogical study on the environment found that the MR-AVML was 6.06% more effective than a version of the environment with no haptic feedback, and only 1.35% less effective than hands-on training led by an instructor. This shows that including haptic feedback in an advanced manufacturing training environment can improve pedagogical effectiveness.

    Haptic Feedback Relocation from the Fingertips to the Wrist for Two-Finger Manipulation in Virtual Reality

    Relocation of haptic feedback from the fingertips to the wrist has been considered as a way to enable haptic interaction with mixed-reality virtual environments while leaving the fingers free for other tasks. We present a pair of wrist-worn tactile haptic devices and a virtual environment to study how various mappings between fingers and tactors affect task performance. The haptic feedback rendered to the wrist reflects the interaction forces occurring between a virtual object and virtual avatars controlled by the index finger and thumb. We performed a user study comparing four different finger-to-tactor haptic feedback mappings and one no-feedback condition as a control. We evaluated users' ability to perform a simple pick-and-place task via the metrics of task completion time, path length of the fingers and virtual cube, and magnitudes of normal and shear forces at the fingertips. We found that multiple mappings were effective, with a greater impact when visual cues were limited. We discuss the limitations of our approach and describe next steps toward multi-degree-of-freedom haptic rendering for wrist-worn devices to improve task performance in virtual environments. Comment: 6 pages, 9 figures, 1 table; accepted to the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2022.
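    The core idea of the abstract above, relocating fingertip contact forces to wrist-worn tactors, can be sketched as a simple mapping from per-finger forces to tactor drive amplitudes. This is a minimal illustrative sketch, not the paper's implementation: the function names, the linear force-to-amplitude law, and the tactor labels are all assumptions.

```python
# Hypothetical sketch of finger-to-tactor feedback relocation.
# The linear mapping and the 5 N saturation force are illustrative assumptions.

def force_to_amplitude(force_n, max_force_n=5.0):
    """Linearly map a fingertip contact force (N) to a tactor amplitude in [0, 1]."""
    return max(0.0, min(1.0, force_n / max_force_n))

def render_wrist_feedback(finger_forces, finger_to_tactor):
    """finger_forces: {'index': N, 'thumb': N}; finger_to_tactor: finger -> tactor id.

    Returns a drive-amplitude command per tactor on the wrist band.
    """
    commands = {}
    for finger, force in finger_forces.items():
        commands[finger_to_tactor[finger]] = force_to_amplitude(force)
    return commands

# Example: index finger mapped to a dorsal tactor, thumb to a volar tactor.
cmds = render_wrist_feedback({'index': 2.5, 'thumb': 5.0},
                             {'index': 'dorsal', 'thumb': 'volar'})
# cmds == {'dorsal': 0.5, 'volar': 1.0}
```

    The paper's user study compares several such finger-to-tactor assignments; swapping the `finger_to_tactor` dictionary is all it takes to express a different mapping condition in this sketch.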

    Design and Development of a Multimodal Vest for Virtual Immersion and Guidance

    This paper focuses on the development of a haptic vest that enhances immersion and realism in virtual environments through vibrotactile feedback. The first steps toward touch-based communication are presented in order to establish an actuation method based on vibration motors. The resulting vibrotactile patterns help users move inside virtual reality (VR). The research investigates human torso resolution and perception of vibration patterns, evaluating different kinds of actuators at different locations on the vest. Finally, determining an appropriate distribution of vibration patterns allowed the generation of sensations that, for instance, help guide the user in a mixed- or virtual-reality environment.
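    A simple way to picture vibrotactile guidance on such a vest is to place motors around the torso and pulse the one closest to the direction the user should turn. The following sketch assumes an eight-motor ring at 45-degree spacing; the layout and selection rule are illustrative, not taken from the paper.

```python
import math

# Hypothetical sketch: pick which vest motor to pulse for a target heading.
# Eight motors evenly spaced around the torso (0 = front, clockwise) is an
# illustrative assumption about the vest layout.
MOTORS = {i: math.radians(i * 45) for i in range(8)}

def motor_for_heading(target_rad):
    """Return the id of the motor angularly closest to the target heading."""
    def angular_dist(a, b):
        d = abs(a - b) % (2 * math.pi)
        return min(d, 2 * math.pi - d)  # wrap-around distance on the circle
    return min(MOTORS, key=lambda m: angular_dist(MOTORS[m], target_rad))

# A target 95 degrees to the user's right selects the motor at 90 degrees.
selected = motor_for_heading(math.radians(95))  # motor 2
```

    The paper's perception study effectively determines how finely such a ring can be spaced: if users cannot discriminate neighboring motors, fewer, farther-apart actuators convey direction just as well.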

    Facilitating serendipitous communication, interaction and collaborative flow in mixed realities while working remote.

    Working remotely can often lead to fatigue and stress, which in turn reduce productivity. The author explores the concept that proper communication and environments that promote serendipitous meetings and effective collaboration can result in happier workers. The author also explores how speculative design can improve remote work and provide a vision for future work scenarios. This concept focuses on the development of interaction design for use with mixed realities, 3D capture systems, and mid-air haptic technology to create a seamless and engaging remote working environment. This paper shows how work culture and communication can affect mental health and productivity; why people isolate while working remotely; how interaction and proper feedback are important to work flow; and the importance of meeting face to face while collaborating. The author proposes a possible solution and application for capturing user interaction data to provide a seamless virtual work environment. This system would utilize 3D capture, mid-air haptic control, and mixed reality technology to allow remote workers to fulfill work-related tasks while also maintaining healthy communication and mental health. The final outcome is a personal haptics console and an augmented reality (AR) headset with an integrated camera system.

    Perceiving Mass in Mixed Reality through Pseudo-Haptic Rendering of Newton's Third Law

    In mixed reality, real objects can be used to interact with virtual objects. However, unlike in the real world, real objects do not encounter any opposite reaction force when pushing against virtual objects. The lack of reaction force during manipulation prevents users from perceiving the mass of virtual objects. Although this could be addressed by equipping real objects with force-feedback devices, such a solution remains complex and impractical. In this work, we present a technique to produce an illusion of mass without any active force-feedback mechanism. This is achieved by simulating the effects of this reaction force in a purely visual way. A first study demonstrates that our technique indeed allows users to differentiate light virtual objects from heavy virtual objects. In addition, it shows that the illusion is immediately effective, with no prior training. In a second study, we measure the lowest mass difference, i.e. the just-noticeable difference (JND), that can be perceived with this technique. The effectiveness and ease of implementation of our solution provide an opportunity to enhance mixed reality interaction at no additional cost.
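    Pseudo-haptic mass illusions of this kind are commonly built on a control-display ratio: for the same real hand or prop motion, a heavier virtual object is displayed moving less, which users interpret as resistance. The sketch below shows that idea under an assumed inverse-mass scaling law; the actual rendering in the paper may differ.

```python
# Hypothetical sketch of a pseudo-haptic mass illusion via a control-display
# ratio. The inverse-mass scaling law and reference mass are illustrative
# assumptions, not the paper's published model.

def displayed_displacement(real_displacement_m, virtual_mass_kg,
                           reference_mass_kg=1.0):
    """Scale the on-screen motion of a pushed virtual object by its mass.

    Heavier objects get a smaller display gain, so the same real push
    produces less visual motion and the object "feels" more massive.
    """
    cd_ratio = reference_mass_kg / virtual_mass_kg  # display/control gain
    return real_displacement_m * cd_ratio

# Pushing a real prop 10 cm moves a 1 kg virtual object 10 cm on screen,
# but a 2 kg virtual object only 5 cm.
light = displayed_displacement(0.10, 1.0)
heavy = displayed_displacement(0.10, 2.0)
```

    The JND measured in the paper's second study corresponds, in this sketch, to the smallest gain difference between two objects that users can reliably tell apart.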

    Prop-Based Haptic Interaction with Co-location and Immersion: an Automotive Application

    Most research on 3D user interfaces aims at providing only a single sensory modality. One challenge is to integrate several sensory modalities into a seamless system while preserving each modality's immersion and performance factors. This paper concerns manipulation tasks and proposes a visuo-haptic system integrating immersive visualization with co-located force and tactile feedback. An industrial application is presented.

    MetaSpace II: Object and full-body tracking for interaction and navigation in social VR

    MetaSpace II (MS2) is a social Virtual Reality (VR) system where multiple users can not only see and hear but also interact with each other, grasp and manipulate objects, walk around in space, and get tactile feedback. MS2 allows walking in physical space by tracking each user's skeleton in real time, and allows users to feel by employing passive haptics: when users touch or manipulate an object in the virtual world, they simultaneously touch or manipulate a corresponding object in the physical world. To enable these elements in VR, MS2 creates a correspondence in spatial layout and object placement by building the virtual world on top of a 3D scan of the real world. Through this association between the real and virtual worlds, users are able to walk freely while wearing a head-mounted device, avoid obstacles like walls and furniture, and interact with people and objects. Most current VR environments are designed for a single-user experience where interactions with virtual objects are mediated by hand-held input devices or hand gestures, and users are only shown a representation of their hands floating in front of the camera as seen from a first-person perspective. We believe that representing each user as a full-body avatar controlled by the natural movements of the person in the real world (see Figure 1d) can greatly enhance believability and a user's sense of immersion in VR. Comment: 10 pages, 9 figures. Video: http://living.media.mit.edu/projects/metaspace-ii