
    Mid-Air Haptics for Control Interfaces

    Control interfaces and interactions based on touch-less gesture tracking devices have become a prevalent research topic in both industry and academia. Touch-less devices offer a unique immediacy of interaction that makes them ideal for applications where direct contact with a physical controller is not desirable. On the other hand, these controllers inherently lack active or passive haptic feedback to inform users about the results of their interaction. Mid-air haptic interfaces, such as those using focused ultrasound waves, can close the feedback loop and provide new tools for the design of touch-less, un-instrumented control interactions. The goal of this workshop is to bring together the growing mid-air haptic research community to identify and discuss future challenges in control interfaces and their application in AR/VR, automotive, music, robotics and teleoperation.
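    To make the focused-ultrasound idea concrete, the sketch below shows the standard phased-array calculation: each transducer is driven with a phase offset that compensates for its distance to a chosen focal point, so all waves arrive in phase and create a localized pressure spot on the skin. The array geometry, frequency, and focus position are illustrative assumptions, not parameters of any specific device discussed at the workshop.

        import numpy as np

        SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 C
        FREQUENCY = 40e3        # Hz; a common choice for airborne ultrasound
        WAVELENGTH = SPEED_OF_SOUND / FREQUENCY

        # Assumed 16x16 transducer grid in the z=0 plane with 10.5 mm pitch.
        pitch = 0.0105
        ij = np.stack(np.meshgrid(np.arange(16), np.arange(16)), axis=-1).reshape(-1, 2)
        transducers = np.column_stack([ij * pitch, np.zeros(len(ij))])

        def focus_phases(focal_point):
            """Per-transducer phase offsets (radians) so all emitted waves
            arrive in phase at focal_point, forming a pressure focus there."""
            distances = np.linalg.norm(transducers - focal_point, axis=1)
            # Advance each emitter by the phase its wave accumulates in transit.
            return (-2.0 * np.pi * distances / WAVELENGTH) % (2.0 * np.pi)

        # Example: place the focal point 15 cm above the centre of the array.
        phases = focus_phases(np.array([0.08, 0.08, 0.15]))

    Moving the focal point over time (e.g., in small circles) is what lets such arrays render shapes and textures on the palm rather than a single static spot.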

    MetaSpace II: Object and full-body tracking for interaction and navigation in social VR

    MetaSpace II (MS2) is a social Virtual Reality (VR) system where multiple users can not only see and hear but also interact with each other, grasp and manipulate objects, walk around in space, and get tactile feedback. MS2 allows walking in physical space by tracking each user's skeleton in real time, and lets users feel by employing passive haptics, i.e., when users touch or manipulate an object in the virtual world, they simultaneously touch or manipulate a corresponding object in the physical world. To enable these elements in VR, MS2 creates a correspondence in spatial layout and object placement by building the virtual world on top of a 3D scan of the real world. Through this association between the real and virtual worlds, users are able to walk freely while wearing a head-mounted device, avoid obstacles like walls and furniture, and interact with people and objects. Most current VR environments are designed for a single-user experience where interactions with virtual objects are mediated by hand-held input devices or hand gestures, and users are only shown a representation of their hands floating in front of the camera as seen from a first-person perspective. We believe that representing each user as a full-body avatar controlled by the natural movements of the person in the real world (see Figure 1d) can greatly enhance believability and a user's sense of immersion in VR.
    Comment: 10 pages, 9 figures. Video: http://living.media.mit.edu/projects/metaspace-ii
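    One plausible way to realize the real-to-virtual correspondence MS2 describes (the abstract does not publish an implementation, so this is a hedged sketch with invented anchor points) is a rigid least-squares (Kabsch) alignment between a few matched landmarks in the 3D scan and their counterparts in the virtual scene; the same transform then maps tracked body joints into VR.

        import numpy as np

        def rigid_align(scan_pts, virtual_pts):
            """Return rotation R and translation t mapping scan coordinates
            into virtual-world coordinates (least-squares rigid fit)."""
            sc, vc = scan_pts.mean(0), virtual_pts.mean(0)
            H = (scan_pts - sc).T @ (virtual_pts - vc)
            U, _, Vt = np.linalg.svd(H)
            d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflection
            R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
            return R, vc - R @ sc

        # Hypothetical example: three matched corners of a scanned table (metres).
        scan = np.array([[0.0, 0.0, 0.7], [1.2, 0.0, 0.7], [1.2, 0.6, 0.7]])
        virt = scan + np.array([2.0, -1.0, 0.0])  # table placed elsewhere in VR
        R, t = rigid_align(scan, virt)

        tracked_hand = np.array([0.5, 0.3, 0.9])  # from the skeleton tracker
        hand_in_vr = R @ tracked_hand + t         # drives the avatar's hand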

    Egocentric Reference Frame Bias In The Palmar Haptic Perception Of Surface Orientation

    The effect of egocentric reference frames on palmar haptic perception of orientation was investigated at vertically separated locations in a sagittal plane. Reference stimuli to be haptically matched were presented either haptically (to the contralateral hand) or visually. As in prior investigations of haptic orientation perception, a strong egocentric bias was found, such that haptic orientation matches made in the lower part of personal space were set much lower (i.e., the surfaces were perceived as being oriented higher) than those made at eye level. The same haptic bias was observed both when the reference surface to be matched was presented visually and when bimanual matching was used. These findings support the conclusion that, despite the presence of an unambiguous allocentric (gravitational) reference frame in vertical planes, haptic orientation perception in the sagittal plane reflects an egocentric bias.
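    As a worked illustration of how such a bias is typically quantified (the numbers below are invented for illustration, not the study's data), the constant error is the mean signed difference between matched and reference orientations at each presentation height:

        # Hypothetical reference surface slant, in degrees.
        reference_deg = 30.0

        # Hypothetical orientation matches (degrees) at two heights.
        matches = {
            "eye_level":   [31.0, 29.5, 30.5, 30.0],
            "lower_space": [24.0, 25.5, 23.0, 24.5],
        }

        for height, vals in matches.items():
            bias = sum(v - reference_deg for v in vals) / len(vals)
            print(f"{height}: constant error = {bias:+.1f} deg")
        # A negative error in lower space means matches were set lower than
        # the reference, i.e. the surface was perceived as oriented higher.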

    Prevalence of haptic feedback in robot-mediated surgery: a systematic review of literature

    © 2017 Springer-Verlag. This is a post-peer-review, pre-copyedit version of an article published in Journal of Robotic Surgery. The final authenticated version is available online at: https://doi.org/10.1007/s11701-017-0763-4
    With the successful uptake and inclusion of robotic systems in minimally invasive surgery, and with the increasing application of robotic surgery (RS) in numerous surgical specialities worldwide, there is now a need to develop and enhance the technology further. One such improvement is the implementation of haptic feedback technology in RS, which would permit the operating surgeon on the console to receive haptic information about the type of tissue being operated on. The main advantage is that it allows the operating surgeon to feel and control the amount of force applied to different tissues during surgery, thus minimising the risk of tissue damage due to both the direct and indirect effects of excessive tissue force or tension being applied during RS. We performed a two-rater systematic review to identify the latest developments and potential avenues for improving the application and implementation of haptic feedback technology for the operating surgeon on the console during RS. This review provides a summary of technological enhancements in RS, considering different stages of work, from proof of concept to cadaver tissue testing, surgery in animals, and finally real implementation in surgical practice. We identify that, at the time of this review, while there is unanimous agreement on the need for haptic and tactile feedback, there are no solutions or products available that address this need. There is scope and a need for new developments in haptic augmentation for robot-mediated surgery, with the aim of further improving patient care and robotic surgical technology.
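    The control idea the review motivates can be sketched in a few lines: reflect the measured tool-tissue force back to the console handle while capping the commanded force per tissue type. This is a hypothetical illustration only; the function, scale factor, and thresholds are invented placeholders, not clinical values or the behaviour of any reviewed system.

        # Invented per-tissue force caps (newtons) and feedback scaling.
        TISSUE_FORCE_LIMIT_N = {"liver": 2.0, "vessel": 0.8, "fascia": 4.0}
        FEEDBACK_SCALE = 0.5  # fraction of measured force rendered at console

        def console_update(measured_force_n, commanded_force_n, tissue):
            """Return (haptic force rendered at the master handle,
            force actually applied by the slave instrument)."""
            limit = TISSUE_FORCE_LIMIT_N[tissue]
            applied = min(commanded_force_n, limit)   # hard cap protects tissue
            rendered = FEEDBACK_SCALE * measured_force_n
            return rendered, applied

        # The surgeon commands 3.0 N on a vessel; only 0.8 N is applied,
        # and the handle renders half of the measured contact force.
        rendered, applied = console_update(1.5, 3.0, "vessel")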

    Multi-touch 3D Exploratory Analysis of Ocean Flow Models

    Modern ocean flow simulations are generating increasingly complex, multi-layer 3D ocean flow models. However, most researchers still use traditional 2D visualizations, viewing these models one slice at a time. Properly designed 3D visualization tools can be highly effective for revealing the complex, dynamic flow patterns and structures present in these models. However, the transition from visualizing ocean flow patterns in 2D to 3D presents many challenges, including occlusion and depth ambiguity. Further complications arise from the interaction methods required to navigate, explore, and interact with these 3D datasets. We present a system that employs a combination of stereoscopic rendering, to best reveal and illustrate 3D structures and patterns, and multi-touch interaction, to allow for natural and efficient navigation and manipulation within the 3D environment. Exploratory visual analysis is facilitated through a highly interactive toolset that leverages a smart particle system. Multi-touch gestures allow users to quickly position dye-emitting tools within the 3D model. Finally, we illustrate the potential applications of our system through examples of real-world significance.
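    At its core, a dye-emitting tool is particle advection: seed particles at the tool position set by a gesture and integrate them through the velocity field. The sketch below uses an analytic toy field as a stand-in for an ocean-model grid; none of it is taken from the described system.

        import numpy as np

        def velocity(p):
            """Toy rotational flow standing in for a sampled ocean field."""
            x, y, z = p
            return np.array([-y, x, 0.1 * np.sin(x)])

        def advect(seed, dt=0.05, steps=200):
            """Trace one dye particle with midpoint (RK2) integration."""
            path = [np.asarray(seed, dtype=float)]
            for _ in range(steps):
                p = path[-1]
                mid = p + 0.5 * dt * velocity(p)   # half-step sample point
                path.append(p + dt * velocity(mid))
            return np.array(path)

        # Dye tool positioned by a multi-touch gesture at (1, 0, 0), say.
        streamline = advect([1.0, 0.0, 0.0])

    Emitting many such particles per frame and fading them over time yields the familiar dye-streak visual that makes 3D flow structure legible in stereo.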

    I’m sensing in the rain: spatial incongruity in visual-tactile mid-air stimulation can elicit ownership in VR users

    Major virtual reality (VR) companies are trying to enhance the sense of immersion in virtual environments by implementing haptic feedback in their systems (e.g., Oculus Touch). It is known that tactile stimulation adds realism to a virtual environment. In addition, when users are not encumbered by wearing any attachments (e.g., gloves), it is possible to create even more immersive experiences. Mid-air haptic technology provides contactless haptic feedback and offers the potential for creating such immersive VR experiences. However, one of the limitations of mid-air haptics is its reliance on freehand tracking systems (e.g., Leap Motion) to deliver tactile feedback to the user's hand. These tracking systems are not accurate, limiting designers' ability to deliver spatially precise tactile stimulation. Here, we investigated an alternative way of conveying incongruent visual-tactile stimulation that can be used to create the illusion of a congruent visual-tactile experience, while participants experience the phenomenon of the rubber hand illusion in VR.
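    A minimal sketch of such a spatial-incongruity manipulation (hypothetical; the offset parameter and geometry are invented, not the study's protocol): the visual "raindrop" contact point is deliberately displaced in the palm plane while the ultrasound focal point stays on the tracked hand.

        import numpy as np

        rng = np.random.default_rng(0)

        def trial(tracked_palm, offset_cm):
            """Return (visual contact point, tactile focal point) for one
            trial with a controlled visuo-tactile spatial mismatch."""
            direction = rng.standard_normal(2)
            direction /= np.linalg.norm(direction)      # random in-plane direction
            visual = tracked_palm.copy()
            visual[:2] += (offset_cm / 100.0) * direction  # shift the visual cue
            tactile = tracked_palm.copy()               # focus stays on the hand
            return visual, tactile

        # Assumed palm pose 20 cm above the array, with a 2 cm mismatch.
        palm = np.array([0.0, 0.0, 0.20])
        visual_pt, tactile_pt = trial(palm, offset_cm=2.0)

    Varying offset_cm across trials would let one probe how much incongruity the illusion of a congruent touch can absorb.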