
    Mid-Air Haptics for Control Interfaces

    Control interfaces and interactions based on touch-less gesture tracking devices have become a prevalent research topic in both industry and academia. Touch-less devices offer a unique immediacy of interaction that makes them ideal for applications where direct contact with a physical controller is not desirable. On the other hand, these controllers inherently lack active or passive haptic feedback to inform users about the results of their interaction. Mid-air haptic interfaces, such as those using focused ultrasound waves, can close the feedback loop and provide new tools for the design of touch-less, un-instrumented control interactions. The goal of this workshop is to bring together the growing mid-air haptic research community to identify and discuss future challenges in control interfaces and their application in AR/VR, automotive, music, robotics, and teleoperation.

    UltraBots: Large-Area Mid-Air Haptics for VR with Robotically Actuated Ultrasound Transducers

    We introduce UltraBots, a system that combines ultrasound haptic feedback and robotic actuation for large-area mid-air haptics in VR. Ultrasound haptics can provide precise mid-air haptic feedback and versatile shape rendering, but the interaction area is often limited by the small size of the ultrasound devices, restricting the possible interactions for VR. To address this problem, this paper introduces a novel approach that combines robotic actuation with ultrasound haptics. More specifically, we attach ultrasound transducer arrays to tabletop mobile robots or robotic arms for scalable, extendable, and translatable interaction areas. We plan to use Sony Toio robots for 2D translation and/or commercially available robotic arms for 3D translation. Using robotic actuation and hand tracking measured by a VR HMD (e.g., Oculus Quest), our system can keep the ultrasound transducers underneath the user's hands to provide on-demand haptics. We demonstrate applications with workspace environments, medical training, education, and entertainment. (Comment: UIST 2022 SI)
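The core control idea described in the abstract — keeping the transducer array underneath the tracked hand — can be sketched as a simple projection-and-clamp step. This is an illustrative sketch only, not code from the paper; the function name, coordinate convention, and workspace bounds are assumptions.

```python
def robot_target(hand_xyz, workspace=((-0.5, 0.5), (-0.5, 0.5))):
    """Map a tracked 3D hand position to a 2D tabletop robot target.

    Projects the hand position onto the table plane (dropping height)
    and clamps it to the robot's reachable workspace, so the ultrasound
    array stays directly beneath the hand. Bounds are illustrative.
    """
    x, y, _z = hand_xyz
    (xmin, xmax), (ymin, ymax) = workspace
    return (min(max(x, xmin), xmax), min(max(y, ymin), ymax))


# A hand inside the workspace maps straight down; one outside is clamped.
inside = robot_target((0.2, -0.1, 0.3))    # → (0.2, -0.1)
clamped = robot_target((1.0, -2.0, 0.5))   # → (0.5, -0.5)
```

In a real system this target would feed a per-frame robot motion command, with the HMD's hand-tracking pose as input.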

    Multi-point STM: Effects of Drawing Speed and Number of Focal Points on Users’ Responses using Ultrasonic Mid-Air Haptics

    Spatiotemporal modulation (STM) is used to render tactile patterns with ultrasound arrays. Previous research only explored the effects of single-point STM parameters, such as drawing speed (Vd). Here we explore the effects of multi-point STM on both perceptual (intensity) and emotional (valence/arousal) responses. This introduces a new control parameter for STM, the number of focal points (Nfp), on top of the conventional STM parameter (Vd). Our results from a study with 30 participants showed a negative effect of Nfp on perceived intensity and arousal, but no significant effects on valence. We also found that the effects of Vd still aligned with prior single-point results, even when different Nfp were used, suggesting that effects observed for single-point STM also apply to multi-point STM. We finally derive recommendations, such as using single-point STM to produce stimuli with higher intensity and/or arousal, or using multi-point STM for a milder and more relaxing (less arousing) experience.
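The two parameters the abstract names, drawing speed (Vd) and number of focal points (Nfp), can be illustrated with a minimal sketch of multi-point STM on a circular path: Nfp focal points equally spaced in phase, swept at Vd. The function, path shape, and default values are assumptions for illustration, not the authors' implementation.

```python
import math

def stm_focal_points(t, vd=7.0, radius=0.02, nfp=2):
    """Focal-point positions at time t for multi-point STM on a circle.

    vd     : drawing speed along the path, in m/s (the Vd parameter)
    radius : circle radius in metres
    nfp    : number of simultaneous focal points (the Nfp parameter),
             spaced equally in phase around the path
    Returns a list of (x, y) positions in metres.
    """
    omega = vd / radius  # angular speed in rad/s for tangential speed vd
    return [
        (radius * math.cos(omega * t + 2 * math.pi * k / nfp),
         radius * math.sin(omega * t + 2 * math.pi * k / nfp))
        for k in range(nfp)
    ]


# At t = 0 with nfp=4, four points sit 90° apart on the circle.
pts = stm_focal_points(0.0, nfp=4)
```

Single-point STM is the nfp=1 case; the study's comparison varies nfp while holding the path and Vd fixed.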

    Multisensory Integration as per Technological Advances: A Review

    Multisensory integration research has allowed us to better understand how humans integrate sensory information to produce a unitary experience of the external world. However, this field is often challenged by the limited ability to deliver and control sensory stimuli, especially when going beyond audio–visual events and outside laboratory settings. In this review, we examine the scope and challenges of new technology in the study of multisensory integration in a world that is increasingly characterized as a fusion of physical and digital/virtual events. We discuss multisensory integration research through the lens of novel multisensory technologies and, thus, bring research in human–computer interaction, experimental psychology, and neuroscience closer together. Today, for instance, displays have become volumetric so that visual content is no longer limited to 2D screens, new haptic devices enable tactile stimulation without physical contact, olfactory interfaces provide users with smells precisely synchronized with events in virtual environments, and novel gustatory interfaces enable taste perception through levitating stimuli. These technological advances offer new ways to control and deliver sensory stimulation for multisensory integration research beyond traditional laboratory settings, and open up new experimentation with naturally occurring events in everyday life. Our review then summarizes these multisensory technologies and discusses initial insights to build a bridge between the disciplines in order to advance the study of multisensory integration.

    Haptic Interfaces for Virtual Reality: Challenges and Research Directions

    The sense of touch (haptics) has been applied in several areas such as tele-haptics, telemedicine, training, education, and entertainment. Today, haptics is used and explored by researchers in many more multi-disciplinary and inter-disciplinary areas. The use of haptics is also enhanced by other forms of media such as audio, video, and even the sense of smell. For example, haptics is prevalent in virtual reality environments, where it increases the immersive experience for users. However, while there has been significant progress in haptic interfaces over the years, many challenges still limit their development. This review highlights haptic interfaces for virtual reality ranging from wearables, handhelds, encountered-type devices, and props to mid-air approaches. We discuss and summarize these approaches, along with interaction domains such as skin receptors, object properties, and force, in order to arrive at design challenges for each interface and identify existing research gaps.

    Assisting Navigation and Object Selection with Vibrotactile Cues

    Our lives have been drastically altered by information technology in recent decades, leading to evolutionary mismatches between human traits and the modern environment. One particular mismatch occurs when visually demanding information technology overloads the perceptual, cognitive, or motor capabilities of the human nervous system. This information overload could be partly alleviated by complementing visual interaction with haptics. The primary aim of this thesis was to investigate how to assist movement control with vibrotactile cues. Vibrotactile cues refer to technology-mediated vibrotactile signals that notify users of perceptual events, propose that users make decisions, and give users feedback on their actions. To explore vibrotactile cues, we carried out five experiments in two contexts of movement control: navigation and object selection. The goal was to find ways to reduce information load in these tasks, thus helping users to accomplish them more effectively. We employed measurements such as reaction times, error rates, and task completion times. We also used subjective rating scales, short interviews, and free-form participant comments to assess the vibrotactile-assisted interactive systems. The findings of this thesis can be summarized as follows. First, if the context of movement control allows the use of both feedback and feedforward cues, feedback cues are a reasonable first option. Second, when using vibrotactile feedforward cues, using low-level abstractions and supporting the interaction with other modalities can keep the information load as low as possible. Third, the temple area is a feasible actuation location for vibrotactile cues in movement control, including navigation cues and object selection cues with head turns. However, the usability of the area depends on contextual factors such as spatial congruency, the actuation device, and the pace of the interaction task.

    Haptics: Science, Technology, Applications

    This open access book constitutes the proceedings of the 12th International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, EuroHaptics 2020, held in Leiden, The Netherlands, in September 2020. The 60 papers presented in this volume were carefully reviewed and selected from 111 submissions. They were organized in topical sections on haptic science, haptic technology, and haptic applications. This year's focus is on accessibility.