
    Holdable Haptic Device for 4-DOF Motion Guidance

    Hand-held haptic devices can allow for greater freedom of motion and larger workspaces than traditional grounded haptic devices. They can also provide more compelling haptic sensations to the users' fingertips than many wearable haptic devices because reaction forces can be distributed over a larger area of skin far away from the stimulation site. This paper presents a hand-held kinesthetic gripper that provides guidance cues in four degrees of freedom (DOF). 2-DOF tangential forces on the thumb and index finger combine to create cues to translate or rotate the hand. We demonstrate the device's capabilities in a three-part user study. First, users moved their hands in response to haptic cues before receiving instruction or training. Then, they trained on cues in eight directions in a forced-choice task. Finally, they repeated the first part, now knowing what each cue intended to convey. Users were able to discriminate each cue over 90% of the time. Users moved correctly in response to the guidance cues both before and after the training and indicated that the cues were easy to follow. The results show promise for holdable kinesthetic devices in haptic feedback and guidance for applications such as virtual reality, medical training, and teleoperation. Comment: Submitted to IEEE World Haptics Conference 201
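    The abstract does not spell out how a 4-DOF guidance command is decomposed into the two 2-DOF fingerpad forces; the minimal sketch below assumes one plausible mapping (same-direction forces on both pads for translation, opposing forces for rotation), with hypothetical translate_xy, rotate_xy, and gain parameters.

```python
import numpy as np

def guidance_to_finger_forces(translate_xy, rotate_xy, gain=1.0):
    """Map a 4-DOF guidance cue onto 2-DOF tangential forces on the thumb and index pads.

    Assumed convention: same-direction forces on both pads cue a translation of
    the hand; opposing forces cue a rotation. Axis conventions and the gain are
    illustrative assumptions, not taken from the paper.
    """
    t = np.asarray(translate_xy, dtype=float)
    r = np.asarray(rotate_xy, dtype=float)
    thumb_force = gain * (t + r)   # rotation component adds on the thumb pad
    index_force = gain * (t - r)   # ...and subtracts on the index pad
    return thumb_force, index_force

# Example: a pure "translate right" cue pushes both pads the same way.
print(guidance_to_finger_forces([1.0, 0.0], [0.0, 0.0]))
```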

    Using immersive audio and vibration to enhance remote diagnosis of mechanical failure in uncrewed vessels.

    There is increasing interest in the maritime industry in the potential use of uncrewed vessels to improve the efficiency and safety of maritime operations. This raises a number of questions relating to the maintenance and repair of mechanical systems, in particular critical propulsion systems, whose failure could endanger the vessel. While control data is commonly monitored remotely, engineers on board a ship also employ a wide variety of sensory feedback, such as sound and vibration, to diagnose the condition of systems, and these cues are often not replicated in remote monitoring. To assess the potential for enhancing remote monitoring and diagnosis, this project simulated an engine room (ER) based on a real vessel in Unreal Engine 4 for the HTC Vive™ VR headset. Audio was recorded from the vessel, with mechanical faults synthesized to create a range of simulated failures. To simulate operational requirements, the system was remotely fed data from an external server. The system allowed users to view normal control room data, listen to the overall sound of the space presented spatially over loudspeakers, isolate the sound of particular machinery components, and feel the vibration of machinery through a body-worn vibration transducer. Users could scroll through a 10-hour time history of system performance, including audio, vibration and data for snapshots at hourly intervals. Seven experienced marine engineers were asked to assess several scenarios for potential faults in different elements of the ER. They were assessed both quantitatively, regarding correct fault identification, and qualitatively, to gauge their perception of the usability of the system. Users were able to diagnose simulated mechanical failures with a high degree of accuracy, mainly utilising audio and vibration stimuli, and reported specifically that the immersive audio and vibration improved realism and increased their ability to diagnose system failures from a remote location.
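    The abstract does not describe the data model behind the hourly snapshots; the sketch below assumes a hypothetical Snapshot record holding control data plus references to the recorded audio and vibration clips, and a History class that returns the snapshot nearest a requested hour.

```python
from bisect import bisect_left
from dataclasses import dataclass

@dataclass
class Snapshot:
    hour: int             # offset into the 10-hour history
    control_data: dict    # e.g. {"shaft_rpm": 512} -- illustrative keys only
    audio_clip: str       # reference to the spatial audio recording for this hour
    vibration_clip: str   # reference to the vibration signal for this hour

class History:
    """Scrollable history of engine-room snapshots, indexed by hour."""
    def __init__(self, snapshots):
        self._snapshots = sorted(snapshots, key=lambda s: s.hour)
        self._hours = [s.hour for s in self._snapshots]

    def at(self, hour):
        """Return the stored snapshot closest to the requested hour."""
        i = bisect_left(self._hours, hour)
        candidates = self._snapshots[max(i - 1, 0):i + 1]
        return min(candidates, key=lambda s: abs(s.hour - hour))

# Example: a 10-hour history with one snapshot per hour.
history = History([Snapshot(h, {}, f"audio_{h}.wav", f"vib_{h}.dat") for h in range(10)])
print(history.at(7).audio_clip)   # -> "audio_7.wav"
```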

    A Review of Smart Materials in Tactile Actuators for Information Delivery

    As the largest organ in the human body, the skin provides an important sensory channel through which humans receive external stimuli via touch. From the information perceived through touch, people can infer the properties of objects, such as weight, temperature, texture, and motion. These properties reach the brain as nerve stimuli registered by different kinds of receptors in the skin. Mechanical, electrical, and thermal stimuli can excite these receptors and cause different information to be conveyed through the nerves. Technologies for actuators that provide mechanical, electrical or thermal stimuli have been developed; these include static or vibrational actuation, electrostatic stimulation, focused ultrasound, and more. Smart materials, such as piezoelectric materials, carbon nanotubes, and shape memory alloys, play important roles in providing actuation for tactile sensation. This paper reviews the background biological knowledge of human tactile sensing, to give an understanding of how we sense and interact with the world through touch, as well as the conventional and state-of-the-art technologies of tactile actuators for tactile feedback delivery.

    A Review of Non-Invasive Haptic Feedback Stimulation Techniques for Upper Extremity Prostheses

    A sense of touch is essential for amputees to reintegrate into their social and work lives. The next generation of prostheses will be designed to effectively convey tactile information between the artificial limb and the amputee. This work reviews non-invasive haptic feedback stimulation techniques for conveying tactile information from the prosthetic hand to the amputee's brain. The various types of actuators that have been used in previous studies to stimulate the patient's residual limb for different types of artificial prostheses are reviewed in terms of functionality, effectiveness, wearability and comfort. Non-invasive hybrid feedback stimulation systems were found to achieve higher stimulus identification rates among users of haptic prostheses. It can be concluded that integrating a hybrid haptic feedback stimulation system with an upper limb prosthesis improves its acceptance among users.

    Effects of Haptic Feedback on the Wrist during Virtual Manipulation

    As an alternative to thimble devices for the fingertips, we investigate haptic systems that apply stimulus to the user's forearm. Our aim is to provide effective interaction with virtual objects, despite the lack of co-location of virtual and real-world contacts, while taking advantage of relatively large skin area and ease of mounting on the forearm. We developed prototype wearable haptic devices that provide skin deformation in the normal and shear directions, and performed a user study to determine the effects of haptic feedback in different directions and at different locations near the wrist during virtual manipulation. Participants performed significantly better while discriminating stiffness values of virtual objects with normal forces compared to shear forces. We found no differences in performance or participant preferences with regard to stimulus on the dorsal, ventral, or both sides of the forearm. Comment: 7 pages, submitted conference paper for IEEE Haptics Symposium 202
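    The abstract does not specify how a virtual contact is converted into a command for the forearm-worn device; the sketch below assumes a simple spring-style rendering (force proportional to object stiffness times fingertip penetration, clipped to an assumed device limit), with a hypothetical choice between a normal and a shear actuation axis.

```python
def contact_force_command(stiffness_n_per_mm, penetration_mm, direction="normal",
                          max_force_n=5.0):
    """Render a virtual contact as a force command on the forearm-worn device.

    Simple spring model: force = stiffness * penetration, clipped to an assumed
    device limit. The axes chosen for 'normal' vs. 'shear' and the 5 N limit are
    illustrative assumptions, not values from the paper.
    """
    force = min(stiffness_n_per_mm * penetration_mm, max_force_n)
    axis = (0.0, 0.0, 1.0) if direction == "normal" else (1.0, 0.0, 0.0)
    return tuple(force * a for a in axis)

# A stiffer virtual object yields a larger force for the same penetration depth.
print(contact_force_command(0.8, 2.0, direction="normal"))  # -> (0.0, 0.0, 1.6)
print(contact_force_command(0.8, 2.0, direction="shear"))   # -> (1.6, 0.0, 0.0)
```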

    Effects of Haptic Feedback on the Wrist during Virtual Manipulation

    We propose a haptic system for virtual manipulation that provides feedback on the user's forearm instead of the fingertips. In addition to visual rendering of the manipulation with virtual fingertips, we employ a device that delivers normal or shear skin-stretch at multiple points near the wrist. To understand how design parameters influence the experience, we investigated the effect of the number and location of feedback points on stiffness perception. Participants compared stiffness values of virtual objects while the haptic bracelet provided interaction feedback on the dorsal, ventral, or both sides of the wrist. Stiffness discrimination judgments and task duration, as well as qualitative results collected verbally, indicate no significant difference in stiffness perception with stimulation at different or multiple locations. Comment: 2 pages, work-in-progress paper for the Haptics Symposium 202

    HapticHead - Augmenting Reality via Tactile Cues

    Information overload is increasingly becoming a challenge in today's world. Humans have only a limited amount of attention to allocate between sensory channels and tend to miss or misjudge critical sensory information when multiple activities are going on at the same time. For example, people may miss the sound of an approaching car when walking across the street while looking at their smartphones. Some sensory channels may also be impaired due to congenital or acquired conditions. Among sensory channels, touch is often experienced as obtrusive, especially when it occurs unexpectedly. Since tactile actuators can simulate touch, targeted tactile stimuli can provide users of virtual reality and augmented reality environments with important information for navigation, guidance, alerts, and notifications. In this dissertation, a tactile user interface around the head, called HapticHead, is presented to relieve or replace a potentially impaired visual channel. It is a high-resolution, omnidirectional, vibrotactile display that presents general, 3D directional, and distance information through dynamic tactile patterns. The head is well suited for tactile feedback because it is sensitive to mechanical stimuli and provides a large spherical surface area that enables the display of precise 3D information and allows the user to intuitively rotate the head in the direction of a stimulus based on natural mapping. Basic research on tactile perception on the head and studies on various use cases of head-based tactile feedback are presented in this thesis. Several investigations and user studies have been conducted on (a) the funneling illusion and localization accuracy of tactile stimuli around the head, (b) the ability of people to discriminate between different tactile patterns on the head, (c) approaches to designing tactile patterns for complex arrays of actuators, (d) increasing the immersion and presence level of virtual reality applications, and (e) assisting people with visual impairments in guidance and micro-navigation. In summary, tactile feedback around the head was found to be highly valuable as an additional information channel in various application scenarios. Most notable is the navigation of visually impaired individuals through a micro-navigation obstacle course, where the system was an order of magnitude more accurate than the previous state of the art, which used a tactile belt as the feedback modality. The HapticHead tactile user interface's ability to safely navigate people with visual impairments around obstacles and on stairs, with a mean deviation from the optimal path of less than 6 cm, may ultimately improve the quality of life for many people with visual impairments.
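    The abstract does not give the actuator-selection logic; the sketch below assumes a hypothetical spherical tactor layout, picks the actuator best aligned with a 3D target direction, and scales vibration intensity inversely with target distance. It snaps to a single tactor and does not model the funneling illusion studied in the thesis.

```python
import numpy as np

# Hypothetical tactor layout: unit vectors from the head centre toward each actuator.
TACTOR_DIRECTIONS = np.array([
    [1, 0, 0], [-1, 0, 0], [0, 1, 0], [0, -1, 0], [0, 0, 1],
    [0.7, 0.7, 0.0], [-0.7, 0.7, 0.0], [0.7, -0.7, 0.0],
], dtype=float)
TACTOR_DIRECTIONS /= np.linalg.norm(TACTOR_DIRECTIONS, axis=1, keepdims=True)

def directional_cue(target_direction, target_distance_m, max_distance_m=10.0):
    """Select the best-aligned tactor and an intensity that falls off with distance."""
    d = np.asarray(target_direction, dtype=float)
    d /= np.linalg.norm(d)
    tactor_index = int(np.argmax(TACTOR_DIRECTIONS @ d))   # most closely aligned tactor
    intensity = 1.0 - min(target_distance_m, max_distance_m) / max_distance_m
    return tactor_index, intensity

# Example: a target ahead and slightly to the right, 2 m away.
print(directional_cue([0.9, 0.3, 0.0], 2.0))   # -> (0, 0.8)
```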