50 research outputs found
An Overview of Wearable Haptic Technologies and Their Performance in Virtual Object Exploration.
We often interact with our environment through manual handling of objects and exploration of their properties. Object properties (OP), such as texture, stiffness, size, shape, temperature, weight, and orientation, provide the information necessary to perform interactions successfully. The human haptic perception system plays a key role in this. As virtual reality (VR) has been a growing field of interest with many applications, adding haptic feedback to virtual experiences is another step towards more realistic virtual interactions. However, integrating haptics in a realistic manner requires complex technological solutions and actual user testing in virtual environments (VEs) for verification. This review provides a comprehensive overview of recent wearable haptic devices (HDs), categorized by the OP exploration for which they have been verified in a VE. We found 13 studies that specifically addressed user testing of wearable HDs in healthy subjects. We map and discuss the different technological solutions for the exploration of different OPs, which are useful for the design of future haptic object interactions in VR, and provide recommendations for future work.
Multimodal interaction: developing an interaction concept for a touchscreen incorporating tactile feedback
The touchscreen, as an alternative user interface for applications that normally require mice and keyboards, has become more and more commonplace, showing up on mobile devices, on vending machines, on ATMs and in the control panels of machines in industry, where conventional input devices cannot provide intuitive, rapid and accurate user interaction with the content of the display. The exponential growth in processing power on the PC, together with advances in understanding human communication channels, has had a significant effect on the design of usable, human-factored interfaces on touchscreens, and on the number and complexity of applications available on touchscreens. Although computer-driven touchscreen interfaces provide programmable and dynamic displays, the absence of the expected tactile cues on the hard and static surfaces of conventional touchscreens is challenging interface design and touchscreen usability, in particular for distracting, low-visibility environments. Current technology allows the human tactile modality to be used in touchscreens. While the visual channel converts graphics and text unidirectionally from the computer to the end user, tactile communication features a bidirectional information flow to and from the user as the user perceives and acts on the environment and the system responds to changing contextual information. Tactile sensations such as detents and pulses provide users with cues that make selecting and controlling a more intuitive process. Tactile features can compensate for deficiencies in some of the human senses, especially in tasks which carry a heavy visual or auditory burden.
In this study, an interaction concept for tactile touchscreens is developed with a view to employing the key characteristics of the human sense of touch effectively and efficiently, especially in distracting environments where vision is impaired and hearing is overloaded. As a first step toward improving the usability of touchscreens through the integration of tactile effects, different mechanical solutions for producing motion in tactile touchscreens are investigated, to provide a basis for selecting suitable vibration directions when designing tactile displays. Building on these results, design know-how regarding tactile feedback patterns is further developed to enable dynamic simulation of UI controls, in order to give users a sense of perceiving real controls on a highly natural touch interface. To study the value of adding tactile properties to touchscreens, haptically enhanced UI controls are then further investigated with the aim of mapping haptic signals to different usage scenarios for performing primary and secondary tasks with touchscreens. The findings of the study are intended for consideration and discussion as a guide to further development of tactile stimuli, haptically enhanced user interfaces and touchscreen applications.
Haptic wearables as sensory replacement, sensory augmentation and trainer - a review
Sensory impairments decrease quality of life and can slow or hinder rehabilitation. Small, computationally powerful electronics have enabled the recent development of wearable systems aimed at improving function for individuals with sensory impairments. The purpose of this review is to synthesize current haptic wearable research for clinical applications involving sensory impairments. We define haptic wearables as untethered, ungrounded, body-worn devices that interact with the skin directly or through clothing and can be used in natural environments outside a laboratory. Results of this review are categorized by degree of sensory impairment. Total impairment, such as in an amputee, blind, or deaf individual, involves haptics acting as sensory replacement; partial impairment, as is common in rehabilitation, involves haptics as sensory augmentation; and no impairment involves haptics as trainer. This review found that wearable haptic devices improved function for a variety of clinical applications, including rehabilitation, prosthetics, vestibular loss, osteoarthritis, vision loss and hearing loss. Future haptic wearables development should focus on clinical needs, intuitive and multimodal haptic displays, low energy demands, and biomechanical compliance for long-term usage.
Haptics in Robot-Assisted Surgery: Challenges and Benefits
Robotic surgery is transforming current surgical practice, not only by improving conventional surgical methods but also by introducing innovative robot-enhanced approaches that broaden the capabilities of clinicians. Being mainly of a man-machine collaborative type, surgical robots are seen as media that transfer pre- and intra-operative information to the operator and reproduce his/her motion, with appropriate filtering, scaling, or limitation, to physically interact with the patient. The field, however, is far from maturity and, more critically, is still a subject of controversy in medical communities. Limited or absent haptic feedback is reputed to be among the reasons that impede the further spread of surgical robots. In this paper, the objectives and challenges of deploying haptic technologies in surgical robotics are discussed, and a systematic review is performed on works that have studied the effects of providing haptic information to users in major branches of robotic surgery. We have tried to encompass both classical works and state-of-the-art approaches, aiming to deliver a comprehensive and balanced survey for both researchers starting out in this field and experts.
Design and fabrication of flexible tactile sensing and feedback interface for communication by deafblind people
Humans generally interact and communicate using five basic sensory modalities, mainly through vision, touch and audio. However, this does not work for deafblind people, who have both impaired hearing and impaired vision and hence rely on touch sensing. This necessitates the development of alternative means that allow them to interact and communicate independently. Doing so requires a solution with the capability for both tactile sensing and feedback. Therefore, the tactile interface becomes a critical component of any assistive device usable by deafblind people for interaction and communication. Given that existing solutions mainly use rigid and commercial components, there is a need to tap into the advancements in flexible electronics in order to develop more effective and conformable solutions. This research involves the development of a flexible tactile communication interface usable in assistive communication devices for deafblind people. First, commercial sensors and actuators were utilised as a proof of concept; then four novel tactile interfaces were explored, including two similar touch-sensitive electromagnetic actuators, one capacitive tactile sensing array, and a facile flexible inductance-based pressure sensor.
The two fabricated touch-sensitive electromagnetic actuators (Type 1 and 2) are both based on the electromagnetic principle and are capable of simultaneous tactile sensing and feedback. Each comprises a tandem combination of two main modules, the touch-sensing module and the actuation module, integrated as a single device in each case. The actuation module employs a flexible planar spiral coil and a Neodymium magnet assembled in a soft Polydimethylsiloxane (PDMS) structure, while the touch-sensing module is a planar capacitive metal-insulator-metal structure of copper. The flexible coil (~17µm thick and with 45 turns) was fabricated on a Polyimide sheet using the Lithographie Galvanoformung Abformung (LIGA) process. Characterisation of these actuators at frequencies ranging from 10Hz to 200Hz shows a maximum displacement (~190µm) around 40Hz. Evaluation by 40 participants (20 deafblind and 20 sighted and hearing) shows that they can feel vibration in this range.
Another tactile interface fabricated is an 8 x 8 capacitive tactile sensing array. The sensor was developed on a flexible Polyvinyl Chloride (PVC) sheet, with column electrodes deposited on one side and row electrodes on the reverse side. It is intended for use as an assistive tactile communication interface for deafblind people who communicate using deafblind manual alphabets as well as English block letters.
An inductance-based pressure sensor was also designed, fabricated and characterised for use as an input interface for finger Braille as well as other tactile communication methods for deafblind people. It was realised with a soft ferromagnetic elastomer and a 17µm-thick coil fabricated on a flexible 50µm-thick polyimide sheet. The ferromagnetic elastomer acts as the core of the coil; when it is pressed, the metal particles move closer to each other, changing the inductance. The coil, with a 75µm conductor and 25µm pitch, was also realised using the LIGA micromolding technique. Seven different sensors were fabricated using different ratios (1:1, 1:2, 1:3, 1:5, 2:1, 3:1, and 5:1) of Ecoflex to Iron particles. The performance of each sensor was investigated; in general, sensors with a higher Iron particle content gave better sensitivity, linearity, and dynamic range. Of all the fabricated sensors, the one made with the 1:5 ratio was recommended for application as a tactile interface.
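As an illustration of how such a pressure sensor's readout might be used, the following sketch maps a measured coil inductance to an applied pressure by interpolating a previously recorded calibration curve. All values and names here are hypothetical stand-ins, not figures from the thesis; the only assumption taken from the text is that inductance rises monotonically as the elastomer is compressed.

```python
import numpy as np

# Hypothetical calibration curve: inductance (µH) rises with load as the
# iron particles in the elastomer core move closer together.
cal_inductance_uH = np.array([2.0, 2.1, 2.3, 2.6, 3.0])
cal_pressure_kPa = np.array([0.0, 10.0, 25.0, 50.0, 100.0])

def pressure_from_inductance(l_uH: float) -> float:
    """Convert a measured inductance to pressure by piecewise-linear
    interpolation over the calibration points."""
    return float(np.interp(l_uH, cal_inductance_uH, cal_pressure_kPa))
```

In a finger-Braille input scenario, readings like these would then be thresholded per key to register a press.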
Advancing proxy-based haptic feedback in virtual reality
This thesis advances haptic feedback for Virtual Reality (VR). Our work is guided by Sutherland's 1965 vision of the ultimate display, which calls for VR systems to control the existence of matter. To push towards this vision, we build upon proxy-based haptic feedback, a technique characterized by the use of passive tangible props. The goal of this thesis is to tackle the central drawback of this approach, namely its inflexibility, which hinders it from fulfilling the vision of the ultimate display. Guided by four research questions, we first showcase the applicability of proxy-based VR haptics by employing the technique for data exploration. We then extend the VR system's control over users' haptic impressions in three steps. First, we contribute the class of Dynamic Passive Haptic Feedback (DPHF), alongside two novel concepts for conveying kinesthetic properties, like virtual weight and shape, through weight-shifting and drag-changing proxies. Conceptually orthogonal to this, we study how visual-haptic illusions can be leveraged to unnoticeably redirect the user's hand when reaching towards props. Here, we contribute a novel perception-inspired algorithm for Body Warping-based Hand Redirection (HR), an open-source framework for HR, and psychophysical insights. The thesis concludes by proving that the combination of DPHF and HR can outperform the individual techniques in terms of the achievable flexibility of proxy-based haptic feedback.
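The body-warping idea behind hand redirection can be sketched in a few lines: the rendered (virtual) hand is offset from the tracked (real) hand by an amount that grows with reach progress, so that the virtual hand meets the virtual target exactly when the real hand touches the physical proxy. This is a minimal, illustrative formulation of the general technique, not the thesis's specific perception-inspired algorithm; the function and its signature are our own.

```python
import numpy as np

def warped_hand(real_hand, start, phys_target, virt_target):
    """Body-warping sketch: blend an offset into the rendered hand position.
    At the reach start the offset is zero; at the physical proxy the full
    offset (virt_target - phys_target) is applied."""
    total = np.linalg.norm(phys_target - start)
    remaining = np.linalg.norm(phys_target - real_hand)
    # Warping ratio grows from 0 (at the start pose) to 1 (at the proxy).
    w = np.clip(1.0 - remaining / total, 0.0, 1.0)
    return real_hand + w * (virt_target - phys_target)
```

Because the offset is introduced gradually, the visual-haptic mismatch at each frame stays small, which is what makes the redirection hard to notice.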
Contactless Haptic Display Through Magnetic Field Control
Haptic rendering enables people to touch, perceive, and manipulate virtual objects in a virtual environment. Using six cascaded identical hollow disk electromagnets and a small permanent magnet attached to an operator's finger, this paper proposes and develops an untethered haptic interface based on magnetic field control. The concentric hole inside the six cascaded electromagnets provides the workspace, where the 3D position of the permanent magnet is tracked with a Microsoft Kinect sensor. The driving currents of the six electromagnets are calculated in real time to generate the desired magnetic force. Offline data from a simulation based on finite element analysis (FEA) determines the relationship between the magnetic force, the driving currents, and the position of the permanent magnet. A set of experiments, comprising a virtual object recognition experiment, a virtual surface identification experiment, and a user perception evaluation, was conducted to demonstrate the proposed system, with Microsoft HoloLens holographic glasses used for visual rendering. The proposed magnetic haptic display yields an untethered, non-contact interface for natural haptic rendering applications, overcoming the constraints of the mechanical linkages in traditional tool-based haptic devices.
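The real-time current computation described above can be illustrated with a least-squares sketch. Assuming the FEA data has been reduced, at the magnet's current position, to a locally linear actuation matrix A (3 x 6: force per unit current for each coil), the six coil currents for a desired 3D force follow from a minimum-norm solve. The matrix A, the current limit, and the function itself are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

def solve_currents(A, f_desired, i_max=5.0):
    """Least-squares sketch: given a 3x6 actuation matrix A (force per unit
    coil current at the magnet's tracked position, e.g. from an FEA lookup),
    return the six coil currents producing the desired 3D force."""
    currents, *_ = np.linalg.lstsq(A, f_desired, rcond=None)
    # Clamp to hypothetical driver limits; a real controller would instead
    # scale the force setpoint or solve a constrained problem.
    return np.clip(currents, -i_max, i_max)
```

With six coils and only three force components, the system is underdetermined, and the minimum-norm solution keeps the coil currents (and hence heating) small.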
Haptics Rendering and Applications
There has been significant progress in haptic technologies, but the incorporation of haptics into virtual environments is still in its infancy. A wide range of human activities, including communication, education, art, entertainment, commerce and science, would change forever if we learned how to capture, manipulate and reproduce haptic sensory stimuli that are nearly indistinguishable from reality. For the field to move forward, many commercial and technological barriers need to be overcome. By rendering how objects feel through haptic technology, we communicate information that might reflect a desire to speak a physically-based language that has never been explored before. Owing to constant improvement in haptics technology and increasing levels of research into and development of haptics-related algorithms, protocols and devices, haptics technology is believed to have a promising future.