    A Review of Smart Materials in Tactile Actuators for Information Delivery

    As the largest organ of the human body, the skin provides an important sensory channel through which humans receive external stimuli by touch. From the information perceived through touch, people can feel and infer the properties of objects, such as weight, temperature, texture, and motion. These properties reach the brain as nerve signals generated by different kinds of receptors in the skin. Mechanical, electrical, and thermal stimuli can excite these receptors and convey different information through the nerves. Actuator technologies have been developed to provide such mechanical, electrical, or thermal stimuli, including static or vibrational actuation, electrostatic stimulation, focused ultrasound, and more. Smart materials, such as piezoelectric materials, carbon nanotubes, and shape memory alloys, play important roles in providing actuation for tactile sensation. This paper reviews the biological background of human tactile sensing, to give an understanding of how we sense and interact with the world through touch, together with conventional and state-of-the-art tactile actuator technologies for tactile feedback delivery.
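    To make the relationships surveyed in this review concrete, the Python sketch below summarises a typical textbook mapping from skin mechanoreceptors to the stimuli they respond to and to example actuator technologies. It is illustrative only; the frequency bands are approximate physiology-textbook values and the actuator pairings are assumptions, not figures taken from the paper.

    # Illustrative mapping of mechanoreceptors to stimuli and example actuator
    # technologies. Frequency bands are approximate textbook values.
    RECEPTOR_MAP = {
        "Merkel cells (SA-I)": {
            "stimulus": "static pressure, fine spatial detail",
            "band_hz": (0, 5),
            "example_actuators": ["pin arrays", "shape memory alloy pins"],
        },
        "Meissner corpuscles (RA-I)": {
            "stimulus": "low-frequency vibration, slip",
            "band_hz": (5, 50),
            "example_actuators": ["voice-coil actuators", "piezoelectric benders"],
        },
        "Pacinian corpuscles (RA-II)": {
            "stimulus": "high-frequency vibration",
            "band_hz": (40, 400),
            "example_actuators": ["piezoelectric actuators", "focused ultrasound"],
        },
        "Ruffini endings (SA-II)": {
            "stimulus": "skin stretch",
            "band_hz": (0, 5),
            "example_actuators": ["lateral skin-stretch devices"],
        },
    }

    def receptors_for_frequency(freq_hz: float) -> list[str]:
        """Return receptor classes whose approximate band covers freq_hz."""
        return [name for name, info in RECEPTOR_MAP.items()
                if info["band_hz"][0] <= freq_hz <= info["band_hz"][1]]

    print(receptors_for_frequency(250.0))  # -> ['Pacinian corpuscles (RA-II)']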

    Robotic simulators for tissue examination training with multimodal sensory feedback

    Tissue examination by hand remains an essential technique in clinical practice. Its effective application depends on skills in sensorimotor coordination, mainly involving haptic, visual, and auditory feedback. The skills clinicians have to learn can be as subtle as regulating finger pressure with breathing, choosing palpation actions, monitoring involuntary facial and vocal expressions in response to palpation, and using pain expressions both as a source of information and as a constraint on physical examination. Patient simulators can provide a safe learning platform for novice physicians before they examine real patients. This paper presents, for the first time, a review of state-of-the-art medical simulators for tissue examination training that considers the provision of multimodal feedback for learning as many manual examination techniques as possible. The study summarizes current advances in tissue examination training devices that simulate different medical conditions and provide different types of feedback modalities. Opportunities in the development of pain expression, tissue modeling, actuation, and sensing are also analyzed to support the future design of effective tissue examination simulators.

    Tactile Displays with Parallel Mechanism


    Virtual Surface Characteristics of a Tactile Display Using Magneto-Rheological Fluids

    Virtual surface characteristics of tactile displays are investigated to characterize the feeling of human touch for a haptic interface application. To reproduce tactile feeling, a prototype tactile display incorporating Magneto-Rheological (MR) fluid has been developed. Tactile display devices stimulate the skin of the finger so that it feels contact sensations such as compliance, friction, and surface topography; such a display can therefore convey information about the surface of organic tissue to a surgeon in virtual reality. To investigate the compliance felt by a human finger, the normal force responses of the tactile display under various magnetic fields have been assessed. Shear friction force responses of the display are also investigated to simulate the action of a finger dragging across the surface. Moreover, different matrix arrays of magnetic poles are applied to form the virtual surface topography. The results show that different tactile feelings arise depending on the applied magnetic field strength as well as the combination of magnetic pole arrays. This research presents a smart tactile display technology for virtual surfaces.
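    As a rough illustration of how such a pole-matrix MR display could be driven, the sketch below converts a desired per-cell stiffness map into coil currents, relying on the general property that the apparent stiffness of MR fluid rises with the applied field. The linear gain, base stiffness, and current limit are hypothetical placeholders, not parameters of the device described above.

    import numpy as np

    MAX_CURRENT_A = 1.0        # hypothetical coil current limit
    STIFFNESS_PER_AMP = 4.0    # hypothetical added stiffness per ampere, (N/mm)/A
    BASE_STIFFNESS = 0.5       # hypothetical zero-field stiffness, N/mm

    def currents_for_stiffness_map(target_stiffness: np.ndarray) -> np.ndarray:
        """Map a desired per-pole stiffness map (N/mm) to coil currents (A)."""
        extra = np.clip(target_stiffness - BASE_STIFFNESS, 0.0, None)
        return np.clip(extra / STIFFNESS_PER_AMP, 0.0, MAX_CURRENT_A)

    # Example: render a stiff "bump" in the centre of a 4x4 pole array.
    stiffness_map = np.full((4, 4), 1.0)
    stiffness_map[1:3, 1:3] = 3.5
    print(currents_for_stiffness_map(stiffness_map))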

    HapBead: on-skin microfluidic haptic interface using tunable bead

    On-skin haptic interfaces based on thin, flexible soft elastomers have improved significantly in recent years. Many focus on vibrotactile feedback, which requires complicated parameter tuning. Another approach relies on mechanical forces created via piezoelectric devices and other methods to produce non-vibratory haptic sensations such as stretching and twisting; these devices are often bulky, their electronic components and associated drivers are complicated, and they offer limited control of timing and precision. This paper proposes HapBead, a new on-skin haptic interface capable of rendering vibration-like tactile feedback using microfluidics. HapBead leverages a microfluidic channel to precisely and agilely oscillate a small bead via liquid flow, which generates various motion patterns in the channel and thereby creates highly tunable haptic sensations on the skin. We developed a proof-of-concept design to implement a thin, flexible, and easily affordable HapBead platform, and verified its haptic rendering capabilities by attaching it to users’ fingertips. A user study confirmed that participants could accurately distinguish six different haptic patterns rendered by HapBead. HapBead enables new wearable display applications with multiple integrated functionalities such as on-skin haptic doodles, mixed reality haptics, and visual-haptic displays.
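    A minimal sketch of the rendering idea is given below: a pump drive waveform oscillates the liquid, and hence the bead, in the channel, and different waveform shapes yield different tactile patterns. The pattern names, frequencies, and amplitudes are hypothetical examples, not the six patterns evaluated in the study.

    import numpy as np

    def pattern_waveform(name: str, duration_s: float = 1.0, fs: int = 1000) -> np.ndarray:
        """Return a normalised pump drive signal (-1..1) for a named pattern."""
        t = np.arange(0.0, duration_s, 1.0 / fs)
        if name == "slow_sine":      # gentle back-and-forth bead motion
            return np.sin(2 * np.pi * 5 * t)
        if name == "fast_sine":      # buzz-like oscillation
            return np.sin(2 * np.pi * 40 * t)
        if name == "pulse_train":    # discrete taps: short bursts of flow
            return (np.sin(2 * np.pi * 2 * t) > 0.95).astype(float)
        raise ValueError(f"unknown pattern: {name}")

    drive = pattern_waveform("fast_sine")
    print(drive.shape, float(drive.min()), float(drive.max()))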

    Haptics: Science, Technology, Applications

    This open access book constitutes the proceedings of the 13th International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, EuroHaptics 2022, held in Hamburg, Germany, in May 2022. The 36 regular papers included in this book were carefully reviewed and selected from 129 submissions. They are organized in the following topical sections: haptic science, haptic technology, and haptic applications.

    Fine-grained Haptics: Sensing and Actuating Haptic Primary Colours (force, vibration, and temperature)

    This thesis discusses the development of a multimodal, fine-grained visual-haptic system for teleoperation and robotic applications. The system is primarily composed of two complementary components: an input device known as the HaptiTemp sensor (combining “Haptics” and “Temperature”), a novel thermosensitive GelSight-like sensor, and an output device, an untethered multimodal fine-grained haptic glove. The HaptiTemp sensor is a visuotactile sensor that can sense the haptic primary colours: force, vibration, and temperature. It has novel switchable UV markers that can be made visible using UV LEDs. The switchable markers are a key novelty of the HaptiTemp because they allow tactile information to be extracted from gel deformation without impairing the ability to classify or recognise images, resolving the trade-off between marker density and capturing high-resolution images with a single sensor. The HaptiTemp sensor can measure vibrations by counting the number of blobs or pulses detected per unit time using a blob detection algorithm. For the first time, temperature detection was incorporated into a GelSight-like sensor, making the HaptiTemp a haptic primary colours sensor. It also performs rapid temperature sensing, with a 643 ms response time over the 31°C to 50°C range. This fast temperature response is comparable to the withdrawal reflex response in humans, and it is the first time in the robotics community that a sensor can trigger a sensory impulse that mimics a human reflex. The HaptiTemp sensor can also perform simultaneous temperature sensing and image classification using a machine vision camera, the OpenMV Cam H7 Plus, a capability that has not been reported or demonstrated by any other tactile sensor. The sensor is suitable for teleoperation because it can transmit tactile analysis and image classification results over a wireless link. The HaptiTemp sensor is the closest thing to human skin in tactile sensing, tactile pattern recognition, and rapid temperature response. To let an operator feel at a distance what the HaptiTemp sensor is touching, a corresponding output device, an untethered multimodal haptic hand wearable, was developed to actuate the haptic primary colours sensed by the HaptiTemp sensor. This wearable communicates wirelessly and provides fine-grained cutaneous feedback so that the user can feel the edges or surfaces of the tactile images captured by the HaptiTemp sensor. It also offers gradient kinesthetic force feedback that restricts finger movements based on the force estimated by the HaptiTemp sensor: a retractable string from an ID badge holder, with mini-servos controlling the stiffness of the wire, is attached to each fingertip. Vibrations detected by the HaptiTemp sensor can be actuated by the tapping motion of the tactile pins or by a buzzing mini-vibration motor. There is also a tiny annular Peltier device, or ThermoElectric Generator (TEG), with a mini-vibration motor, forming thermo-vibro feedback in the palm area that can be activated by a ‘hot’ or ‘cold’ signal from the HaptiTemp sensor. The haptic primary colours can also be embedded in a VR environment and actuated by the multimodal hand wearable.
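    As an illustration of the blob-counting approach to vibration sensing described above, the sketch below detects marker blobs in successive gel images with OpenCV and converts rises in the per-frame blob count into a pulse rate. It is an assumed desktop-Python reconstruction, not the thesis implementation that runs on the OpenMV Cam H7 Plus, and the minimum blob area is a hypothetical threshold.

    import cv2
    import numpy as np

    params = cv2.SimpleBlobDetector_Params()
    params.filterByArea = True
    params.minArea = 20                      # hypothetical marker size, pixels
    detector = cv2.SimpleBlobDetector_create(params)

    def pulses_per_second(frames: list[np.ndarray], fps: float) -> float:
        """Estimate a pulse rate from 8-bit grayscale frames of the gel."""
        counts = [len(detector.detect(frame)) for frame in frames]
        rises = sum(1 for a, b in zip(counts, counts[1:]) if b > a)
        return rises * fps / max(len(frames), 1)

    # Usage: pulses_per_second(list_of_grayscale_frames, fps=60.0)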
    A VR application was developed to demonstrate rapid tactile actuation of edges, allowing the user to feel the contours of virtual objects. Collision detection scripts were embedded to activate the corresponding actuator in the multimodal haptic hand wearable whenever the tactile matrix simulator or hand avatar in VR collides with a virtual object. The TEG also becomes warm or cold depending on the virtual object the participant touches. Tests were conducted to explore virtual objects in 2D and 3D environments using a Leap Motion controller and a VR headset (Oculus Quest 2). Moreover, fine-grained cutaneous feedback was developed to render the edges or surfaces of a tactile image, such as those captured by the HaptiTemp sensor, or to actuate tactile patterns from 2D or 3D virtual objects. The prototype resembles an exoskeleton glove with 16 tactile actuators (tactors) on each fingertip, 80 tactile pins in total, built from commercially available P20 Braille cells. Each tactor can be controlled individually, enabling the user to feel the edges or surfaces of images such as the high-resolution tactile images captured by the HaptiTemp sensor, and the hand wearable can be used to enhance the immersive experience in a virtual reality environment. The tactors can be actuated in a tapping manner, creating a form of vibration feedback distinct from the buzzing vibration produced by a mini-vibration motor, and the tactile pin height can be varied, creating a gradient of pressure on the fingertip. Finally, the integration of the high-resolution HaptiTemp sensor and the untethered multimodal, fine-grained haptic hand wearable is presented, forming a visuotactile system for sensing and actuating haptic primary colours. Force, vibration, and temperature sensing tests, with corresponding force, vibration, and temperature actuation tests, demonstrate a unified visual-haptic system. Aside from sensing and actuating haptic primary colours, the edges and surfaces of the tactile images captured by the HaptiTemp sensor were rendered through the fine-grained cutaneous feedback of the haptic hand wearable.
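    To illustrate the fingertip rendering step described above, the sketch below downsamples a tactile image patch to a 4x4 grid, matching the 16 tactors per fingertip, and raises a pin wherever the local intensity exceeds a threshold. The threshold and the overall interface are assumptions for illustration and do not correspond to the wearable's actual driver.

    import numpy as np

    def tactor_pattern(patch: np.ndarray, threshold: float = 0.5) -> np.ndarray:
        """Reduce a normalised 2-D tactile patch (0..1) to a 4x4 boolean pin map."""
        h, w = patch.shape
        bh, bw = h // 4, w // 4                          # block size per tactor
        blocks = patch[: bh * 4, : bw * 4].reshape(4, bh, 4, bw)
        return blocks.mean(axis=(1, 3)) > threshold      # True = raise the pin

    # Example: a random 40x40 patch mapped onto one fingertip's 4x4 tactor grid.
    patch = np.random.default_rng(0).random((40, 40))
    print(tactor_pattern(patch).astype(int))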