148 research outputs found

    Haptics in Robot-Assisted Surgery: Challenges and Benefits

    Robotic surgery is transforming current surgical practice, not only by improving conventional surgical methods but also by introducing innovative robot-enhanced approaches that broaden the capabilities of clinicians. Being mainly of a man-machine collaborative type, surgical robots are seen as media that transfer pre- and intra-operative information to the operator and reproduce his/her motion, with appropriate filtering, scaling, or limitation, to physically interact with the patient. The field, however, is far from maturity and, more critically, is still a subject of controversy in medical communities. Limited or absent haptic feedback is reputed to be among the reasons that impede the further spread of surgical robots. In this paper, the objectives and challenges of deploying haptic technologies in surgical robotics are discussed, and a systematic review is performed on works that have studied the effects of providing haptic information to users in the major branches of robotic surgery. We have tried to encompass both classical works and state-of-the-art approaches, aiming to deliver a comprehensive and balanced survey both for researchers starting their work in this field and for experts.
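
    As a rough illustration of the leader-follower pipeline mentioned in this abstract (operator motion that is filtered, scaled, and limited before reaching the patient-side instrument), the following Python sketch shows one possible 1-D mapping; the parameter values and function names are illustrative assumptions, not taken from the survey.

    import numpy as np

    # 1-D leader-follower motion pipeline: filter, scale, and limit the
    # operator's motion before it is reproduced at the patient side.
    # All names and values are illustrative assumptions.
    SCALE = 0.2    # motion scaling: 1 cm at the console maps to 2 mm at the tool
    ALPHA = 0.3    # low-pass filter coefficient (tremor filtering)
    LIMIT = 0.05   # workspace limit for the instrument tip [m]

    def follower_command(leader_positions):
        """Map a stream of leader (console) positions to follower commands."""
        filtered = 0.0
        commands = []
        for x in leader_positions:
            filtered = ALPHA * x + (1.0 - ALPHA) * filtered  # smooth tremor
            commands.append(np.clip(SCALE * filtered, -LIMIT, LIMIT))  # scale, limit
        return np.array(commands)

    # Example: a noisy 1-D console trajectory
    t = np.linspace(0.0, 1.0, 200)
    leader = 0.1 * np.sin(2 * np.pi * t) + 0.005 * np.random.randn(t.size)
    print(follower_command(leader)[:5])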

    Design of a wearable fingertip haptic device for remote palpation: Characterisation and interface with a virtual environment

    © 2018 Tzemanaki, Al, Melhuish and Dogramadzi. This paper presents the development of a wearable Fingertip Haptic Device (FHD) that can provide cutaneous feedback via a Variable Compliance Platform (VCP). The FHD includes an inertial measurement unit, which tracks the motion of the user's finger, while its haptic functionality relies on two parameters: the pressure in the VCP and its linear displacement towards the fingertip. The combination of these two features results in various conditions of the FHD, which emulate the stiffness properties of a remote object or surface. Such a device can be used in tele-operation, including virtual reality applications, where rendering the level of stiffness of different physical or virtual materials could provide a more realistic haptic perception to the user. The FHD's stiffness representation is characterised in terms of the resulting pressure and force applied to the fingertip, created through the relationship between its two functional parameters: the pressure and displacement of the VCP. The FHD was tested in a series of user studies to assess its potential to create a user perception of an object's variable stiffness. The viability of the FHD as a haptic device was further confirmed by interfacing users with a virtual environment. The virtual environment task required users to follow a virtual path, identify objects of different hardness on the path, and navigate away from "no-go" zones. The task was performed with and without the use of the variable compliance on the FHD. The results showed improved performance with the variable compliance provided by the FHD in all assessed categories, and particularly in the ability to correctly distinguish between objects of different hardness.
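
    A minimal sketch of how a remote stiffness estimate might be mapped onto the FHD's two functional parameters (VCP pressure and platform displacement): the ranges and the linear mapping below are assumptions for illustration, whereas the paper characterises the actual relationship experimentally.

    import numpy as np

    # Map a remote stiffness estimate [N/m] to the FHD's two parameters:
    # VCP pressure and platform displacement towards the fingertip.
    # Ranges and the linear mapping are assumed for illustration.
    P_MIN, P_MAX = 0.0, 40.0   # VCP pressure range [kPa] (assumed)
    D_MIN, D_MAX = 0.0, 5.0    # platform displacement range [mm] (assumed)

    def stiffness_to_fhd(k_remote, k_min=100.0, k_max=2000.0):
        """Return (pressure, displacement) commands for a stiffness estimate."""
        s = np.clip((k_remote - k_min) / (k_max - k_min), 0.0, 1.0)
        pressure = P_MIN + s * (P_MAX - P_MIN)        # stiffer: higher pressure
        displacement = D_MIN + s * (D_MAX - D_MIN)    # stiffer: closer to fingertip
        return pressure, displacement

    print(stiffness_to_fhd(500.0))    # a soft virtual object
    print(stiffness_to_fhd(1800.0))   # a hard virtual object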

    Sensory substitution for force feedback recovery: A perception experimental study

    Robotic-assisted surgeries are commonly used today as a more efficient alternative to traditional surgical options. Both surgeons and patients benefit from these systems, as they offer many advantages, including less trauma and blood loss, fewer complications, and better ergonomics. However, a remaining limitation of currently available surgical systems is the lack of force feedback due to the teleoperation setting, which prevents direct interaction with the patient. Once the force information is obtained, either by a sensing device or indirectly through vision-based force estimation, a concern arises about how to transmit this information to the surgeon. An attractive alternative is sensory substitution, which allows transcoding information from one sensory modality so it can be presented in a different sensory modality. In the current work, we used visual feedback to convey interaction forces to the surgeon. Our overarching goal was to address the following question: How should interaction forces be displayed to support efficient comprehension by the surgeon without interfering with the surgeon's perception and workflow during surgery? Until now, the use of the visual modality for force feedback has not been carefully evaluated. For this reason, we conducted an experimental study with two aims: (1) to demonstrate the potential benefits of using this modality and (2) to understand the surgeons' perceptual preferences. The results derived from our study of 28 surgeons revealed strong positive acceptance (96%) of this modality by the users. Moreover, we found that for surgeons to easily interpret the information, their mental model must be considered, meaning that the design of the visualizations should fit the perceptual and cognitive abilities of the end user. To our knowledge, this is the first time that these principles have been analyzed for exploring sensory substitution in medical robotics. Finally, we provide user-centered recommendations for the design of visual displays for robotic surgical systems.
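
    As an illustration of the kind of visual force display the study evaluates, the sketch below transcodes an interaction force into a bar fill and colour; the thresholds, colours, and function names are illustrative assumptions rather than the study's recommended design.

    def force_to_visual(force_n, f_safe=2.0, f_max=5.0):
        """Return (bar_fill, colour) for an interaction force in newtons."""
        fill = min(force_n / f_max, 1.0)   # normalised bar length
        if force_n < f_safe:
            colour = "green"               # safe interaction
        elif force_n < f_max:
            colour = "yellow"              # approaching the limit
        else:
            colour = "red"                 # excessive force
        return fill, colour

    for f in (0.5, 3.0, 6.0):
        print(f, force_to_visual(f))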

    Visual Tactile Sensor Based Force Estimation for Position-Force Teleoperation

    Vision-based tactile sensors have gained extensive attention in the robotics community. These sensors are expected to be capable of extracting contact information, i.e. haptic information, during in-hand manipulation, which makes them a perfect match for haptic feedback applications. In this paper, we propose a contact force estimation method using the vision-based tactile sensor DIGIT and apply it to a position-force teleoperation architecture for force feedback. The force estimation is done by building a depth map to measure the deformation of the DIGIT gel surface and applying a regression algorithm to the estimated depth data and ground-truth force data to obtain the depth-force relationship. The experiment is performed by constructing a grasping force feedback system with a haptic device as the leader robot and a parallel robot gripper as the follower robot, where the DIGIT sensor is attached to the tip of the gripper to estimate the contact force. The preliminary results show the capability of using this low-cost vision-based sensor for force feedback applications.
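
    The depth-to-force regression step can be sketched as follows: a scalar deformation feature extracted from each DIGIT depth map is regressed against ground-truth force. The feature (summed gel indentation), the linear model, and the synthetic data are illustrative assumptions; the paper does not prescribe this exact choice.

    import numpy as np
    from sklearn.linear_model import LinearRegression

    def deformation_feature(depth_map, rest_depth):
        """Total gel indentation relative to the unloaded reference surface."""
        indentation = np.clip(rest_depth - depth_map, 0.0, None)
        return indentation.sum()

    # Synthetic stand-in data: Gaussian-bump indentations and ground-truth forces [N]
    rest = np.zeros((64, 64))
    rng = np.random.default_rng(0)
    forces = rng.uniform(0.0, 10.0, size=50)
    bump = np.exp(-((np.indices((64, 64)) - 32.0) ** 2).sum(axis=0) / 200.0)
    features = [[deformation_feature(rest - f * 1e-4 * bump, rest)] for f in forces]

    # Fit the depth-force relationship and estimate a contact force
    model = LinearRegression().fit(features, forces)
    print(model.predict([features[0]]))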

    Exodex Adam—A Reconfigurable Dexterous Haptic User Interface for the Whole Hand

    Applications for dexterous robot teleoperation and immersive virtual reality are growing. Haptic user input devices need to allow the user to intuitively command and seamlessly "feel" the environment they work in, whether it is virtual or a remote site reached through an avatar. We introduce the DLR Exodex Adam, a reconfigurable, dexterous, whole-hand haptic input device. The device comprises multiple modular, three-degrees-of-freedom (3-DOF) robotic fingers, whose placement on the device can be adjusted to optimize manipulability for different user hand sizes. Additionally, the device is mounted on a 7-DOF robot arm to increase the user's workspace. Exodex Adam uses a front-facing interface, with robotic fingers coupled to two of the user's fingertips, the thumb, and two points on the palm. Including the palm, as opposed to only the fingertips as is common in existing devices, enables accurate tracking of the whole hand without additional sensors such as a data glove or motion capture. By providing "whole-hand" interaction with omnidirectional force-feedback at the attachment points, we enable the user to experience the environment with the complete hand instead of only the fingertips, thus realizing deeper immersion. Interaction using Exodex Adam can range from palpation of objects and surfaces to manipulation using both power and precision grasps, all while receiving haptic feedback. This article details the concept and design of the Exodex Adam, as well as use cases where it is deployed with different command modalities. These include mixed-media interaction in a virtual environment, gesture-based telemanipulation, and robotic hand-arm teleoperation using adaptive model-mediated teleoperation. Finally, we share the insights gained during our development process and use case deployments.
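
    A highly simplified sketch of the model-mediated teleoperation idea named above: the haptic device renders forces against a locally maintained model of the remote environment, which is updated from remote measurements. This 1-D spring-wall toy is an assumption for illustration only and is not the DLR adaptive scheme used with Exodex Adam.

    import numpy as np

    # Local environment model: a 1-D spring wall whose position is updated
    # from remote measurements while the user feels forces rendered locally.
    class LocalModel:
        def __init__(self, wall_pos=0.10, stiffness=500.0):
            self.wall_pos = wall_pos      # estimated contact location [m]
            self.stiffness = stiffness    # estimated contact stiffness [N/m]

        def render_force(self, hand_pos):
            """Force fed back at a fingertip/palm attachment point."""
            penetration = hand_pos - self.wall_pos
            return -self.stiffness * penetration if penetration > 0.0 else 0.0

        def update(self, measured_pos, measured_force):
            """Re-estimate the contact location from the remote side."""
            if measured_force > 0.0:
                self.wall_pos = measured_pos - measured_force / self.stiffness

    model = LocalModel()
    for hand_pos in np.linspace(0.0, 0.15, 7):
        print(round(hand_pos, 3), round(model.render_force(hand_pos), 2))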

    Aerospace medicine and biology: A continuing bibliography with indexes (supplement 344)

    This bibliography lists 125 reports, articles, and other documents introduced into the NASA Scientific and Technical Information System during January 1989. Subject coverage includes: aerospace medicine and psychology, life support systems and controlled environments, safety equipment, exobiology and extraterrestrial life, and flight crew behavior and performance.

    From passive tool holders to microsurgeons: safer, smaller, smarter surgical robots


    Tactile-STAR: A Novel Tactile STimulator And Recorder System for Evaluating and Improving Tactile Perception

    Many neurological diseases impair the motor and somatosensory systems. While several different technologies are used in clinical practice to assess and improve motor functions, somatosensation is evaluated subjectively with qualitative clinical scales, and the treatment of somatosensory deficits has received limited attention. To bridge the gap between the assessment and training of motor vs. somatosensory abilities, we designed, developed, and tested a novel, low-cost, two-component (bimanual) mechatronic system targeting tactile somatosensation: the Tactile-STAR, a tactile stimulator and recorder. The stimulator is an actuated pantograph structure driven by two servomotors, with an end-effector covered by a rubber material that can apply two different types of skin stimulation: brush and stretch. The stimulator has a modular design and can be used to test tactile perception in different parts of the body, such as the hand, arm, leg, big toe, etc. The recorder is a passive pantograph that can measure hand motion using two potentiometers. The recorder can serve multiple purposes: participants can move its handle to match the direction and amplitude of the tactile stimulator, or they can use it as a master manipulator to control the tactile stimulator as a slave. Our ultimate goal is to assess and affect tactile acuity and somatosensory deficits. To demonstrate the feasibility of our novel system, we tested the Tactile-STAR with 16 healthy individuals and with three stroke survivors using the skin-brush stimulation. We verified that the system enables the mapping of tactile perception on the hand in both populations. We also tested the extent to which 30 min of training in healthy individuals led to an improvement of tactile perception. The results provide a first demonstration of the ability of this new system to characterize tactile perception in healthy individuals, as well as a quantification of the magnitude and pattern of tactile impairment in a small cohort of stroke survivors. The finding that short-term training with Tactile-STAR can improve the acuity of tactile perception in healthy individuals suggests that Tactile-STAR may have utility as a therapeutic intervention for somatosensory deficits.
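
    One way the matching task described here could be scored is by comparing the direction and amplitude of the delivered stimulus with the participant's response on the recorder handle, as in the sketch below; the error metrics are illustrative assumptions, not the exact analysis used in the paper.

    import numpy as np

    def trial_error(stimulus_vec, response_vec):
        """Angular error [deg] and relative amplitude error for one matching trial."""
        s = np.asarray(stimulus_vec, dtype=float)
        r = np.asarray(response_vec, dtype=float)
        cos_angle = np.dot(s, r) / (np.linalg.norm(s) * np.linalg.norm(r))
        angle_err = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
        amp_err = abs(np.linalg.norm(r) - np.linalg.norm(s)) / np.linalg.norm(s)
        return angle_err, amp_err

    # Example: a 10 mm brush stroke to the right, answered slightly off-axis
    print(trial_error([10.0, 0.0], [8.5, 2.0]))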