2,677 research outputs found

    Prevalence of haptic feedback in robot-mediated surgery: a systematic review of literature

    © 2017 Springer-Verlag. This is a post-peer-review, pre-copyedit version of an article published in the Journal of Robotic Surgery. The final authenticated version is available online at: https://doi.org/10.1007/s11701-017-0763-4

    With the successful uptake and inclusion of robotic systems in minimally invasive surgery, and with the increasing application of robotic surgery (RS) in numerous surgical specialities worldwide, there is now a need to develop and enhance the technology further. One such improvement is the implementation and integration of haptic feedback technology into RS, which would allow the operating surgeon at the console to receive haptic information about the type of tissue being operated on. The main advantage is that it lets the operating surgeon feel and control the amount of force applied to different tissues during surgery, thus minimising the risk of tissue damage due to both the direct and indirect effects of excessive force or tension applied during RS. We performed a two-rater systematic review to identify the latest developments and potential avenues for improving the application and implementation of haptic feedback technology for the operating surgeon at the console during RS. This review summarises technological enhancements in RS across different stages of work, from proof of concept to cadaver tissue testing, surgery in animals, and finally real implementation in surgical practice. We find that, at the time of this review, although there is unanimous agreement on the need for haptic and tactile feedback, no solutions or products are available that address this need. There is scope and a need for new developments in haptic augmentation for robot-mediated surgery, with the aim of further improving patient care and robotic surgical technology.

    Peer reviewed

    Human-Robot Team Interaction Through Wearable Haptics for Cooperative Manipulation

    The interaction of a robot team and a single human in teleoperation scenarios is beneficial in cooperative tasks, for example the manipulation of heavy and large objects in remote or dangerous environments. The main control challenge of this interaction is its asymmetry, which arises because robot teams have a relatively high number of controllable degrees of freedom compared to the human operator. We therefore propose a control scheme that establishes the interaction on spaces of reduced dimensionality, taking into account the low number of human command and feedback signals imposed by haptic devices. We evaluate the suitability of wearable haptic fingertip devices for multi-contact teleoperation in a user study. The results show that the proposed control approach is appropriate for human-robot team interaction and that the wearable haptic fingertip devices provide suitable assistance in cooperative manipulation tasks.
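    The reduced-dimensionality idea can be illustrated with a short sketch (a hypothetical illustration, not the authors' implementation): a fixed linear map lifts the operator's few command degrees of freedom into the robot team's command space, and its transpose projects team contact forces back down to the wearable fingertip devices.

        import numpy as np

        # Hypothetical illustration of teleoperation on a space of reduced
        # dimensionality: the operator issues a low-dimensional command (e.g.
        # 3-DOF fingertip motion) that is lifted into the team's command space,
        # and contact forces are projected back down for fingertip feedback.
        # The dimensions and the mapping matrix below are assumptions.

        n_human = 3        # command DOFs available to the operator
        n_team = 12        # controllable DOFs of the robot team (assumed)

        rng = np.random.default_rng(0)
        J = rng.standard_normal((n_team, n_human))   # assumed lifting map

        def team_command(human_velocity):
            """Lift the operator's low-DOF velocity command to the robot team."""
            return J @ human_velocity

        def fingertip_feedback(team_forces):
            """Project team contact forces back to the operator's feedback space."""
            return J.T @ team_forces

        v_team = team_command(np.array([0.01, 0.0, -0.02]))   # operator command
        f_fb = fingertip_feedback(0.5 * np.ones(n_team))      # aggregated feedback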

    3DTouch: A wearable 3D input device with an optical sensor and a 9-DOF inertial measurement unit

    We present 3DTouch, a novel 3D wearable input device worn on the fingertip for 3D manipulation tasks. 3DTouch is designed to fill the gap left by the lack of a 3D input device that is self-contained, mobile, and works universally across various 3D platforms. This paper presents a low-cost solution to designing and implementing such a device. Our approach relies on a relative positioning technique using an optical laser sensor and a 9-DOF inertial measurement unit. 3DTouch is self-contained and designed to work universally on various 3D platforms. The device employs touch input for the benefits of passive haptic feedback and movement stability. Moreover, with touch interaction, 3DTouch is conceptually less fatiguing to use over many hours than 3D spatial input devices. We propose a set of 3D interaction techniques, including selection, translation, and rotation, using 3DTouch. An evaluation also demonstrates the device's tracking accuracy of 1.10 mm and 2.33 degrees for subtle touch interaction in 3D space. Modular solutions like 3DTouch open up a whole new design space for interaction techniques to build on.

    Comment: 8 pages, 7 figures
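    The relative positioning technique described above can be sketched as a simple sensor-fusion loop (an assumed illustration; the paper's actual algorithm may differ): the optical sensor reports 2D displacement in the fingertip plane, the 9-DOF IMU supplies the fingertip orientation, and rotating each planar displacement by that orientation yields a 3D displacement that is accumulated into a position estimate.

        import numpy as np

        def quat_to_rotation(q):
            """Convert a unit quaternion (w, x, y, z) from the IMU into a 3x3 rotation."""
            w, x, y, z = q
            return np.array([
                [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
                [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
                [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
            ])

        position = np.zeros(3)   # accumulated fingertip position estimate [mm]

        def update(optical_dx, optical_dy, imu_quaternion):
            """Fuse one optical displacement sample (in mm) with the current IMU pose."""
            global position
            d_local = np.array([optical_dx, optical_dy, 0.0])   # motion in the sensor plane
            position += quat_to_rotation(imu_quaternion) @ d_local
            return position

        # Example: 0.5 mm of surface motion while the fingertip is held level.
        update(0.5, 0.0, (1.0, 0.0, 0.0, 0.0))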

    A fabric-based approach for wearable haptics

    In recent years, wearable haptic systems (WHS) have gained increasing attention as a novel and exciting paradigm for human-robot interaction (HRI). These systems can be worn by users, carried around, and integrated into their everyday lives, thus enabling a more natural manner of delivering tactile cues. At the same time, the design of these types of devices presents new issues: the challenge is the correct identification of design guidelines, with the two-fold goal of minimizing system encumbrance and increasing the effectiveness and naturalness of stimulus delivery. Fabrics can represent a viable solution to tackle these issues. They are specifically thought “to be worn”, and could be the key ingredient for developing wearable haptic interfaces conceived for a more natural HRI. In this paper, the author reviews some examples of fabric-based WHS that can be applied to different body locations and elicit different haptic perceptions for different application fields. Perspectives and future developments of this approach are also discussed.

    CobotTouch: AR-based Interface with Fingertip-worn Tactile Display for Immersive Operation/Control of Collaborative Robots

    Complex robotic tasks require human collaboration to benefit from human dexterity. Frequent human-robot interaction, however, is mentally demanding and time-consuming. Intuitive and easy-to-use robot control interfaces reduce this negative influence on workers, especially inexperienced users. In this paper, we present CobotTouch, a novel intuitive robot control interface with fingertip haptic feedback. The proposed interface consists of a graphical user interface projected onto the robotic arm to control the position of the robot end-effector through gesture recognition, and a wearable haptic interface that delivers tactile feedback to the user's fingertips. We evaluated the users' perception of the designed tactile patterns presented by the haptic interface and the intuitiveness of the proposed system for robot control in a use case. The results revealed a high average recognition rate of 75.25% for the tactile patterns. The average NASA Task Load Index (TLX) indicated low mental and temporal demands, suggesting a high level of intuitiveness of CobotTouch for interaction with collaborative robots.

    Comment: 12 pages, 11 figures, Accepted paper in AsiaHaptics 202
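    As an illustration of how such fingertip tactile patterns might be driven (a hypothetical sketch; the abstract does not describe the actual actuator layout or driver API), a pattern can be represented as a sequence of per-tactor intensity frames played back at a fixed rate:

        import time

        def set_actuators(intensities):
            """Placeholder for the hardware driver: one intensity in [0, 1] per tactor."""
            print("tactors:", ["%.1f" % i for i in intensities])

        def play_pattern(frames, frame_duration_s=0.1):
            """Play a tactile pattern given as a list of per-tactor intensity frames."""
            for frame in frames:
                set_actuators(frame)
                time.sleep(frame_duration_s)
            set_actuators([0.0] * len(frames[-1]))   # switch all tactors off at the end

        # A "swipe right" cue on a 1x4 tactor row: activation travels left to right.
        swipe_right = [
            [1.0, 0.0, 0.0, 0.0],
            [0.0, 1.0, 0.0, 0.0],
            [0.0, 0.0, 1.0, 0.0],
            [0.0, 0.0, 0.0, 1.0],
        ]
        play_pattern(swipe_right)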