466 research outputs found

    Prevalence of haptic feedback in robot-mediated surgery: a systematic review of literature

    © 2017 Springer-Verlag. This is a post-peer-review, pre-copyedit version of an article published in Journal of Robotic Surgery. The final authenticated version is available online at: https://doi.org/10.1007/s11701-017-0763-4

    With the successful uptake of robotic systems in minimally invasive surgery and the increasing application of robotic surgery (RS) in numerous surgical specialities worldwide, there is now a need to develop the technology further. One such improvement is the integration of haptic feedback into RS, permitting the operating surgeon at the console to receive haptic information on the type of tissue being operated on. The main advantage is that the surgeon can feel and control the amount of force applied to different tissues during surgery, minimising the risk of tissue damage from both the direct and indirect effects of excessive force or tension applied during RS. We performed a two-rater systematic review to identify the latest developments and potential avenues for improvement in delivering haptic feedback to the operating surgeon at the console during RS. The review summarises technological enhancements in RS across different stages of work, from proof of concept to cadaver tissue testing, surgery in animals, and finally implementation in surgical practice. We found that, at the time of this review, although there is unanimous agreement on the need for haptic and tactile feedback, no solutions or products are available that address this need. There is scope and need for new developments in haptic augmentation for robot-mediated surgery, with the aim of further improving patient care and robotic surgical technology. Peer reviewed
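The force-limiting rationale described above can be sketched as a minimal, hypothetical console-side force-reflection step. The function name, limit value, and use of NumPy are illustrative assumptions, not taken from the review:

```python
import numpy as np

def clamp_tissue_force(measured_force, tissue_limit):
    """Scale a sensed 3D force vector so its magnitude never exceeds a
    per-tissue safety limit before it is reflected to the console.
    (Hypothetical sketch; not an algorithm from the reviewed literature.)"""
    magnitude = np.linalg.norm(measured_force)
    if magnitude <= tissue_limit:
        return measured_force
    # Preserve direction, cap magnitude at the tissue-specific limit.
    return measured_force * (tissue_limit / magnitude)

# Example: a 12 N sensed force clamped to a hypothetical 5 N soft-tissue limit.
f = clamp_tissue_force(np.array([0.0, 0.0, 12.0]), tissue_limit=5.0)
```

A real system would additionally vary the limit with the tissue type identified by the feedback channel, which is the capability the review argues is still missing from commercial platforms.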

    Mixed reality temporal bone surgical dissector: mechanical design


    Kinesthetic Haptics Sensing and Discovery with Bilateral Teleoperation Systems

    In the field of robotics, bilateral teleoperation is a classic but still growing research topic. In bilateral teleoperation, a human operator moves the master manipulator, and a slave manipulator is controlled to follow the motion of the master in a remote, potentially hostile environment. This dissertation focuses on kinesthetic perception analysis in teleoperation systems. The design of the system controllers is studied as the influential factor, and controllers with different force-tracking capabilities are compared using the same experimental protocol. A 6-DOF teleoperation system is configured as the testbed: an innovative master manipulator is developed, and a 7-DOF redundant manipulator is used as the slave robot. A singularity-avoidance inverse kinematics algorithm is developed to resolve the redundancy of the slave manipulator. An experimental protocol is presented, and three dynamic attributes related to kinesthetic feedback are investigated: weight, center of gravity, and inertia. The results support the hypothesis that a controller providing better force feedback improves performance in the experiments.
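The abstract does not detail the singularity-avoidance inverse kinematics used for the 7-DOF slave arm; damped least-squares (Levenberg–Marquardt) is one standard technique for resolving redundancy while keeping joint velocities bounded near singular configurations, sketched here under that assumption:

```python
import numpy as np

def dls_ik_step(jacobian, task_vel, damping=0.05):
    """Damped least-squares differential IK step.
    Returns joint velocities that track the desired task-space velocity;
    the damping term keeps (J J^T + lambda^2 I) well-conditioned near
    singularities. (Illustrative; the dissertation's algorithm may differ.)"""
    J = np.asarray(jacobian)
    JJt = J @ J.T
    reg = JJt + (damping ** 2) * np.eye(JJt.shape[0])
    return J.T @ np.linalg.solve(reg, np.asarray(task_vel))
```

With zero damping this reduces to the pseudoinverse solution; a nonzero damping trades a small tracking error for bounded joint speeds, which is the usual motivation for this family of methods.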

    Research on real-time physics-based deformation for haptic-enabled medical simulation

    This study developed a versatile and efficient visuo-haptic surgical engine to handle a variety of surgical manipulations in real time. Soft tissue models are based on biomechanical experiments and continuum mechanics for greater accuracy. Such models will increase the realism of future training systems and of VR/AR/MR implementations for the operating room.
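The abstract does not specify the deformation algorithm. A mass-spring network is a common real-time stand-in for continuum-mechanics tissue models in haptic simulators, and a minimal explicit-Euler integration step might look like this (all names and parameter values are illustrative):

```python
import numpy as np

def mass_spring_step(pos, vel, springs, rest_len, k, mass, damping, dt):
    """One explicit-Euler step of a mass-spring soft-tissue model.
    pos, vel: (n, 3) node positions/velocities; springs: list of (i, j)
    index pairs; rest_len: rest length per spring. (Illustrative sketch,
    not the study's biomechanically derived model.)"""
    force = -damping * vel                      # viscous damping on every node
    for (i, j), L0 in zip(springs, rest_len):
        d = pos[j] - pos[i]
        length = np.linalg.norm(d)
        f = k * (length - L0) * (d / length)    # Hooke's law along the spring
        force[i] += f
        force[j] -= f
    vel = vel + (force / mass) * dt
    return pos + vel * dt, vel
```

Real-time haptics typically runs such an update at ~1 kHz; implicit integrators or finite-element models (closer to what the study describes) trade per-step cost for stability and physical accuracy.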

    Visuohaptic Simulation of a Borescope for Aircraft Engine Inspection

    Consisting of a long, fiber-optic probe containing a small CCD camera controlled by a hand-held articulation interface, a video borescope is used for remote visual inspection of hard-to-reach components in an aircraft. The knowledge and psychomotor skills, specifically the hand-eye coordination, required for effective inspection are hard to acquire through limited exposure to the borescope in aviation maintenance schools. Inexperienced aircraft maintenance technicians gain proficiency through repeated hands-on learning in the workplace along a steep learning curve while transitioning from the classroom to the workforce. Using an iterative process combined with focused user evaluations, this dissertation details the design, implementation, and evaluation of a novel visuohaptic simulator for training novice aircraft maintenance technicians in the task of engine inspection using a borescope. First, we describe the development of the visual components of the simulator, along with the acquisition and modeling of a representative model of a PT-6 aircraft engine. Subjective assessments with both expert and novice aircraft maintenance engineers evaluated the visual realism and the control interfaces of the simulator. In addition to visual feedback, probe contact feedback is provided through a specially designed custom haptic interface that simulates tip contact forces as the virtual probe intersects with the 3D model surfaces of the engine. Compared to other haptic interfaces, the custom design is unique in that it is inexpensive and uses a real borescope probe to simulate camera insertion and withdrawal. User evaluation of this simulator with probe tip feedback suggested a trend of improved performance with haptic feedback.

    Next, we describe the development of a physically-based camera model for improved behavioral realism of the simulator. Unlike a point-based camera, the enhanced camera model simulates the interaction of the borescope probe, including multiple points of contact along the length of the probe. We present visual comparisons of a real probe's motion with the simulated probe model and develop a simple algorithm for computing the resultant contact forces. User evaluation comparing our custom haptic device with two commonly available haptic devices, the Phantom Omni and the Novint Falcon, suggests that the improved camera model, as well as probe contact feedback with the 3D engine model, plays a significant role in the overall engine inspection process. Finally, we present results from a skill transfer study comparing classroom-only instruction with both simulator and hands-on training. Students trained using the simulator and the video borescope completed engine inspection using the real video borescope significantly faster than students who received classroom-only training. The speed improvements can be attributed to reduced borescope probe maneuvering time within the engine and improved psychomotor skills due to training. Given the usual constraints of limited time and resources, simulator training may provide beneficial skills needed by novice aircraft maintenance technicians to augment classroom instruction, resulting in a faster transition into the aviation maintenance workforce.
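The "simple algorithm for computing the resultant contact forces" is not given in the abstract; a common penalty-based formulation, which sums a stiffness-scaled force at each penetrating contact along the probe, could be sketched as follows (the function signature and stiffness value are assumptions):

```python
import numpy as np

def resultant_contact_force(penetrations, normals, stiffness=200.0):
    """Penalty-based resultant force for multi-point probe contact.
    penetrations: (n,) penetration depths in metres (negative = no contact);
    normals: (n, 3) unit surface normals at each contact point.
    (Hypothetical sketch of one standard approach, not the dissertation's.)"""
    depths = np.maximum(np.asarray(penetrations), 0.0)  # only penetrating points push back
    forces = stiffness * depths[:, None] * np.asarray(normals)
    return forces.sum(axis=0)   # net force rendered at the haptic interface
```

The resultant would then be rendered through the custom haptic interface as resistance to probe insertion.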

    A virtual hand assessment system for efficient outcome measures of hand rehabilitation

    Previously held under moratorium from 1st December 2016 until 1st December 2021.

    Hand rehabilitation is an extremely complex and critical process in the medical rehabilitation field, mainly due to the highly articulated functionality of the hand. Recent research has focused on employing new technologies, such as robotics and system control, to improve the precision and efficiency of the standard clinical methods used in hand rehabilitation. However, existing devices are either oriented toward a particular hand injury or heavily dependent on subjective assessment techniques to evaluate progress. These limitations reduce the efficiency of hand rehabilitation devices by providing less effective results for restoring the lost functionality of the dysfunctional hand. In this project, a novel and efficient hand assessment system is produced that can objectively measure the restoration outcome and dynamically evaluate its performance. The proposed system uses a data glove to measure the ranges of motion of the hand joints, and a virtual reality system to provide an illustrative and safe visual assistance environment that can self-adjust to the subject's performance. The system implements an original finger performance measurement method for analysing various hand functionalities by extracting multiple features of the digits' motion, such as speed, consistency of finger movements, and stability during hold positions. Furthermore, an advanced data glove calibration method was developed and implemented to accurately manipulate the virtual hand model and calculate hand kinematics in compliance with the biomechanical structure of the hand. The experimental studies were performed on a controlled group of 10 healthy subjects (aged 25 to 42 years).
    The results showed intra-subject reliability between trials (average cross-correlation ρ = 0.7) and inter-subject repeatability across performances (p > 0.01 for the sessions with real objects, with a few departures in some of the virtual reality sessions). In addition, the finger performance values were found to be very efficient in detecting the multiple elements of finger performance, including the load effect on the forearm. Moreover, the electromyography measurements in the virtual reality sessions showed high sensitivity in detecting the tremor effect (mean power frequency difference on the right extensor digitorum muscle of 176 Hz), and the finger performance values for the virtual reality sessions have the same average distance as the real-life sessions (RSQ = 0.07). Besides offering an efficient and quantitative evaluation of hand performance, the system was proven compatible with different hand rehabilitation techniques, where it can outline the primarily affected parts of the hand dysfunction. It can also be easily adjusted to comply with the subject's specifications and clinical hand assessment procedures, autonomously detecting classification task events and analysing them with high reliability. The developed system is also adaptable to disciplines other than hand rehabilitation, such as ergonomic studies, hand robot control, brain-computer interfaces, and other fields involving hand control.
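As one concrete instance of the reliability analysis above, the zero-lag normalized cross-correlation between two equally sampled trial traces reduces to the Pearson correlation coefficient, computable as follows (a standard definition; the thesis's exact procedure is not given in the abstract):

```python
import numpy as np

def trial_consistency(trial_a, trial_b):
    """Zero-lag normalized cross-correlation between two equally sampled
    joint-angle traces: +1 for identical shape, -1 for opposite shape.
    Used here as one possible intra-subject reliability measure."""
    a = np.asarray(trial_a, dtype=float)
    b = np.asarray(trial_b, dtype=float)
    a = a - a.mean()
    b = b - b.mean()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
```

Averaging this coefficient over all trial pairs for a subject would yield a summary value comparable in spirit to the ρ = 0.7 reported above.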

    The evaluation of a novel haptic machining VR-based process planning system using an original process planning usability method

    This thesis provides an original contribution to knowledge by creating a new process planning system: Haptic Aided Process Planning (HAPP), based on the combination of haptics and virtual reality (VR). HAPP creates a simulative machining environment in which process plans are automatically generated from real-time logging of a user's interaction. Further, through the application of a novel usability test methodology, a deeper study of how this approach compares to conventional process planning was undertaken. An abductive research approach was selected, and an iterative and incremental development methodology was chosen. Three development cycles were undertaken, with evaluation studies carried out at the end of each. Each study (pre-pilot, pilot, and industrial) identified progressive refinements to both the usability of HAPP and the usability evaluation method itself. HAPP provided process planners with an environment similar to one with which they are already familiar: visual images represented tools and material, while a haptic interface enabled their movement and positioning by an operator in a manner comparable to their native setting. In this way an intuitive interface was developed that allowed users to plan the machining of parts consisting of features that can be machined on a pillar drill, 2½D-axis milling machine, or centre lathe. The planning activities included single or multiple set-ups, fixturing, and sequencing of cutting operations. The logged information was parsed and output to a process plan, including route sheets, operation sheets, tool lists, and costing information, in a human-readable format.

    The system evaluation revealed that, from an expert planner's perspective, HAPP is perceived to be 70% more satisfying to use, 66% more efficient in completing process plans (primarily due to reduced cognitive load), more effective in producing a higher-quality output of information, and 20% more learnable than a traditional process planning approach.
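The parsing of logged interactions into route sheets is described only at a high level. A minimal sketch of how a time-ordered event log could be collapsed into a machine-grouped operation sequence follows; the event schema and grouping rule are hypothetical, not HAPP's actual format:

```python
def route_sheet(event_log):
    """Collapse a time-ordered interaction log into an operation sequence.
    Each event is (timestamp, machine, operation); consecutive operations
    on the same machine are grouped into one setup. (Illustrative schema;
    HAPP's real log format and parser are not published in this abstract.)"""
    sheet = []
    for _, machine, op in sorted(event_log):   # sort by timestamp
        if sheet and sheet[-1][0] == machine:
            sheet[-1][1].append(op)            # extend the current setup
        else:
            sheet.append([machine, [op]])      # start a new setup
    return [(machine, ops) for machine, ops in sheet]

# Example: two drilling operations logged on the pillar drill, then a lathe operation.
plan = route_sheet([
    (2, "pillar drill", "drill 6 mm"),
    (1, "pillar drill", "centre drill"),
    (3, "centre lathe", "face end"),
])
```

Grouping by consecutive machine use mirrors how a route sheet separates set-ups; a fuller parser would also attach fixturing and tooling records to each setup.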

    Development and evaluation of mixed reality-enhanced robotic systems for intuitive tele-manipulation and telemanufacturing tasks in hazardous conditions

    In recent years, with the rapid development of space exploration, deep-sea discovery, nuclear rehabilitation and management, and robot-assisted medical devices, there is an urgent need for humans to interactively control robotic systems to perform increasingly precise remote operations. The value of medical telerobotic applications during the recent coronavirus pandemic has also been demonstrated and will grow in the future. This thesis investigates novel approaches to the development and evaluation of a mixed reality (MR)-enhanced telerobotic platform for intuitive remote teleoperation in dangerous and difficult working conditions, such as contaminated sites and undersea or extreme welding scenarios. The research aims to remove human workers from harmful working environments by equipping complex robotic systems with human intelligence and command/control via intuitive and natural human-robot interaction, including MR techniques that improve the user's situational awareness, depth perception, and spatial cognition, which are fundamental to effective and efficient teleoperation. The proposed robotic mobile manipulation platform consists of a UR5 industrial manipulator, a 3D-printed parallel gripper, and a customized mobile base, and is envisaged to be controlled by non-skilled operators who are physically separated from the robot workspace through an MR-based vision/motion mapping approach. The platform development process involved CAD/CAE/CAM and rapid prototyping techniques such as 3D printing and laser cutting. The Robot Operating System (ROS) and Unity 3D are employed to enable intuitive control of the robotic system and to ensure an immersive and natural human-robot interactive teleoperation experience.

    This research presents an integrated motion/vision retargeting scheme based on a mixed reality subspace approach for intuitive and immersive telemanipulation. An imitation-based, velocity-centric motion mapping is implemented via the MR subspace to accurately track operator hand movements for robot motion control, enabling spatial velocity-based control of the robot tool center point (TCP). The proposed system allows precise manipulation of end-effector position and orientation while readily adjusting the corresponding velocity of maneuvering. A mixed reality-based multi-view merging framework for immersive and intuitive telemanipulation of a complex mobile manipulator with integrated 3D/2D vision is presented; the proposed 3D immersive telerobotic schemes provide users with depth perception through the merging of multiple 3D/2D views of the remote environment via the MR subspace. The mobile manipulator platform can be effectively controlled by non-skilled operators who are physically separated from the robot workspace through a velocity-based imitative motion mapping approach. Finally, this thesis presents an integrated mixed reality and haptic feedback scheme for intuitive and immersive teleoperation of robotic welding systems. By incorporating MR technology, the user is fully immersed in a virtual operating space augmented by real-time visual feedback from the robot workspace. The proposed mixed reality virtual fixture integration approach implements hybrid haptic constraints that guide the operator's hand movements along a conical guidance, effectively aligning the welding torch and constraining the welding operation within a collision-free area. Overall, this thesis presents a complete telerobotic application that uses mixed reality and immersive elements to effectively translate the operator into the robot's space in an intuitive and natural manner. The results are thus a step forward in cost-effective and computationally efficient human-robot interaction research and technologies. The system presented is readily extensible to a range of potential applications beyond the robotic tele-welding and tele-manipulation tasks used to demonstrate, optimise, and prove the concepts.
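The conical virtual fixture described above can be illustrated with a standard direction-projection sketch: commanded motions inside the guidance cone pass through unchanged, while motions outside it are clamped to the nearest direction on the cone surface. Function and parameter names are assumptions, not the thesis's implementation:

```python
import numpy as np

def conical_fixture(direction, axis, half_angle_rad):
    """Constrain a commanded motion direction to a guidance cone around
    `axis` (e.g., the torch-alignment axis). Returns a unit direction:
    unchanged if inside the cone, otherwise the nearest direction lying
    exactly on the cone boundary. (Illustrative hybrid-constraint sketch.)"""
    d = direction / np.linalg.norm(direction)
    a = axis / np.linalg.norm(axis)
    angle = np.arccos(np.clip(np.dot(d, a), -1.0, 1.0))
    if angle <= half_angle_rad:
        return d
    # Decompose d into axis-parallel and perpendicular parts, then rebuild
    # a direction at exactly the cone's half-angle from the axis.
    perp = d - np.dot(d, a) * a
    perp = perp / np.linalg.norm(perp)
    return np.cos(half_angle_rad) * a + np.sin(half_angle_rad) * perp
```

In a haptic rendering loop, the difference between the commanded and clamped directions would be fed back as a corrective guidance force on the operator's hand.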

    ReachMAN to help sub-acute patients training reaching and manipulation

    Conventional rehabilitation after stroke, consisting of one-to-one practice with a therapist, is labor-intensive and subjective. Furthermore, there is evidence that increased training would benefit the motor function of stroke survivors, though available resources do not allow it. Training with dedicated robotic devices promises to address these problems and to promote motivation through therapeutic games. The goal of this project is to develop a simple robotic system to assist rehabilitation that could easily be integrated into existing hospital environments and rehabilitation centers. A study was first carried out to analyze the kinematics of hand movements while performing representative activities of daily living. Results showed that movements were confined to one plane, so they can be trained using a robot with fewer degrees of freedom (DOF). Hence ReachMAN, a compact 3-DOF robot based on an endpoint-based approach, was developed to train reaching, forearm pronosupination, and grasping, independently or simultaneously. ReachMAN's exercises were developed as software-based games, thereby facilitating active participation from patients. Visual, haptic, and performance feedback were provided to increase motivation, and tuneable levels of difficulty were provided to suit each patient's ability. A pilot study with three subjects was first conducted to evaluate the potential use of ReachMAN as a rehabilitation tool and to determine suitable settings for training. Following positive results from the pilot study, a clinical study was initiated to investigate the effect of rehabilitation using ReachMAN. Preliminary results from 6 subjects show an increase in patients' upper-limb motor activity, range of movement, and smoothness, and a reduction in movement duration. Subjects reported being motivated by the robot training and felt that the robot helped their recovery.

    The results of this thesis suggest that a compact and simple robot such as ReachMAN can be used to enhance recovery in sub-acute stroke patients.
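The smoothness improvement reported above is typically quantified with a jerk-based metric; one common choice, the dimensionless jerk cost, can be sketched as follows (the thesis's exact outcome measure is not specified in this abstract, so this is an illustrative stand-in):

```python
import numpy as np

def normalized_jerk(positions, dt):
    """Dimensionless jerk cost of a 1D reaching trace sampled at interval dt.
    Lower values indicate smoother movement; a straight-line constant-velocity
    trace scores zero. (One of several smoothness metrics used in
    robot-assisted rehabilitation; illustrative, not ReachMAN's measure.)"""
    x = np.asarray(positions, dtype=float)
    jerk = np.diff(x, n=3) / dt ** 3            # third finite difference
    duration = (len(x) - 1) * dt
    amplitude = x.max() - x.min()
    # Normalize by duration^5 / amplitude^2 to make the cost dimensionless.
    return float(np.sqrt(0.5 * np.sum(jerk ** 2) * dt * duration ** 5 / amplitude ** 2))
```

Tracking this cost across therapy sessions gives a quantitative counterpart to the qualitative "increase in smoothness" reported for the six clinical subjects.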