4 research outputs found

    A Review of Virtual Reality Based Training Simulators for Orthopaedic Surgery

    This review presents current virtual reality based training simulators for hip, knee and other orthopaedic surgery, including elective and trauma surgical procedures. There have not been any reviews focussing on hip and knee orthopaedic simulators. A comparison of existing simulator features is provided to identify what is missing and what is required to improve upon current simulators. In total, 11 total hip replacement pre-operative planning tools were analysed, plus 9 hip trauma fracture training simulators. Additionally, 9 knee arthroscopy simulators and 8 other orthopaedic simulators were included for comparison. The findings are that, for orthopaedic surgery simulators in general, there is increasing use of patient-specific virtual models, which reduce the learning curve. Modelling is also being used for patient-specific implant design and manufacture. Simulators are increasingly being validated for assessment as well as training. There are very few training simulators available for hip replacement, yet more advanced virtual reality is being used for other procedures such as hip trauma and drilling. Training simulators for hip replacement and orthopaedic surgery in general lag behind those for other surgical procedures for which virtual reality has become more common. Further developments are required to bring hip replacement training simulation up to date with other procedures. This suggests there is a gap in the market for a new high fidelity hip replacement and resurfacing training simulator.

    A review of virtual reality based training simulators for orthopaedic surgery

    This is the author accepted manuscript. The final version is available from Elsevier via the DOI in this record. This review presents current virtual reality based training simulators for hip, knee and other orthopaedic surgery, including elective and trauma surgical procedures. There have not been any reviews focussing on hip and knee orthopaedic simulators. A comparison of existing simulator features is provided to identify what is missing and what is required to improve upon current simulators. In total, 11 total hip replacement pre-operative planning tools were analysed, plus 9 hip trauma fracture training simulators. Additionally, 9 knee arthroscopy simulators and 8 other orthopaedic simulators were included for comparison. The findings are that, for orthopaedic surgery simulators in general, there is increasing use of patient-specific virtual models, which reduce the learning curve. Modelling is also being used for patient-specific implant design and manufacture. Simulators are increasingly being validated for assessment as well as training. There are very few training simulators available for hip replacement, yet more advanced virtual reality is being used for other procedures such as hip trauma and drilling. Training simulators for hip replacement and orthopaedic surgery in general lag behind those for other surgical procedures for which virtual reality has become more common. Further developments are required to bring hip replacement training simulation up to date with other procedures. This suggests there is a gap in the market for a new high fidelity hip replacement and resurfacing training simulator. Wessex Academic Health Science Network (Wessex AHSN) Innovation and Wealth Creation Accelerator Fund 2014/15. Bournemouth University.

    Towards Skill Transfer via Learning-Based Guidance in Human-Robot Interaction

    This thesis presents learning-based guidance (LbG) approaches that aim to transfer skills from humans to robots. The approaches capture the temporal and spatial information of human motions and teach the robot to assist humans in human-robot collaborative tasks. In such physical human-robot interaction (pHRI) environments, learning from demonstrations (LfD) enables this skill transfer. Demonstrations can be provided through kinesthetic teaching and/or teleoperation. In kinesthetic teaching, humans directly guide the robot's body to perform a task, while in teleoperation, demonstrations can be provided through motion/vision-based systems or haptic devices. In this work, the LbG approaches are developed through kinesthetic teaching and teleoperation in both virtual and physical environments. First, this thesis compares and analyzes the capability of two types of statistical models, generative and discriminative, to generate haptic guidance (HG) forces as well as to segment and recognize gestures for pHRI, which can be used in virtual minimally invasive surgery (MIS) training. In this learning-based approach, the knowledge and experience of experts are modeled to improve the unpredictable motions of novice trainees. Two statistical models, the hidden Markov model (HMM) and the hidden conditional random field (HCRF), are used to learn gestures from demonstrations in a virtual MIS-related task. The models are developed to automatically recognize and segment gestures as well as to generate guidance forces. In the practice phase, the guidance forces are adaptively calculated in real time based on the similarity between the user's motion and the gesture models. Both statistical models can successfully capture the user's gestures and provide adaptive HG; however, results show the superiority of the HCRF, as a discriminative method, over the HMM, as a generative method, in terms of user performance. In addition, LbG approaches are developed for kinesthetic HRI simulations that aim to transfer the skills of expert surgeons to resident trainees. The discriminative nature of the HCRF is incorporated into the approach to produce LbG forces and to discriminate the skill levels of users. To experimentally evaluate this kinesthetic approach, a femur bone drilling simulation is developed in which residents are provided haptic feedback based on real computed tomography (CT) data, enabling them to feel the variable stiffness of bone layers. Orthopaedic surgeons need to adjust the drilling force because bone layers have different stiffness. In the learning phase, using the simulation, an expert HCRF model is trained from expert surgeons' demonstrations to learn the stiffness variations of the different bone layers. A novice HCRF model is also developed from the demonstrations of novice residents to discriminate the skill level of a new trainee. During the practice phase, the learning-based approach, which encodes the stiffness variations, guides trainees to perform training tasks in a manner similar to the experts' motions. Finally, in contrast to the other parts of the thesis, an LbG approach is developed through teleoperation in a physical environment. The approach assists operators in navigating a teleoperated robot through a haptic steering wheel and a haptic gas pedal. A set of expert operator demonstrations is used to develop a maneuvering skill model. The temporal and spatial variations of the demonstrations are learned using an HMM as the skill model. A modified Gaussian mixture regression (GMR), in combination with the HMM, is also developed to robustly generate the motion during reproduction. The GMR calculates output motions from a joint probability density function of the data rather than directly modeling the regression function. In addition, the distance between the robot and obstacles is incorporated into the impedance control to generate guidance forces that also help operators avoid collisions with obstacles. Using different forms of variable impedance control, guidance forces are computed in real time with respect to the similarity between the user's maneuvers and the skill model. This encourages users to navigate the robot in a manner similar to the expert operators. The results show that user performance is improved in terms of the number of collisions, task completion time, and average closeness to obstacles.
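    The abstract above describes the core LbG mechanism only at a high level: a skill model learned from expert demonstrations (HMM/GMR) supplies a reference motion, and a variable-impedance law turns the deviation between the user's motion and that reference into a guidance force. The sketch below illustrates this idea in Python. It is not the thesis implementation: the GMRGuidance class, the phase-conditioned regression, and the deviation-dependent stiffness law (including the k_max and damping constants) are illustrative assumptions.

```python
import numpy as np

class GMRGuidance:
    """Gaussian mixture regression over joint (phase, position) demonstration data."""

    def __init__(self, means, covs, weights):
        # means: (K, 1+D) with phase first, covs: (K, 1+D, 1+D), weights: (K,)
        self.means, self.covs, self.weights = means, covs, weights

    def predict(self, t):
        """Condition the joint density p(t, x) on phase t to get E[x | t]."""
        hs, mus = [], []
        for w, m, S in zip(self.weights, self.means, self.covs):
            var_t = S[0, 0]                      # variance of the phase dimension
            cov_xt = S[1:, 0]                    # covariance between x and phase
            # responsibility of this component for the current phase value
            h = w * np.exp(-0.5 * (t - m[0]) ** 2 / var_t) / np.sqrt(2 * np.pi * var_t)
            # conditional mean of x given t for this component
            mu = m[1:] + cov_xt / var_t * (t - m[0])
            hs.append(h)
            mus.append(mu)
        hs = np.asarray(hs)
        hs = hs / (hs.sum() + 1e-12)
        return np.sum(hs[:, None] * np.asarray(mus), axis=0)


def guidance_force(model, t, x_user, v_user=None, k_max=30.0, damping=2.0):
    """Variable-impedance guidance: stiffness grows as the user's position
    deviates further from the model's reference (an assumed guidance law)."""
    x_ref = model.predict(t)
    err = x_ref - x_user
    k = k_max * (1.0 - np.exp(-np.linalg.norm(err)))   # deviation-dependent stiffness
    force = k * err
    if v_user is not None:
        force = force - damping * v_user                # damping term for stability
    return force


# Toy usage with two hand-set mixture components over (phase, x, y).
means = np.array([[0.2, 0.0, 0.0],
                  [0.8, 1.0, 0.5]])
covs = np.stack([np.eye(3) * 0.05, np.eye(3) * 0.05])
model = GMRGuidance(means, covs, weights=np.array([0.5, 0.5]))
print(guidance_force(model, t=0.5, x_user=np.array([0.3, 0.1])))
```

    Conditioning the joint density on the phase variable, rather than fitting a regression function directly, is what lets one mixture model serve both for reproduction and for scoring how similar the user's motion is to the demonstrations, which matches the role GMR plays in the abstract.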
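    The femur drilling simulation is described as providing haptic feedback whose stiffness follows real CT data, so trainees feel the transition between bone layers. A minimal sketch of one way such feedback can be rendered follows; the linear Hounsfield-unit-to-stiffness mapping, its constants, and the penalty-style force law are assumptions for illustration, not the simulator's actual rendering model.

```python
import numpy as np

def hu_to_stiffness(hu, k_soft=200.0, k_cortical=3000.0, hu_min=-100.0, hu_max=1500.0):
    """Map a Hounsfield unit value to a spring stiffness [N/m] by linear
    interpolation between soft tissue and cortical bone (an assumed model)."""
    a = np.clip((hu - hu_min) / (hu_max - hu_min), 0.0, 1.0)
    return k_soft + a * (k_cortical - k_soft)

def drilling_feedback_force(ct_volume, voxel_size, tip_pos, tip_dir, penetration):
    """Penalty-style feedback: stiffness is sampled from the CT voxel at the
    drill tip, and the force opposes the drilling direction in proportion to
    the penetration depth."""
    idx = tuple(np.clip((tip_pos / voxel_size).astype(int), 0,
                        np.array(ct_volume.shape) - 1))
    k = hu_to_stiffness(ct_volume[idx])
    return -k * penetration * (tip_dir / np.linalg.norm(tip_dir))

# Toy usage: a synthetic CT block with a denser (stiffer) layer in its upper half.
ct = np.full((32, 32, 32), 300.0)
ct[:, :, 16:] = 1200.0                       # "cortical" layer
f = drilling_feedback_force(ct, voxel_size=1.0,
                            tip_pos=np.array([16.0, 16.0, 20.0]),
                            tip_dir=np.array([0.0, 0.0, 1.0]),
                            penetration=0.002)
print(f)  # stronger resistance once the tip reaches the stiffer layer
```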

    HUMAN-ROBOT COLLABORATION IN ROBOTIC-ASSISTED SURGICAL TRAINING

    Ph.D. (Doctor of Philosophy)