40 research outputs found

    A Review of Virtual Reality Based Training Simulators for Orthopaedic Surgery

    This review presents current virtual reality based training simulators for hip, knee and other orthopaedic surgery, including elective and trauma surgical procedures. To date there has been no review focussing on hip and knee orthopaedic simulators. A comparison of existing simulator features is provided to identify what is missing and what is required to improve upon current simulators. In total, 11 total hip replacement pre-operative planning tools and 9 hip trauma fracture training simulators were analysed; a further 9 knee arthroscopy simulators and 8 other orthopaedic simulators were included for comparison. The findings are that, for orthopaedic surgery simulators in general, there is increasing use of patient-specific virtual models, which reduce the learning curve. Modelling is also being used for patient-specific implant design and manufacture. Simulators are increasingly being validated for assessment as well as training. There are very few training simulators available for hip replacement, yet more advanced virtual reality is being used for other procedures such as hip trauma and drilling. Training simulators for hip replacement, and orthopaedic surgery in general, lag behind other surgical procedures for which virtual reality has become more common. Further developments are required to bring hip replacement training simulation up to date with other procedures. This suggests there is a gap in the market for a new high-fidelity hip replacement and resurfacing training simulator.

    Virtual Reality Simulation of Glenoid Reaming Procedure

    Glenoid reaming is a bone machining operation in Total Shoulder Arthroplasty (TSA) in which the glenoid bone is resurfaced to make intimate contact with the implant undersurface. While this step is crucial for the longevity of TSA, many surgeons find it technically challenging. With recent advances in Virtual Reality (VR) simulation, it has become possible to realistically replicate complicated operations without any need for patients or cadavers while providing quantitative feedback to improve surgeons' psycho-motor skills. In light of these advantages, this thesis develops the tools and methods required to construct a VR simulator for glenoid reaming, with the aim of providing a reliable tool for preoperative training and planning for surgeons involved with TSA. Toward this end, the thesis presents computational algorithms to appropriately represent the surgical tool and bone in the VR environment, determine their intersections and compute realistic haptic feedback based on those intersections. The core of the computations is constituted by sampled geometrical representations of both objects: a point-cloud model of the tool and a voxelized model of the bone derived from Computed Tomography (CT) images. The thesis shows how to efficiently construct these models and represent them compactly in memory. It also elucidates how to use these models to rapidly determine tool-bone collisions and account for bone removal in real time. Furthermore, the thesis applies cadaveric experimental data to study the mechanics of glenoid reaming and proposes a realistic model for haptic computations. The proposed model integrates well with the developed computational tools, enabling real-time haptic and graphic simulation of glenoid reaming. Throughout the thesis, particular emphasis is placed upon computational efficiency, especially the use of parallel computing on Graphics Processing Units (GPUs). Extensive implementation results are presented to verify the effectiveness of the developments. The results of this thesis not only advance knowledge in the simulation of glenoid reaming, but also contribute rigorously to the broader area of surgery simulation, and can serve as a step toward the wider adoption of VR technology in surgeon training programs.
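    To illustrate the kind of pipeline described above, the sketch below pairs a voxelized bone volume with a point-cloud tool, removes contacted material and returns a simple penalty-style haptic force. It is a minimal CPU/NumPy illustration, not the thesis's GPU implementation; the voxel size, removal rate, stiffness and force model are all assumptions.

```python
# Minimal sketch (not the thesis's actual implementation): a voxelized bone
# model intersected with a point-cloud tool, with simple bone removal and a
# penalty-style haptic force. Names, shapes and constants are assumptions.
import numpy as np

VOXEL_SIZE = 0.5  # mm per voxel, assumed

class VoxelBone:
    def __init__(self, density):             # density: 3D array from CT (0..1)
        self.density = density.astype(np.float32)

    def world_to_index(self, pts):            # pts: (N, 3) in the bone frame, mm
        return np.floor(pts / VOXEL_SIZE).astype(int)

def ream_step(bone, tool_points, tool_pose, removal_rate=0.2, stiffness=0.05):
    """One simulation tick: transform the tool point cloud into the bone frame,
    find occupied voxels it touches, remove material and return a haptic force."""
    R, t = tool_pose                           # rotation (3x3) and translation (3,)
    pts_bone = tool_points @ R.T + t           # tool points expressed in bone frame
    idx = bone.world_to_index(pts_bone)

    # keep only points that fall inside the volume
    shape = np.array(bone.density.shape)
    ok = np.all((idx >= 0) & (idx < shape), axis=1)
    idx = idx[ok]
    if idx.size == 0:
        return np.zeros(3)

    i, j, k = idx.T
    contact_density = bone.density[i, j, k]        # density at contacted voxels
    bone.density[i, j, k] = np.maximum(
        0.0, contact_density - removal_rate)       # remove material

    # crude penalty force: push back along the tool axis, scaled by contact
    tool_axis = R[:, 2]                            # assume tool advances along local z
    return -stiffness * contact_density.sum() * tool_axis

# usage with synthetic data (all values assumed)
bone = VoxelBone(np.ones((64, 64, 64), dtype=np.float32))
tool = np.random.default_rng(0).uniform(-1.0, 1.0, size=(500, 3))
print(ream_step(bone, tool, (np.eye(3), np.array([16.0, 16.0, 16.0]))))
```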

    Virtual and Augmented Reality in Medical Education

    Virtual reality (VR) and augmented reality (AR) are two contemporary simulation models that are currently upgrading medical education. VR provides a 3D and dynamic view of structures and the ability of the user to interact with them. Recent technological advances in haptics, display systems and motion detection allow the user to have a realistic and interactive experience, making VR ideal for training in hands-on procedures. Consequently, surgical and other interventional procedures are the main fields of application of VR. AR provides the ability to project virtual information and structures over physical objects, thus enhancing or altering the real environment. The integration of AR applications into the understanding of anatomical structures and physiological mechanisms seems to be beneficial. Studies have sought to demonstrate the validity and educational effect of many VR and AR applications, in many different areas and via various hardware platforms. Some of them even propose a curriculum that integrates these methods. This chapter provides a brief history of VR and AR in medicine, as well as the principles and standards of their function. Finally, the studies that show the effect of implementing these methods in different fields of medical training are summarized and presented.

    An Overview of Self-Adaptive Technologies Within Virtual Reality Training

    This overview presents the current state of the art of self-adaptive technologies within virtual reality (VR) training. VR training and assessment are increasingly used in five key areas: medical training, industrial and commercial training, serious games, rehabilitation and remote training such as Massive Open Online Courses (MOOCs). Adaptation can be applied to five core technologies of VR: haptic devices, stereo graphics, adaptive content, assessment and autonomous agents. Automation of VR training can contribute to the automation of actual procedures, including remote and robot-assisted surgery, which reduces injury and improves procedural accuracy. Automated haptic interaction can enable tele-presence and tactile interaction with virtual artefacts from either remote or simulated environments. Automation, machine learning and data-driven features play an important role in providing trainee-specific, individually adaptive training content. Data from trainee assessment can serve as input to autonomous systems for customised training and automated difficulty levels matched to individual requirements. Self-adaptive technology has previously been developed within individual technologies of VR training. One conclusion of this research is that no enhanced portable framework yet exists; such a framework is needed, and it would be beneficial to combine automation of the core technologies into a reusable automation framework for VR training.
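    As a concrete illustration of the data-driven adaptation described above, the sketch below adjusts a difficulty level from a rolling window of trainee assessment scores. The scoring scale, target success band, window size and step rule are illustrative assumptions, not part of any framework the overview describes.

```python
# Minimal sketch of data-driven difficulty adaptation for a VR training task.
# Scale, target band and step sizes are illustrative assumptions.
from collections import deque

class DifficultyAdapter:
    def __init__(self, level=1, min_level=1, max_level=10,
                 target=(0.6, 0.8), window=5):
        self.level = level
        self.min_level, self.max_level = min_level, max_level
        self.target_low, self.target_high = target    # desired success-rate band
        self.scores = deque(maxlen=window)             # recent normalized scores (0..1)

    def record(self, score):
        """Store one assessment result and return the (possibly updated) level."""
        self.scores.append(score)
        if len(self.scores) < self.scores.maxlen:
            return self.level                          # wait for enough data
        avg = sum(self.scores) / len(self.scores)
        if avg > self.target_high and self.level < self.max_level:
            self.level += 1                            # trainee is coasting: harder
        elif avg < self.target_low and self.level > self.min_level:
            self.level -= 1                            # trainee is struggling: easier
        return self.level

# usage: feed each completed exercise's normalized score into the adapter
adapter = DifficultyAdapter()
for s in [0.9, 0.85, 0.95, 0.8, 0.9]:
    level = adapter.record(s)
print(level)   # rises to 2 once the window shows sustained high scores
```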

    Investigating Precise Control in Spatial Interactions: Proxemics, Kinesthetics, and Analytics

    Augmented and Virtual Reality (AR/VR) technologies have reshaped the way in which we perceive the virtual world. In fact, recent technological advancements provide experiences that make the physical and virtual worlds almost indistinguishable. However, the physical world affords subtle sensorimotor cues which we subconsciously utilize to perform simple and complex tasks in our daily lives. The lack of this affordance in existing AR/VR systems makes it difficult for them to achieve mainstream adoption over conventional 2D user interfaces. As a case in point, existing spatial user interfaces (SUIs) lack the intuition to perform tasks in a manner that is perceptually familiar from the physical world. The broader goal of this dissertation lies in facilitating an intuitive spatial manipulation experience, specifically for motor control. We begin by investigating the role of proximity to an action on precise motor control in spatial tasks. We do so by introducing a new SUI called the Clock-Maker's Work-Space (CMWS), with the goal of enabling precise actions close to the body, akin to the physical world. On evaluating our setup against conventional mixed-reality interfaces, we find that CMWS affords precise actions for bi-manual spatial tasks. We further compare our SUI with a physical manipulation task and observe similarities in user behavior across both tasks. We subsequently narrow our focus to studying precise spatial rotation. We utilize haptics, specifically force feedback (kinesthetics), to augment fine motor control in spatial rotational tasks. By designing three kinesthetic rotation metaphors, we evaluate precise rotational control with and without haptic feedback for 3D shape manipulation. Our results show that haptics-based rotation algorithms allow for precise motor control in 3D space and also help reduce hand fatigue. In order to understand precise control in its truest form, we investigate orthopedic surgery training by analyzing bone-drilling tasks. We designed a hybrid physical-virtual simulator for bone-drilling training and collected physical data for analyzing precise drilling action. We also developed a Laplacian-based performance metric to help expert surgeons evaluate resident training progress across successive years of orthopedic residency.
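    For intuition about what a Laplacian-based motion metric can look like, the sketch below scores a sampled drill-tip trajectory by its mean squared discrete Laplacian (second difference), so that smoother, more controlled motion scores lower. This is an assumed formulation for exposition only; the dissertation's actual metric is not reproduced here.

```python
# Illustrative sketch only: one plausible Laplacian-style smoothness score for
# a recorded drilling trajectory. The formulation is an assumption.
import numpy as np

def laplacian_smoothness(trajectory, dt):
    """Mean squared discrete Laplacian (second difference) of a sampled
    3D drill-tip trajectory; lower values indicate smoother motion."""
    traj = np.asarray(trajectory, dtype=float)                  # shape (N, 3)
    lap = (traj[2:] - 2.0 * traj[1:-1] + traj[:-2]) / dt**2     # discrete d^2x/dt^2
    return float(np.mean(np.sum(lap**2, axis=1)))

# usage on synthetic data: a jittery path scores worse than a straight one
t = np.linspace(0, 1, 200)
straight = np.stack([t, np.zeros_like(t), np.zeros_like(t)], axis=1)
jittery = straight + 0.001 * np.random.randn(*straight.shape)
print(laplacian_smoothness(straight, dt=t[1] - t[0]))   # ~0 (linear path)
print(laplacian_smoothness(jittery, dt=t[1] - t[0]))    # noticeably larger
```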

    Haptic Training Simulator for Pedicle Screw Insertion in Scoliosis Surgery

    This thesis develops a haptic training simulator that imitates the sensations experienced by a surgeon during pedicle screw insertion in scoliosis surgery. Pedicle screw insertion is a common treatment for fixing spinal deformities in idiopathic scoliosis. Surgeons using the free-hand technique are guided primarily by haptic feedback. A vital step in this technique is the use of a probe to make a channel through the vertebral pedicle. This is a sensitive process which carries the risk of serious mechanical, neurological and vascular complications. Surgeons are currently trained on cadavers or live patients. Cadavers often have vertebrae that are softer than those surgeons would typically encounter, while training on live patients carries the obvious issue of increased risk of complications to the patient. In this thesis, a haptic virtual reality simulator is designed and studied as a training tool for this procedure. Creating a pathway through the pedicle with the free-hand technique involves two main degrees of freedom: rotation and linear progression. The rotary stage of the device, which was developed by a previous student, is enhanced in this research by adding hardware, improving the haptic model and proposing techniques to couple the rotary and linear degrees of freedom. Haptic model parameters for a spine surgery with normal bone density are then clinically tuned within a user study. Over ten surgeons of varying experience levels used the simulator and were able to change various parameters in order to tune the simulator to what felt most realistic. The surgeons also evaluated the simulator for its feasibility and usefulness. Four research questions were investigated. First, can a reference set of values be found that replicates the surgeons' interpretation of the surgical scenario? Second, how are the rotary-stage parameters influenced by the presence of linear effects? Third, do the results differ across expertise levels? Finally, can the simulator serve as a useful tool in the education of surgical trainees for teaching channel creation in pedicle screw insertion? Statistical analyses are carried out to examine these research questions. The results indicate the feasibility of the simulator for surgical education.
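    To make the idea of a tunable haptic model concrete, the sketch below computes a resistive force for the linear (probe-advance) degree of freedom as a piecewise spring-plus-damping response through assumed bone layers. The layer depths, stiffness and damping coefficients are placeholders, not the clinically tuned values reported in the thesis.

```python
# A minimal, hypothetical haptic model for the linear (probe advance) degree of
# freedom: piecewise stiffness and damping per bone layer. Layer depths and
# coefficients are placeholders, not clinically tuned values.
def linear_haptic_force(depth_mm, velocity_mm_s, params=None):
    """Resistive force (N) felt while advancing the probe through the pedicle."""
    if params is None:
        params = [
            # (layer end depth in mm, stiffness N/mm, damping N*s/mm)
            (3.0,  4.0, 0.05),   # posterior cortical shell: stiff entry
            (35.0, 1.5, 0.02),   # cancellous channel: softer, mostly damping
            (40.0, 6.0, 0.08),   # anterior cortex: sharp rise warns of breach
        ]
    layer_start = 0.0
    for layer_end, k, b in params:
        if depth_mm <= layer_end:
            # spring from the layer boundary plus velocity-dependent damping
            return k * (depth_mm - layer_start) + b * max(velocity_mm_s, 0.0)
        layer_start = layer_end
    return 0.0   # past the modelled region (breach): no resistance

# usage: query the force at each haptic update for the current depth and speed
print(linear_haptic_force(depth_mm=2.0, velocity_mm_s=5.0))   # inside cortex
print(linear_haptic_force(depth_mm=10.0, velocity_mm_s=5.0))  # in cancellous bone
```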

    Automated Planning with Multivariate Shape Descriptors for Fibular Transfer in Mandibular Reconstruction

    Objective: This paper introduces methods to automate preoperative planning of fibular segmentation and placement for mandibular reconstruction with fibular flaps. Methods: Preoperative virtual planning for this type of surgery has previously been performed by manual adjustment of many parameters, or based upon a single feature of the reconstruction. We propose a novel planning procedure formulated as a non-convex minimization problem of an objective function built from multivariate shape descriptors. Results: A retrospective study was designed in which 120 reconstruction plans were reproduced by oral surgeons using computed tomography images. The proposed automated planning model was quantitatively compared with both the existing model and the surgeons' plans. Conclusion: The results show that the developed framework attains stable automated planning that agrees with the surgeons' decisions. Significance: This method addresses trade-off problems between symmetric reconstruction and restoration of the native contour of the mandible.
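    As a rough illustration of how such a plan can be posed as a non-convex minimization, the sketch below optimizes a placeholder objective combining a symmetry term and a contour term over a small parameter vector, using simple multi-start local optimization. The parameterization, objective terms and weights are stand-ins; the paper's actual shape descriptors and solver are not reproduced here.

```python
# Sketch only: posing fibular segmentation/placement planning as a non-convex
# minimization over a parameter vector, solved with random restarts.
# The objective terms below are stand-ins, not the paper's shape descriptors.
import numpy as np
from scipy.optimize import minimize

def symmetry_term(x):
    # placeholder: penalize asymmetry of the planned reconstruction
    return np.sum((x[:3] - x[3:6]) ** 2)

def contour_term(x):
    # placeholder: penalize deviation from the native mandibular contour
    return np.sum(np.sin(3.0 * x) ** 2)        # deliberately multimodal

def objective(x, w_sym=1.0, w_contour=0.5):
    return w_sym * symmetry_term(x) + w_contour * contour_term(x)

def plan(n_params=6, n_restarts=20, seed=0):
    """Multi-start local optimization as a simple hedge against local minima."""
    rng = np.random.default_rng(seed)
    best = None
    for _ in range(n_restarts):
        x0 = rng.uniform(-1.0, 1.0, size=n_params)     # random initial plan
        res = minimize(objective, x0, method="L-BFGS-B")
        if best is None or res.fun < best.fun:
            best = res
    return best

best = plan()
print(best.fun, best.x)   # lowest objective value found and its parameters
```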

    Advances on Mechanics, Design Engineering and Manufacturing III

    This open access book gathers contributions presented at the International Joint Conference on Mechanics, Design Engineering and Advanced Manufacturing (JCM 2020), held as a web conference on June 2–4, 2020. It reports on cutting-edge topics in product design and manufacturing, such as industrial methods for integrated product and process design; innovative design; and computer-aided design. Further topics covered include virtual simulation and reverse engineering; additive manufacturing; product manufacturing; engineering methods in medicine and education; representation techniques; and nautical, aeronautics and aerospace design and modeling. The book is organized into four main parts, reflecting the focus and primary themes of the conference. The contributions presented here not only provide researchers, engineers and experts in a range of industrial engineering subfields with extensive information to support their daily work; they are also intended to stimulate new research directions, advanced applications of the methods discussed and future interdisciplinary collaborations.