
    Virtual reality training and assessment in laparoscopic rectum surgery

    Background: Virtual-reality (VR)-based simulation techniques offer an efficient and low-cost alternative to conventional surgical training. This article describes a VR training and assessment system for laparoscopic rectum surgery. Methods: To achieve a realistic visual rendering of the interaction between membrane tissue and surgical tools, a generalized-cylinder-based collision detection method and a multi-layer mass-spring model are presented. A dynamic assessment model is also designed for hierarchical training evaluation. Results: With this simulator, trainees can operate on the virtual rectum with simultaneous visual and haptic feedback. The system also gives surgeons real-time instructions when improper manipulation occurs. The simulator has been tested and evaluated by ten subjects. Conclusions: This prototype system has been verified by colorectal surgeons through a pilot study. They judged the visual performance and tactile feedback to be realistic. The system shows potential to effectively improve the surgical skills of trainee surgeons and significantly shorten their learning curve. © 2014 John Wiley & Sons, Ltd.
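    The abstract names a multi-layer mass-spring model for the membrane tissue but gives no equations; the sketch below is a generic single-step mass-spring update (explicit Euler) for illustration only. The function name, array layout, and constants are assumptions, not the authors' implementation.

        import numpy as np

        def mass_spring_step(pos, vel, springs, rest_len, k, damping, mass, dt, ext_force):
            """One explicit-Euler step of a mass-spring mesh (illustrative only).

            pos, vel  : (N, 3) node positions and velocities
            springs   : (M, 2) integer index pairs (intra- or inter-layer links)
            rest_len  : (M,) rest lengths; k is the spring stiffness
            ext_force : (N, 3) external forces, e.g. from tool contact
            """
            force = ext_force - damping * vel                      # viscous damping
            d = pos[springs[:, 1]] - pos[springs[:, 0]]            # spring vectors
            length = np.linalg.norm(d, axis=1, keepdims=True)
            f = k * (length - rest_len[:, None]) * d / np.maximum(length, 1e-9)
            np.add.at(force, springs[:, 0], f)                     # equal and opposite
            np.add.at(force, springs[:, 1], -f)                    # spring forces
            vel = vel + dt * force / mass
            pos = pos + dt * vel
            return pos, vel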

    Development and Validation of a Hybrid Virtual/Physical Nuss Procedure Surgical Trainer

    With the continuous advancement and adoption of minimally invasive surgery, proficiency with the nontrivial surgical skills involved is becoming a greater concern. Consequently, surgical simulation has been increasingly embraced for training and skill-transfer purposes. Some systems use haptic feedback within a high-fidelity, anatomically correct virtual environment, whereas others use manikins, synthetic components, or box trainers to mimic the primary components of a corresponding procedure. For some minimally invasive procedures, however, surgical simulation development is still suboptimal or embryonic. This is true for the Nuss procedure, a minimally invasive surgery for correcting pectus excavatum (PE), a congenital chest wall deformity. This work addresses this gap by exploring the challenges of developing both a purely virtual and a purely physical simulation platform for the Nuss procedure and their implications in a training context. It then describes the development of a hybrid mixed-reality system that integrates virtual and physical constituents, together with an augmented haptic interface, to reproduce the primary steps of the Nuss procedure and satisfy clinically relevant prerequisites for a training platform. Finally, a user study investigates the system's face, content, and construct validity to establish its faithfulness as a training platform.

    Accuracy of navigated cam resection in femoroacetabular impingement: A randomised controlled trial.

    BACKGROUND: The main cause for revision hip arthroscopy is incomplete bony resection of femoroacetabular impingement (FAI). This study aimed to compare the accuracy of cam resection with the conventional hip arthroscopy technique versus a navigated technique. METHODS: Two prospectively randomised groups were recruited: navigated (n = 15) and conventional (n = 14). A pre-operative CT and a post-operative MRI scan were obtained in all cases to compare the alpha angle, simulate range of motion, and determine a pre-operative 3D surgical resection plan. RESULTS: Post-operatively, the mean maximal alpha angle improved significantly in the navigated group compared with the conventional group (55° vs. 66°; P = 0.023), especially at the 12 o'clock position (45° vs. 60°; P = 0.041). However, positioning time and radiation exposure were significantly greater in the navigated group. CONCLUSION: Navigated surgery is effective in helping restore normal anatomy in patients with cam-type FAI, but it is not without drawbacks. Larger studies will be required to validate these results. Jan Van Houcke was supported by a doctoral grant of the Research Foundation‐Flanders.
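    For readers unfamiliar with the outcome measure, the alpha angle quantifies the cam deformity as the angle between the femoral neck axis and the line from the head centre to the first contour point that leaves the best-fit head circle. The snippet below is a small geometric illustration of that definition, not the measurement pipeline used in the study; all inputs (head fit, neck point, ordered contour) are assumed.

        import numpy as np

        def alpha_angle(head_center, head_radius, neck_point, contour, tol=1.0):
            """Alpha angle in degrees from a 2D head-neck contour (illustrative).

            contour is assumed ordered from the head toward the neck along the
            anterior surface; head_center and head_radius come from a circle fit.
            """
            head_center = np.asarray(head_center, float)
            neck_axis = np.asarray(neck_point, float) - head_center
            for p in contour:
                v = np.asarray(p, float) - head_center
                if np.linalg.norm(v) > head_radius + tol:   # contour leaves the head circle
                    cos_a = v @ neck_axis / (np.linalg.norm(v) * np.linalg.norm(neck_axis))
                    return float(np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))))
            return None                                     # no deviation beyond the circle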

    FPGA-based High-Performance Collision Detection: An Enabling Technique for Image-Guided Robotic Surgery

    Collision detection, the computational problem of determining the relative placement or configuration of two or more objects, is an essential component of many applications in computer graphics and robotics. In image-guided robotic surgery, real-time collision detection is critical for preserving healthy anatomical structures during the procedure. However, the computational complexity of the problem usually results in algorithms that operate at low speed. In this paper, we present a fast and accurate algorithm for collision detection between oriented bounding boxes (OBBs) that is suitable for real-time implementation. Our Sweep and Prune algorithm performs a preliminary filtering step that reduces the number of objects to be tested by the classical Separating Axis Test while preserving the OBB pairs of interest; these pairs are then re-checked by the Separating Axis Test to obtain their exact overlap status. To accelerate execution, the Sweep and Prune algorithm is tailored to the proposed method, and a high-performance, scalable hardware architecture is derived by analyzing the intrinsic parallelism of the algorithm and implemented on an FPGA platform. Results show that the hardware design on the FPGA achieves around 8× higher running speed than a software implementation on a CPU, reaching a collision frame rate of 1 kHz and fulfilling the requirements of robot-assisted laparoscopy.
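    To make the two-stage pipeline concrete, the sketch below shows a plain software version of a Sweep and Prune broad phase followed by the exact Separating Axis Test on OBB pairs. It illustrates the general technique only; the paper's tailored algorithm and FPGA data paths are not reproduced, and the class and function names are assumptions.

        import numpy as np

        class OBB:
            def __init__(self, center, half_extents, rotation):
                self.c = np.asarray(center, float)        # box centre
                self.e = np.asarray(half_extents, float)  # half-sizes along local axes
                self.R = np.asarray(rotation, float)      # 3x3 matrix, columns = local axes

            def aabb(self):
                # Conservative axis-aligned bounds used by the broad phase.
                r = np.abs(self.R) @ self.e
                return self.c - r, self.c + r

        def sweep_and_prune(boxes, axis=0):
            # Broad phase: sort interval endpoints on one axis and keep only
            # the pairs whose intervals overlap; everything else is pruned.
            intervals = sorted((b.aabb()[0][axis], b.aabb()[1][axis], i)
                               for i, b in enumerate(boxes))
            pairs, active = [], []
            for lo, hi, i in intervals:
                active = [(h, j) for h, j in active if h >= lo]
                pairs += [(min(i, j), max(i, j)) for _, j in active]
                active.append((hi, i))
            return pairs

        def sat_overlap(a, b):
            # Narrow phase: exact OBB-OBB test over the 15 candidate separating
            # axes (3 face normals per box plus 9 edge-edge cross products).
            axes = list(a.R.T) + list(b.R.T)
            axes += [np.cross(u, v) for u in a.R.T for v in b.R.T]
            d = b.c - a.c
            for ax in axes:
                n = np.linalg.norm(ax)
                if n < 1e-9:                              # parallel edges: degenerate axis
                    continue
                ax = ax / n
                ra = np.sum(a.e * np.abs(a.R.T @ ax))     # projection radius of a
                rb = np.sum(b.e * np.abs(b.R.T @ ax))     # projection radius of b
                if abs(d @ ax) > ra + rb:
                    return False                          # found a separating axis
            return True

        # Candidate pairs survive the broad phase, then get the exact check:
        # hits = [p for p in sweep_and_prune(boxes) if sat_overlap(boxes[p[0]], boxes[p[1]])]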

    Virtual Reality Based Environment for Orthopedic Surgery (Veos)

    The traditional way of teaching surgery involves students observing a "live" surgery and then gradually assisting experienced surgeons. The creation of a virtual reality environment for orthopedic surgery (VEOS) can improve the quality of training while decreasing the time needed for it; developing such virtual environments for educational and training purposes can supplement existing approaches. This research describes the design and development of a virtual-reality-based environment for orthopedic surgery. The scope of the simulation environment is restricted to an orthopedic procedure known as Less Invasive Stabilization System (LISS) surgery. The primary knowledge source for the LISS surgical process was Miguel A. Pirela-Cruz (Head of Orthopedic Surgery and Rehabilitation, Texas Tech University Health Sciences Center (TTHSC)). The VEOS was designed and developed on a PC-based platform and validated through interactions with surgical residents at TTHSC. Feedback from the residents and from our collaborator Miguel A. Pirela-Cruz was used to make the necessary modifications to the surgical environment.

    Modeling and rendering for development of a virtual bone surgery system

    A virtual bone surgery system is developed to provide a realistic, safe, and controllable environment for surgical education. It can be used for training in orthopedic surgery, as well as for planning and rehearsal of bone surgery procedures... Using the developed system, the user can perform virtual bone surgery by simultaneously seeing bone material removal through a graphic display device, feeling the force via a haptic device, and hearing the sound of tool-bone interaction --Abstract, page iii
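    The abstract does not specify how bone material removal is represented; one common approach in virtual bone surgery is voxel-based carving with a spherical burr, sketched below purely as an illustration (the function name and density-array layout are assumptions, not this system's method).

        import numpy as np

        def carve(density, center, radius):
            """Remove bone inside a spherical burr volume (illustrative only).

            density : 3D array of bone density values (modified in place)
            center  : burr-tip position in voxel coordinates; radius in voxels
            """
            i, j, k = np.ogrid[:density.shape[0], :density.shape[1], :density.shape[2]]
            inside = (i - center[0])**2 + (j - center[1])**2 + (k - center[2])**2 <= radius**2
            removed = float(density[inside].sum())   # material removed this step;
            density[inside] = 0.0                    # could drive force and sound feedback
            return removed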

    Research on real-time physics-based deformation for haptic-enabled medical simulation

    This study developed an effective visuo-haptic surgical engine that handles a variety of surgical manipulations in real time. The soft-tissue models are based on biomechanical experiments and continuum mechanics for greater accuracy. Such models will increase the realism of future training systems and of VR/AR/MR implementations for the operating room.

    Visualisation of articular motion in orthopaedics

    Shoulder replacement surgery is difficult, with a relatively large risk of limited post-operative range of motion for patients. Adapting the anatomy of a joint by placing a prosthesis affects its articulation. In this thesis we present a software system that simulates and visualises these effects. By loading a CT scan of a patient's shoulder, we can simulate the range of motion of the joint and visualise limitations caused by the rigid structures of the joint. Surgeons can set up an operation plan and see what consequences the operation will have for the patient's range of motion. The thesis investigates the aspects relevant to such a system. We describe an algorithm to convert the scan data to bone models, together with a validation experiment, and present a method for motion registration and visualisation of recorded kinematic data. Finally, the thesis concerns the application of the system to different surgical problems, such as hip arthroplasty and shoulder fractures.
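    The thesis describes its own algorithm for converting CT data to bone models, which is not reproduced here; as a point of reference, the sketch below shows a standard threshold-plus-marching-cubes conversion using scikit-image, with the threshold value and function name chosen for illustration only.

        from skimage import measure

        def ct_to_bone_mesh(volume_hu, spacing=(1.0, 1.0, 1.0), bone_threshold=300.0):
            """Extract a triangle mesh of bone from a CT volume in Hounsfield units."""
            verts, faces, normals, _ = measure.marching_cubes(
                volume_hu, level=bone_threshold, spacing=spacing)
            return verts, faces, normals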