
    Virtual Reality Simulation of Liver Biopsy with a Respiratory Component

    The field of computer-based simulators has grown exponentially in the last few decades, especially in medicine. Advantages of medical simulators include: (1) provision of a platform where trainees can practice procedures without risk of harm to patients; (2) anatomical fidelity; (3) the ability to train in an environment where physiological behaviour is observed, something that is not possible when in-vitro phantoms are used; (4) flexibility regarding anatomical and pathological variation of test cases, which is valuable in the acquisition of experience; (5) quantification of metrics relating to task performance that can be used to monitor trainee performance throughout the learning curve; and (6) cost effectiveness. In this chapter, we focus on the current state of the art of medical simulators, the relevant parameters required to design a medical simulator, the basic framework of the simulator, methods to produce a computer-based model of patient respiration and, finally, a description of a simulator for ultrasound-guided liver biopsy. The model that is discussed presents a framework that accurately simulates respiratory motion, allowing for the fine tuning of relevant parameters in order to produce a patient-specific breathing pattern that can then be incorporated into a simulation with real-time haptic interaction. This work was conducted as part of the CRaIVE collaboration [1], whose aim is to develop simulators specific to interventional radiology.
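    The respiratory model described above is parameterised so it can be tuned to a patient-specific breathing pattern. As a rough illustration of that idea (not the CRaIVE implementation), the sketch below drives a biopsy target with a cos^(2n)-shaped breathing curve; the function names, the coupling factor and all default values are assumptions made for this example.

```python
import numpy as np

def diaphragm_displacement(t, amplitude_mm=15.0, period_s=4.0, shape_n=3):
    """Cranio-caudal diaphragm displacement (mm) at time t (s).

    A cos^(2n) curve dwells near end-exhale and rises briefly to end-inhale,
    mimicking the asymmetry of tidal breathing. amplitude_mm, period_s and
    shape_n are the patient-specific tuning knobs in this toy model; the
    defaults are illustrative, not measured values.
    """
    c = np.cos(np.pi * t / period_s)
    return amplitude_mm * (c * c) ** shape_n

def liver_target(t, rest_mm=(0.0, 0.0, 100.0), coupling=0.8):
    """Move a biopsy target cranially by a fraction of the diaphragm motion."""
    x, y, z = rest_mm
    return np.array([x, y, z - coupling * diaphragm_displacement(t)])

for t in np.linspace(0.0, 8.0, 9):          # two 4-second breathing cycles
    print(f"t={t:3.1f} s  target z = {liver_target(t)[2]:7.2f} mm")
```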

    Interventional radiology virtual simulator for liver biopsy

    Purpose: Training in interventional radiology currently uses the apprenticeship model, where the clinical and technical skills of invasive procedures are learnt during practice on patients. This apprenticeship training method is increasingly limited by regulatory restrictions on working hours, concerns over patient risk through trainees' inexperience and the variable exposure to case mix and emergencies during training. To address this, we have developed a computer-based simulation of visceral needle puncture procedures. Methods: A real-time framework has been built that includes segmentation, physically based modelling, haptics rendering, pseudo-ultrasound generation and the concept of a physical mannequin. It is the result of a close collaboration between different universities, involving computer scientists, clinicians, clinical engineers and occupational psychologists. Results: The technical implementation of the framework is a robust, real-time simulation environment combining a physical platform and an immersive computerized virtual environment. Face, content and construct validation have been assessed previously, showing the reliability and effectiveness of this framework, as well as its potential for teaching visceral needle puncture. Conclusion: A simulator for ultrasound-guided liver biopsy has been developed. It includes functionalities and metrics extracted from cognitive task analysis. This framework can be useful during training, particularly given the known difficulties in gaining significant practice of core skills on patients.
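    The components listed above typically run at very different rates in a needle-puncture simulator: haptic force rendering is usually served at around 1 kHz, while soft-tissue deformation and pseudo-ultrasound generation update at visual frame rates. The sketch below only illustrates that multi-rate structure; the function names are placeholders, not the framework's actual API.

```python
HAPTIC_DT = 1.0 / 1000.0   # haptic servo update, ~1 kHz
VISUAL_DT = 1.0 / 30.0     # deformation + pseudo-ultrasound update, ~30 Hz

def update_deformation(t): pass         # placeholder: physically based soft tissue model
def render_pseudo_ultrasound(t): pass   # placeholder: slice through the deformed volume
def render_haptic_force(t): pass        # placeholder: needle force sent to the device

def run(duration_s=0.1):
    """Toy multi-rate loop: a fast haptic update wrapped around a slower
    visual/physics update, standing in for the framework's real threads."""
    next_visual, t = 0.0, 0.0
    while t < duration_s:
        if t >= next_visual:
            update_deformation(t)
            render_pseudo_ultrasound(t)
            next_visual += VISUAL_DT
        render_haptic_force(t)
        t += HAPTIC_DT

run()
```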

    Real-time haptic modeling and simulation for prosthetic insertion

    In this work, a surgical simulator is produced that enables a trainee otologist to conduct a virtual, real-time prosthetic insertion. The simulator provides the Ear, Nose and Throat surgeon with real-time visual and haptic responses during virtual cochlear implantation into a 3D model of the human Scala Tympani (ST). The parametric model is derived from measured data published in the literature and accounts for human morphological variance, such as differences in cochlear shape, enabling patient-specific pre-operative assessment. Haptic modeling techniques use real physical data and insertion force measurements to develop a force model which mimics the physical behavior of an implant as it collides with the ST walls during an insertion. Output force profiles acquired from the insertion studies conducted in this work are used to validate the haptic model. The simulator provides the user with real-time, quantitative insertion force information and the associated electrode position as the user inserts the virtual implant into the ST model. The information provided by this study may also be of use to implant manufacturers for design enhancements, as well as for training specialists in optimal force administration using the simulator. The paper reports on the methods for anatomical modeling and haptic algorithm development, with a focus on simulator design, development, optimization and validation. The techniques may be transferable to other medical applications that involve prosthetic device insertions where user vision is obstructed.
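    One simple way to turn measured insertion-force profiles into a real-time haptic model is to interpolate a baseline force as a function of insertion depth and add a penalty term when the electrode contacts the ST wall. The sketch below shows that idea only; the depth/force samples, stiffness value and function names are illustrative assumptions, not the data or model from this study.

```python
import numpy as np

# Illustrative (not measured) force-vs-depth samples for an electrode
# advancing into the Scala Tympani: resistance grows with insertion depth.
DEPTH_MM = np.array([0.0, 5.0, 10.0, 15.0, 20.0, 25.0])
FORCE_N  = np.array([0.00, 0.02, 0.05, 0.10, 0.20, 0.45])

def insertion_force(depth_mm):
    """Interpolate the baseline advancement/friction force at a given depth."""
    return float(np.interp(depth_mm, DEPTH_MM, FORCE_N))

def wall_contact_force(penetration_mm, k_wall=0.8):
    """Spring-like penalty when the electrode tip penetrates the ST wall model."""
    return k_wall * max(penetration_mm, 0.0)

def haptic_force(depth_mm, penetration_mm):
    """Total 1-D resistive force rendered to the haptic device each servo tick."""
    return insertion_force(depth_mm) + wall_contact_force(penetration_mm)

if __name__ == "__main__":
    print(haptic_force(depth_mm=12.0, penetration_mm=0.3))
```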

    Computer simulated needle manipulation of Chinese acupuncture with realistic haptic feedback.

    M.Phil. thesis by Leung Ka Man, Chinese University of Hong Kong, 2003 (submitted August 2002); includes bibliographical references (leaves 81-84) and abstracts in English and Chinese. Contents: 1. Introduction (surgical needle simulation, data sources, computer-aided training simulation, existing systems, research goal); 2. Haptization of Needle Interactions (data collection with force measurement, data correlation and expert opinion; general-purpose and tailor-made haptic display devices; haptic models for tissues covering stiffness models, friction models and modelling of needle operations); 3. Haptic Rendering of Bi-directional Needle Manipulation (data source and pre-processing, virtual body surface construction, tissue mapping for haptic rendering, the PHANToM haptic device, force profile analysis, haptic model construction for skin, adipose tissue, muscle and bone, force composition with structure weight compensation, path constraint force and needle axial force, interactive calibration, skin deformation); 4. Parallel Visual-Haptic Rendering (parallel network architecture, visual rendering pipeline, haptic rendering pipeline); 5. User Interface (needle practice with moving mode, acupuncture atlas, training results and user controls; device calibration; model settings); 6. Conclusion (research summary, suggested improvements, future research). Appendices: mapping table for tissues, incremental viscoelastic model, model parameter values.
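    The stiffness and friction models listed in the contents suggest a layered force model: each tissue (skin, adipose tissue, muscle, bone) contributes friction along the needle shaft plus an elastic resistance at the tip. The sketch below is a minimal illustration of such a layered axial-force model; the layer depths and coefficients are assumptions for the example, not values from the thesis.

```python
from dataclasses import dataclass

@dataclass
class Layer:
    name: str
    depth_mm: float      # depth at which this layer begins
    stiffness: float     # N/mm of elastic resistance at the needle tip
    friction: float      # N per mm of needle shaft inside the layer

# Illustrative layer stack; thicknesses and coefficients are assumptions.
LAYERS = [
    Layer("skin",    0.0, 0.30, 0.020),
    Layer("adipose", 3.0, 0.05, 0.005),
    Layer("muscle", 13.0, 0.15, 0.015),
    Layer("bone",   43.0, 5.00, 0.000),
]

def axial_force(tip_depth_mm: float) -> float:
    """Axial resistance = friction along the shaft in every crossed layer,
    plus elastic resistance from the layer currently holding the tip."""
    force = 0.0
    for i, layer in enumerate(LAYERS):
        if tip_depth_mm <= layer.depth_mm:
            break
        end = LAYERS[i + 1].depth_mm if i + 1 < len(LAYERS) else float("inf")
        shaft_in_layer = min(tip_depth_mm, end) - layer.depth_mm
        force += layer.friction * shaft_in_layer
        if tip_depth_mm < end:            # tip is inside this layer
            force += layer.stiffness * min(shaft_in_layer, 1.0)
    return force

print([round(axial_force(d), 3) for d in (1.0, 5.0, 20.0, 44.0)])
```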

    Development and validation of real-time simulation of X-ray imaging with respiratory motion

    We present a framework that combines evolutionary optimisation, soft tissue modelling and ray tracing on the GPU to simultaneously compute respiratory motion and X-ray imaging in real time. Our aim is to provide validated building blocks with high fidelity that closely match both human physiology and the physics of X-rays. A CPU-based set of algorithms is presented to model organ behaviour during respiration. Soft tissue deformation is computed with an extension of the Chain Mail method. Rigid elements move according to kinematic laws. A GPU-based surface rendering method is proposed to compute the X-ray image using the Beer-Lambert law; it is provided as an open-source library. A quantitative validation study is included to objectively assess the accuracy of both components: i) the respiration against anatomical data, and ii) the X-ray simulation against the Beer-Lambert law and the results of Monte Carlo simulations. Our implementation can be used in various applications, such as an interactive medical virtual environment for training percutaneous transhepatic cholangiography in interventional radiology, 2D/3D registration, computation of digitally reconstructed radiographs, and simulation of 4D sinograms to test tomography reconstruction tools.
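    The X-ray component follows the Beer-Lambert law, which attenuates the incident intensity exponentially with the attenuation coefficient and path length of each material crossed by a ray: I = I0 * exp(-sum_i(mu_i * d_i)). The sketch below evaluates that expression for a single ray on the CPU; the coefficients and path lengths are illustrative, and the actual library performs this per pixel with GPU ray tracing.

```python
import numpy as np

def beer_lambert(i0, mu_per_cm, path_lengths_cm):
    """Transmitted intensity through a stack of materials along one ray:
    I = I0 * exp(-sum_i mu_i * d_i)  (monochromatic Beer-Lambert law)."""
    mu = np.asarray(mu_per_cm, dtype=float)
    d = np.asarray(path_lengths_cm, dtype=float)
    return i0 * np.exp(-np.sum(mu * d))

# Illustrative values: linear attenuation coefficients (1/cm) and the
# intersection lengths (cm) of one ray with soft tissue, liver and bone.
print(beer_lambert(1.0, mu_per_cm=[0.20, 0.22, 0.48],
                   path_lengths_cm=[8.0, 6.0, 1.5]))
```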

    Conception of a simulator for a TEL system in orthopaedic surgery.

    Within a research project whose aim is to promote the learning of percutaneous operations in orthopaedic surgery, we investigate representation models of empirical, deductive, and perceptivo-gestural knowledge. From these models, we design a TEL (Technology Enhanced Learning) system. This project belongs to a multidisciplinary field that includes computer science, orthopaedic surgery, medical imaging, didactics and cognitive science. The article presents the design principles of the TEL system, with particular interest in the development of a simulator. This simulator allows a virtual exercise that interacts with the learner in the visual, temporal and haptic dimensions.

    Haptic Training Simulator for Pedicle Screw Insertion in Scoliosis Surgery

    This thesis develops a haptic training simulator that imitates the sensations experienced by a surgeon during pedicle screw insertion in scoliosis surgery. Pedicle screw insertion is a common treatment for fixing spinal deformities in idiopathic scoliosis. Surgeons using the free-hand technique are guided primarily by haptic feedback. A vital step in this free-hand technique is the use of a probe to make a channel through the vertebral pedicle. This is a sensitive process that carries a risk of serious mechanical, neurological and vascular complications. Surgeons are currently trained using cadavers or live patients. Cadavers often have vertebrae that are softer than those surgeons typically encounter in living patients, while training on live patients carries the obvious issue of increased risk of complications to the patient. In this thesis, a haptic virtual reality simulator is designed and studied as a training tool for surgeons in this procedure. Creating a pathway through the pedicle by the free-hand technique involves two main degrees of freedom: rotation and linear progression. The rotary stage of the device, which was developed by a previous student, is enhanced in this research by adding hardware, improving the haptic model and proposing techniques to couple the rotary and linear degrees of freedom. Haptic model parameters for a spine surgery with normal bone density are then clinically tuned within a user study. Over ten surgeons of varying experience levels used the simulator and were able to change various parameters in order to tune the simulator to what felt most realistic. The surgeons also evaluated the simulator for its feasibility and usefulness. Four research questions were investigated. First, can a reference set of values be found that replicates the surgeons' interpretation of the surgical scenario? Second, how are the rotary stage parameters influenced in the presence of linear effects? Third, do the results differ across different expertise levels? Finally, can the simulator serve as a useful tool in the education of surgical trainees for teaching channel creation in pedicle screw insertion? Statistical analyses are carried out to examine the research questions. The results indicate the feasibility of the simulator for surgical education.
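    One way to couple the rotary and linear degrees of freedom described above is to let the resistive torque felt during rotation grow with the probe's linear insertion depth and the local bone density. The sketch below is only a schematic illustration of such a coupling; the functional form, parameter names and values are assumptions, not the clinically tuned haptic model from this thesis.

```python
def probe_torque(angle_rate, depth_mm, bone_density=1.0,
                 base_damping=0.002, depth_gain=0.0008):
    """Resistive torque (N*m) felt while rotating the pedicle probe.

    A simple coupling rule, purely illustrative: viscous-style resistance to
    rotation that grows as the probe advances deeper into (denser) bone.
    """
    damping = base_damping + depth_gain * depth_mm * bone_density
    return -damping * angle_rate

def probe_axial_force(feed_rate, depth_mm, bone_density=1.0,
                      base_resistance=0.5, depth_gain=0.05):
    """Resistive axial force (N) opposing linear progression of the probe."""
    resistance = base_resistance + depth_gain * depth_mm * bone_density
    return -resistance if feed_rate > 0 else 0.0

print(probe_torque(angle_rate=2.0, depth_mm=20.0),
      probe_axial_force(feed_rate=1.0, depth_mm=20.0))
```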

    Development and Validation of a Hybrid Virtual/Physical Nuss Procedure Surgical Trainer

    With continuous advancements in, and adoption of, minimally invasive surgery, proficiency with the nontrivial surgical skills involved is becoming a greater concern. Consequently, the use of surgical simulation has been increasingly embraced for training and skill transfer. Some systems utilize haptic feedback within a high-fidelity, anatomically correct virtual environment, whereas others use manikins, synthetic components, or box trainers to mimic the primary components of a corresponding procedure. Surgical simulation development for some minimally invasive procedures is still, however, suboptimal or otherwise embryonic. This is true for the Nuss procedure, a minimally invasive surgery for correcting pectus excavatum (PE), a congenital chest wall deformity. This work aims to address this gap by exploring the challenges of developing both a purely virtual and a purely physical simulation platform of the Nuss procedure and their implications in a training context. This work then describes the development of a hybrid mixed-reality system that integrates virtual and physical constituents, as well as an augmentation of the haptic interface, to reproduce the primary steps of the Nuss procedure and satisfy clinically relevant prerequisites for its training platform. Furthermore, this work carries out a user study to investigate the system's face, content, and construct validity in order to establish its faithfulness as a training platform.

    Augmented reality (AR) for surgical robotic and autonomous systems: State of the art, challenges, and solutions

    Despite the substantial progress achieved in the development and integration of augmented reality (AR) in surgical robotic and autonomous systems (RAS), the focus of most devices remains on improving end-effector dexterity and precision, as well as on improving access to minimally invasive surgery. This paper aims to provide a systematic review of different types of state-of-the-art surgical robotic platforms while identifying areas for technological improvement. We associate specific control features, such as haptic feedback, sensory stimuli, and human-robot collaboration, with AR technology to perform complex surgical interventions with increased user perception of the augmented world. Researchers in the field have long faced issues with low accuracy in tool placement around complex trajectories, pose estimation, and difficulty in depth perception during two-dimensional medical imaging. A number of robots described in this review, such as Novarad and SpineAssist, are analyzed in terms of their hardware features, computer vision systems (such as deep learning algorithms), and the clinical relevance of the literature. We attempt to outline the shortcomings in current optimization algorithms for surgical robots (such as YOLO and LSTM) whilst providing mitigating solutions for internal tool-to-organ collision detection and image reconstruction. The accuracy of results in robot end-effector collisions and reduced occlusion remains promising within the scope of our research, validating the propositions made for the surgical clearance of ever-expanding AR technology in the future.
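    Tool-to-organ collision detection in such systems ultimately reduces to a proximity query between the tracked instrument and a patient-specific organ model. The sketch below shows the idea at its crudest, using a sampled surface point cloud and a distance threshold; real systems use triangle meshes with bounding-volume hierarchies or signed distance fields, and all names and values here are assumptions.

```python
import numpy as np

def min_distance_to_surface(tool_tip, surface_points):
    """Coarse proximity check: nearest distance from the tool tip to a sampled
    organ surface point cloud. A production system would query a triangle mesh
    via a BVH or a signed distance field; this is only a sketch of the idea."""
    d = np.linalg.norm(surface_points - tool_tip, axis=1)
    return float(d.min())

def collision_warning(tool_tip, surface_points, margin_mm=2.0):
    """Flag an impending tool-to-organ collision, e.g. for an AR overlay."""
    return min_distance_to_surface(tool_tip, surface_points) < margin_mm

rng = np.random.default_rng(0)
organ = rng.normal(loc=[0.0, 0.0, 50.0], scale=5.0, size=(2000, 3))  # fake sampled surface
print(collision_warning(np.array([0.0, 0.0, 44.0]), organ))
```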

    Software for Modeling Ultrasound Breast Cancer Imaging

    Computer-based models are increasingly used in biomedical imaging research to clarify links between anatomical structure, imaging physics, and the information content of medical images. A few three-dimensional breast tissue software models have been developed for mammography simulations to optimize current mammography systems or to test novel systems. It would be beneficial in the development of ultrasound breast imaging to have a similar computational model for simulation. A three-dimensional breast anatomy model with lobular ducts, periductal and intralobular loose fibrous tissue, interlobular dense fibrous tissue, fat, and skin has been implemented. The parenchymal density of the model can be varied from about 20% to 75% to represent a range of clinically relevant densities. The anatomical model was used as a foundation for a three-dimensional breast tumour model. The tumour model was designed to mimic the ultrasound appearance of features used in tumour classification. Simulated two-dimensional ultrasound images were synthesized from the models using a first-order k-space propagation simulator. Similar to clinical ultrasound images, the simulated images of normal breast tissue exhibited non-Rayleigh speckle in regions of interest consisting of primarily fatty, primarily fibroglandular, and mixed tissue types. The simulated images of tumours reproduced several shape and margin features used in breast tumour diagnosis. The ultrasound wavefront distortion produced in simulations using the anatomical model was evaluated. A second method of modeling wavefront distortion was also proposed, in which 10 to 12 irregularly shaped, strongly scattering inclusions were superimposed on multiple parallel time-shift screens to create the screen-inclusion model. Simulations of planar pulsed-wave propagation through the two proposed models, a conventional parallel time-shift screen model, and digitized breast tissue specimens were compared. The anatomical model and the screen-inclusion model were able to produce arrival-time fluctuation and energy-level fluctuation characteristics comparable to the digitized tissue specimens, which the parallel-screen model was unable to reproduce. This software is expected to be valuable for imaging simulations that require accurate and detailed representation of the ultrasound characteristics of breast tumours.
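    A standard way to check whether simulated speckle is Rayleigh or non-Rayleigh, as discussed above, is the envelope signal-to-noise ratio (mean divided by standard deviation of the envelope), which is approximately 1.91 for fully developed Rayleigh speckle. The sketch below computes that statistic for a purely diffuse and a partly coherent synthetic region of interest; it is an illustration of the test, not the k-space simulator used in this work.

```python
import numpy as np

def envelope_snr(envelope):
    """Speckle SNR = mean/std of the envelope in a region of interest.
    Fully developed (Rayleigh) speckle gives SNR close to 1.91; departures
    from that value are one simple indicator of non-Rayleigh statistics."""
    envelope = np.asarray(envelope, dtype=float)
    return envelope.mean() / envelope.std()

rng = np.random.default_rng(1)

# Many random scatterers per resolution cell -> Rayleigh envelope statistics.
diffuse = np.abs(rng.normal(size=10000) + 1j * rng.normal(size=10000))

# Add a coherent (specular-like) component -> non-Rayleigh (Rician) statistics.
coherent = np.abs(3.0 + rng.normal(size=10000) + 1j * rng.normal(size=10000))

print(f"diffuse ROI SNR  : {envelope_snr(diffuse):.2f}  (Rayleigh ~1.91)")
print(f"coherent ROI SNR : {envelope_snr(coherent):.2f}  (non-Rayleigh)")
```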