
    Virtual Reality Simulator for Training in Myringotomy with Tube Placement

    Myringotomy refers to a surgical incision in the eardrum, and it is often followed by ventilation tube placement to treat middle-ear infections. The procedure is difficult to learn; hence, the objectives of this work were to develop a virtual-reality training simulator, assess its face and content validity, and implement quantitative performance metrics and assess construct validity. A commercial digital gaming engine (Unity3D) was used to implement the simulator with support for 3D visualization of digital ear models and support for major surgical tasks. A haptic arm co-located with the stereo scene was used to manipulate virtual surgical tools and to provide force feedback. A questionnaire was developed with 14 face validity questions focusing on realism and 6 content validity questions focusing on training potential. Twelve participants from the Department of Otolaryngology were recruited for the study. Responses to 12 of the 14 face validity questions were positive. One concern was with contact modeling related to tube insertion into the eardrum, and the second was with movement of the blade and forceps. The former could be resolved by using a higher resolution digital model for the eardrum to improve contact localization. The latter could be resolved by using a higher fidelity haptic device. With regard to content validity, 64% of the responses were positive, 21% were neutral, and 15% were negative. In the final phase of this work, automated performance metrics were programmed and a construct validity study was conducted with 11 participants: 4 senior Otolaryngology consultants and 7 junior Otolaryngology residents. Each participant performed 10 procedures on the simulator and metrics were automatically collected. Senior Otolaryngologists took significantly less time to completion compared to junior residents. Junior residents had 2.8 times more errors as compared to experienced surgeons. 
The senior surgeons also had significantly longer incision lengths, more accurate incision angles, and lower magnification, keeping both the umbo and annulus in view. All metrics were able to discriminate senior Otolaryngologists from junior residents with a significance of p < 0.002. The simulator has sufficient realism, training potential, and performance-discrimination ability to warrant a more resource-intensive skills-transference study.
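
    The automated metrics named above (completion time, error count, incision length and angle) lend themselves to simple per-group aggregation. The sketch below illustrates one way such metrics might be collected and compared across experience groups; the `TrialMetrics` fields and all numbers are illustrative assumptions, not the study's data.

    ```python
    from dataclasses import dataclass
    from statistics import mean

    # Illustrative per-trial metrics, mirroring those named in the abstract.
    @dataclass
    class TrialMetrics:
        completion_time_s: float
        error_count: int
        incision_length_mm: float
        incision_angle_deg: float

    def summarize(trials):
        """Aggregate a participant group's automated metrics across trials."""
        return {
            "mean_time_s": mean(t.completion_time_s for t in trials),
            "mean_errors": mean(t.error_count for t in trials),
            "mean_incision_mm": mean(t.incision_length_mm for t in trials),
        }

    # Made-up numbers for two groups (NOT the study's data).
    senior = [TrialMetrics(95.0, 1, 3.1, 44.0), TrialMetrics(88.0, 0, 2.9, 46.0)]
    junior = [TrialMetrics(160.0, 3, 2.1, 38.0), TrialMetrics(150.0, 3, 2.3, 35.0)]

    s, j = summarize(senior), summarize(junior)
    print(s["mean_time_s"], j["mean_time_s"], j["mean_errors"] / s["mean_errors"])
    ```

    Metrics like these can be logged automatically per trial and summarized per group, which is what makes a construct validity comparison of this kind feasible without manual scoring.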

    Virtual and Augmented Reality in Medical Education

    Virtual reality (VR) and augmented reality (AR) are two contemporary simulation models that are currently transforming medical education. VR provides a 3D and dynamic view of structures and the ability of the user to interact with them. Recent technological advances in haptics, display systems, and motion detection allow the user to have a realistic and interactive experience, making VR ideal for training in hands-on procedures. Consequently, surgical and other interventional procedures are the main fields of application of VR. AR provides the ability to project virtual information and structures over physical objects, thus enhancing or altering the real environment. The integration of AR applications into the understanding of anatomical structures and physiological mechanisms appears to be beneficial. Studies have tried to demonstrate the validity and educational effect of many VR and AR applications, in many different areas, employed via various hardware platforms. Some of them even propose a curriculum that integrates these methods. This chapter provides a brief history of VR and AR in medicine, as well as the principles and standards of their function. Finally, the studies that show the effect of the implementation of these methods in different fields of medical training are summarized and presented.

    VIRTUAL REALITY SIMULATION FOR MYRINGOTOMY TRAINING WITH HAPTIC FEEDBACK

    Myringotomy is a surgical procedure in which an incision is made in the eardrum, primarily to treat middle-ear infections. It is a difficult procedure for surgical residents to master because excellent hand-eye coordination is required to work under a surgical microscope and within the narrow ear canal. We have been developing a virtual-reality-based surgical simulator for training residents. The current simulator does not include tactile feedback, but such feedback is a very important part of ear surgery. Therefore, the objectives of this work were to incorporate haptic feedback capability into our simulator, estimate the haptic parameters of the eardrum, and perform a face validity study to test the effectiveness of the simulator. The results from the face validity study are very encouraging. The simulator is the first of its kind and, with further refinement, has excellent potential to benefit the training of proficient surgical residents.

    HAPTIC AND VISUAL SIMULATION OF BONE DISSECTION

    Marco Agus
    In bone dissection virtual simulation, force restitution is the key to realistically mimicking a patient-specific operating environment. The force is rendered using haptic devices controlled by parametrized mathematical models that represent the bone-burr contact. This dissertation presents and discusses a haptic simulation of a bone-cutting burr that is being developed as a component of a training system for temporal bone surgery. A physically based model was used to describe the burr-bone interaction, including haptic force evaluation, the bone erosion process, and the resulting debris. The model was experimentally validated and calibrated using a custom experimental setup consisting of a force-controlled robot arm holding a high-speed rotating tool and a contact-force measuring apparatus. Psychophysical testing was also carried out to assess individual reaction to the haptic environment. The results suggest that the simulator is capable of rendering the basic material differences required for bone-burring tasks. The current implementation, operating directly on a voxel discretization of patient-specific 3D CT and MR imaging data, is efficient enough to provide real-time haptic and visual feedback on a low-end multi-processing PC platform.
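
    The burr-bone interaction described above operates on a voxel discretization. As a rough illustration only (not the dissertation's calibrated model), a spherical burr tip can erode voxel densities and return a reaction force proportional to the material removed; all function names and parameters here are assumptions.

    ```python
    import numpy as np

    def burr_step(density, center, radius, erosion_rate, stiffness):
        """Erode voxels inside the burr sphere; return a haptic reaction force.

        density      : 3D array in [0, 1], 1 = intact bone
        center       : burr-tip position in voxel coordinates
        radius       : burr radius in voxels
        erosion_rate : fraction of density removed per step
        stiffness    : scales removed material into a force magnitude
        """
        zs, ys, xs = np.indices(density.shape)
        offset = np.stack([zs, ys, xs], axis=-1) - np.asarray(center)
        dist = np.linalg.norm(offset, axis=-1)
        inside = dist <= radius

        removed = density[inside] * erosion_rate
        density[inside] -= removed

        # Force points from each eroded voxel back toward the tool tip,
        # weighted by how much material that voxel contributed.
        dirs = -offset[inside] / np.maximum(dist[inside, None], 1e-9)
        force = stiffness * (removed[:, None] * dirs).sum(axis=0)
        return force

    bone = np.ones((16, 16, 16))
    f = burr_step(bone, center=(8.0, 8.0, 8.0), radius=2.5,
                  erosion_rate=0.3, stiffness=5.0)
    print(f, bone.min())
    ```

    A real implementation would calibrate `erosion_rate` and `stiffness` against measured burr forces, as the dissertation does with its robot-arm setup, and run this update inside the haptic loop at a high fixed rate.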

    Real-time hybrid cutting with dynamic fluid visualization for virtual surgery

    It is widely accepted that a reform in medical teaching must be made to meet today's high-volume training requirements. Virtual simulation offers a potential method of providing such training, and some current medical training simulations integrate haptic and visual feedback to enhance procedure learning. The purpose of this project is to explore the capability of Virtual Reality (VR) technology to develop a training simulator for surgical cutting and bleeding in general surgery.

    Real-time haptic modeling and simulation for prosthetic insertion

    In this work a surgical simulator is produced which enables a training otologist to conduct a virtual, real-time prosthetic insertion. The simulator provides the Ear, Nose and Throat surgeon with real-time visual and haptic responses during virtual cochlear implantation into a 3D model of the human Scala Tympani (ST). The parametric model is derived from measured data published in the literature and accounts for human morphological variance, such as differences in cochlear shape, enabling patient-specific preoperative assessment. Haptic modeling techniques use real physical data and insertion-force measurements to develop a force model which mimics the physical behavior of an implant as it collides with the ST walls during an insertion. Output force profiles acquired from the insertion studies conducted in this work are used to validate the haptic model. The simulator provides the user with real-time, quantitative insertion-force information and the associated electrode position as the user inserts the virtual implant into the ST model. The information provided by this study may also be of use to implant manufacturers for design enhancements, as well as for training specialists in optimal force administration using the simulator. The paper reports on the methods for anatomical modeling and haptic algorithm development, with a focus on simulator design, development, optimization, and validation. The techniques may be transferable to other medical applications that involve prosthetic device insertions where the user's vision is obstructed.
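
    A force model built from measured insertion-force profiles, as described above, can be as simple as a lookup over depth. The sketch below interpolates a force-vs-depth table; the `PROFILE` values are invented placeholders, not the paper's measurements.

    ```python
    from bisect import bisect_right

    # Hypothetical measured profile: (depth_mm, force_N) pairs of the kind
    # an insertion study might acquire. NOT real measurement data.
    PROFILE = [(0.0, 0.00), (5.0, 0.02), (10.0, 0.05), (15.0, 0.12), (20.0, 0.30)]

    def insertion_force(depth_mm):
        """Piecewise-linear lookup of insertion force at a given electrode depth."""
        depths = [d for d, _ in PROFILE]
        if depth_mm <= depths[0]:
            return PROFILE[0][1]
        if depth_mm >= depths[-1]:
            return PROFILE[-1][1]
        i = bisect_right(depths, depth_mm)
        (d0, f0), (d1, f1) = PROFILE[i - 1], PROFILE[i]
        t = (depth_mm - d0) / (d1 - d0)
        return f0 + t * (f1 - f0)

    print(insertion_force(12.5))  # midway between the 10 mm and 15 mm samples
    ```

    In a haptic loop, the interpolated force would be rendered back through the device as the virtual electrode advances, giving the trainee the rising resistance characteristic of deeper insertion.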

    Fully Immersive Virtual Reality for Skull-base Surgery: Surgical Training and Beyond

    Purpose: A virtual reality (VR) system, where surgeons can practice procedures on virtual anatomies, is a scalable and cost-effective alternative to cadaveric training. The fully digitized virtual surgeries can also be used to assess the surgeon's skills using measurements that are otherwise hard to collect in reality. Thus, we present the Fully Immersive Virtual Reality System (FIVRS) for skull-base surgery, which combines surgical simulation software with a high-fidelity hardware setup. Methods: FIVRS allows surgeons to follow normal clinical workflows inside the VR environment. FIVRS uses advanced rendering designs and drilling algorithms for realistic bone ablation. A head-mounted display with ergonomics similar to that of surgical microscopes is used to improve immersiveness. Extensive multi-modal data is recorded for post-analysis, including eye gaze, motion, force, and video of the surgery. A user-friendly interface is also designed to ease the learning curve of using FIVRS. Results: We present results from a user study involving surgeons with various levels of expertise. The preliminary data recorded by FIVRS differentiate between participants with different levels of expertise, suggesting promise for future research on automatic skill assessment. Furthermore, informal feedback from the study participants about the system's intuitiveness and immersiveness was positive. Conclusion: We present FIVRS, a fully immersive VR system for skull-base surgery. FIVRS features a realistic software simulation coupled with modern hardware for improved realism. The system is completely open-source and provides feature-rich data in an industry-standard format. Comment: IPCAI/IJCARS 202

    The feasibility of virtual reality for anatomic training during temporal bone dissection course

    Funding Information: The study was funded by the Academy of Finland (AD Grant No. 333525), State Research Funding of the Kuopio University Hospital (TT Grant No. 5551865, AD Grant No. 5551853), The Finnish ORL-HNS Foundation (TT Grant No. 20210002 and No. 20220027), North Savo Regional Fund (TT Grant No. 65202121, AD Grant No. 65202054), Finnish Cultural Foundation (TT Grant No. 00211098), and The Finnish Society of Ear Surgery. Publisher Copyright: Copyright © 2022 Timonen, Iso-Mustajärvi, Linder, Vrzakova, Sinkkonen, Luukkainen, Laitakari, Elomaa and Dietz.
    Introduction: In recent decades, the lack of educational resources for cadaveric dissections has complicated the hands-on otological surgical training of otorhinolaryngology residents due to the poor availability of cadaver temporal bones, facilities, and limited hours for practice. Since students must gain adequate and patient-safe surgical skills, novel training methods need to be considered. In this proof-of-concept study, a new virtual reality (VR) software is described; this was used during a national temporal bone dissection course where we investigated its feasibility for otological surgical training. Methods: A total of 11 otorhinolaryngology residents attended the annual 2-day hands-on temporal bone dissection course; they were divided into two groups with similar experience levels. Both groups received a lecture on temporal bone anatomy. A total of 22 cadaver temporal bones were harvested for the course; 11 of these bones were imaged by computed tomography. VR software designed for preoperative planning was then used to create 3D models of the imaged temporal bones. Prior to dissection training, the first group underwent a 30-min VR session, where they identified 24 surgically relevant anatomical landmarks on their individual temporal bone. The second group proceeded directly to dissection training. On the second day, the groups were switched.
The feasibility of VR training was assessed with three different metrics: surgical performance evaluation using a modified Hopkins objective structured assessment of technical skill (OSATS), time for the surgical exposure of anatomical landmarks, and the user experience collected with a Likert-scale questionnaire. Results: No differences were noted in the overall performance between the groups. However, participants with prior VR training had a lower mean time for surgical exposure of anatomical landmarks (antrum 22.09 vs. 27.64 min, p = 0.33; incus 60.00 vs. 76.00, p = 0.03; PSCC 71.83 vs. 88.50, p = 0.17) during dissection training. The participants considered VR beneficial for anatomy teaching, surgery planning, and training. Conclusion: This study demonstrated the feasibility of implementing VR training in a temporal bone dissection course. The results indicate that even short expert-guided VR sessions are beneficial, and that VR training prior to dissection has a positive effect on the time needed to perform surgical tasks while maintaining comparable performance scores. Peer reviewed
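
    The landmark-exposure timing comparison reported above reduces to per-landmark group means. A minimal sketch of that kind of comparison (group names and all numbers here are hypothetical, not the study's raw data):

    ```python
    from statistics import mean

    # Invented exposure times (minutes) per landmark for the two course groups.
    vr_first      = {"antrum": [20, 24, 22], "incus": [58, 62, 60]}
    dissect_first = {"antrum": [26, 29, 28], "incus": [74, 78, 76]}

    for landmark in vr_first:
        saved = mean(dissect_first[landmark]) - mean(vr_first[landmark])
        print(f"{landmark}: VR-first group faster by {saved:.1f} min on average")
    ```

    The study pairs such mean differences with significance tests per landmark; with the small group sizes typical of dissection courses, only some differences (incus in the study) reach significance.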

    A Virtual-Based Haptic Endoscopic Sinus Surgery (ESS) Training System: from Development to Validation

    Simulated training platforms offer a suitable avenue for surgical students and professionals to build and improve their skills, without the hassle of traditional training methods. To enhance the realism of simulators' interaction paradigms, substantial work has been done both to model simulated anatomy more realistically and to provide appropriate haptic feedback to the trainee. As such, this chapter discusses the ongoing research being conducted on haptic-feedback-incorporated simulators specifically for Endoscopic Sinus Surgery (ESS). This chapter offers a brief comparative analysis of some ESS simulators, in addition to a deeper quantitative and qualitative look into our approach to designing and prototyping a complete virtual-based haptic ESS training platform.