
    Evaluating Human Performance for Image-Guided Surgical Tasks

    The following work focuses on the objective evaluation of human performance for two different interventional tasks: targeted prostate biopsy using a tracked biopsy device, and external ventricular drain placement using a mobile augmented reality device for visualization and guidance. In both tasks, a human performance methodology was applied that respects the trade-off between speed and accuracy for users conducting a series of targeting tasks with each device. This work outlines the development and application of performance evaluation methods using these devices, as well as details of the implementation of the mobile AR application. It was determined that the Fitts' Law methodology can be applied to evaluate tasks performed in each surgical scenario, and that it was sensitive enough to differentiate performance across a range spanning experienced and novice users. This methodology is valuable for future development of training modules for these and other medical devices, and can provide insight into the underlying characteristics of the devices and how they can be optimized with respect to human performance.
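    The Fitts' Law evaluation described above reduces targeting performance to an index of difficulty and a throughput. A minimal sketch of that computation, using the common Shannon formulation; the specific distance, target width, and movement time below are illustrative assumptions, not values from the study:

```python
import math

def index_of_difficulty(distance_mm: float, width_mm: float) -> float:
    """Shannon-formulation Fitts' index of difficulty, in bits:
    ID = log2(D / W + 1), for target distance D and target width W."""
    return math.log2(distance_mm / width_mm + 1)

def throughput(distance_mm: float, width_mm: float, movement_time_s: float) -> float:
    """Throughput in bits/s: index of difficulty divided by movement time."""
    return index_of_difficulty(distance_mm, width_mm) / movement_time_s

# Hypothetical targeting task: 5 mm target, 40 mm away, reached in 1.2 s.
task_id = index_of_difficulty(40, 5)      # log2(9) ≈ 3.17 bits
task_tp = throughput(40, 5, 1.2)          # ≈ 2.64 bits/s
```

Comparing throughput across users at matched indices of difficulty is what allows the methodology to separate novice from experienced performance.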

    Prevalence of haptic feedback in robot-mediated surgery : a systematic review of literature

    © 2017 Springer-Verlag. This is a post-peer-review, pre-copyedit version of an article published in Journal of Robotic Surgery. The final authenticated version is available online at: https://doi.org/10.1007/s11701-017-0763-4. With the successful uptake and inclusion of robotic systems in minimally invasive surgery, and with the increasing application of robotic surgery (RS) in numerous surgical specialities worldwide, there is now a need to develop and enhance the technology further. One such improvement is the implementation and integration of haptic feedback technology into RS, which would permit the operating surgeon on the console to receive haptic information on the type of tissue being operated on. The main advantage of this is to allow the operating surgeon to feel and control the amount of force applied to different tissues during surgery, thus minimising the risk of tissue damage due to both the direct and indirect effects of excessive tissue force or tension applied during RS. We performed a two-rater systematic review to identify the latest developments and potential avenues for improvement in the application and implementation of haptic feedback technology for the operating surgeon on the console during RS. This review provides a summary of technological enhancements in RS, considering different stages of work, from proof of concept to cadaver tissue testing, surgery in animals, and finally real implementation in surgical practice. We identify that, at the time of this review, while there is unanimous agreement regarding the need for haptic and tactile feedback, there are no solutions or products available that address this need. There is scope and a need for new developments in haptic augmentation for robot-mediated surgery, with the aim of further improving patient care and robotic surgical technology. Peer reviewed.

    Guiding Vascular Access with the Sonic Flashlight - Preclinical Development and Validation

    This dissertation concerns the development of a device called the Sonic Flashlight, which employs a novel method for viewing real-time ultrasound images inside the body exactly at the location being scanned. While other augmented reality methods have previously been developed to view ultrasound and other medical imaging modalities within the body, they are generally much more complicated, slower, and less robust than the Sonic Flashlight. In this dissertation, we aim to develop the Sonic Flashlight towards one particular clinical application, central vascular access, and lay the groundwork leading to the first clinical trials. The goal of central vascular access is to insert a catheter into a major vein to deliver medications in large quantities. These veins are usually not visible to the naked eye, so real-time ultrasound is employed to guide the needle into them. While real-time ultrasound guidance significantly enhances the safety of central venous access, learning this skill can be a challenge for the novice user, one major obstacle being the displaced sense of hand-eye coordination that occurs when the operator must look away from the operating field to view the conventional ultrasound monitor. We developed the 5th-generation Sonic Flashlight, as well as a novel calibration method, called thin-gel calibration, as part of this dissertation. The thin-gel system allows us to accurately calibrate the Sonic Flashlight and measure the calibration accuracy. Finally, experiments were conducted with a variety of subject populations using vascular ultrasound phantoms and cadavers to validate Sonic Flashlight guidance, demonstrating that the device is ready for clinical trials.
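    Calibration of a tracked ultrasound device of this kind typically reduces to estimating a rigid transform between corresponding point sets (e.g., fiducial locations measured in image space and in tracker space). The sketch below shows the standard least-squares Kabsch/SVD solution; it is a generic illustration of that step, not the thin-gel method itself, and the point correspondences are assumed to be given:

```python
import numpy as np

def rigid_transform(src: np.ndarray, dst: np.ndarray):
    """Least-squares rigid transform (R, t) such that dst ≈ src @ R.T + t,
    for Nx3 arrays of corresponding points (Kabsch algorithm)."""
    src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (src - src_mean).T @ (dst - dst_mean)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the SVD solution.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_mean - R @ src_mean
    return R, t
```

Once R and t are known, calibration accuracy can be measured as the residual distance between transformed source points and their targets.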

    The role of camera convergence in stereoscopic video see-through augmented reality displays

    In the realm of wearable augmented reality (AR) systems, stereoscopic video see-through displays raise issues related to the user's perception of three-dimensional space. This paper puts forward a few considerations regarding the perceptual artefacts common to standard stereoscopic video see-through displays with fixed camera convergence. Among the possible perceptual artefacts, the most significant is diplopia arising from reduced stereo overlap and excessively large screen disparities. Two state-of-the-art solutions are reviewed. The first suggests a dynamic change, via software, of the virtual camera convergence, whereas the second suggests a matched hardware/software solution based on a series of predefined focus/vergence configurations. The potential and limits of both solutions are outlined so as to provide the AR community with a yardstick for developing new stereoscopic video see-through systems suitable for different working distances.
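    The diplopia problem with fixed convergence can be made concrete with a back-of-the-envelope disparity check. The sketch below uses a small-angle vergence model; the baseline, distances, and the ~1° fusion limit are illustrative assumptions (binocular fusion limits vary with stimulus and eccentricity), not figures from the paper:

```python
import math

def angular_disparity_deg(baseline_m: float, point_dist_m: float,
                          convergence_dist_m: float) -> float:
    """Approximate angular disparity (degrees) of a point at point_dist_m
    when the cameras converge at convergence_dist_m.
    Small-angle model: vergence angle ≈ baseline / distance."""
    return math.degrees(baseline_m / point_dist_m - baseline_m / convergence_dist_m)

def risks_diplopia(baseline_m: float, point_dist_m: float,
                   convergence_dist_m: float, fusion_limit_deg: float = 1.0) -> bool:
    """Flag points whose disparity exceeds an assumed fusion limit."""
    return abs(angular_disparity_deg(baseline_m, point_dist_m,
                                     convergence_dist_m)) > fusion_limit_deg

# With a 65 mm baseline converged at 1 m, a point at 0.3 m shows a large
# disparity, while a point near the convergence distance does not.
near_risk = risks_diplopia(0.065, 0.30, 1.0)   # True
far_risk = risks_diplopia(0.065, 0.95, 1.0)    # False
```

This is exactly why the reviewed solutions adjust convergence, in software or in hardware, as the working distance changes.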

    Mixed-Reality Simulation of Minimally Invasive Surgeries

    Our mixed-reality platform helps train surgeons in minimally invasive surgery and objectively assesses their performance. The platform uses multicamera stereo inside a patient manikin to measure the 3D positions of unmodified surgical instruments. It uses this information to drive a mixed-reality, computer-mediated learning system and provide objective measures of surgical skill.
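    Recovering 3D instrument positions from multicamera stereo ultimately rests on triangulating viewing rays. A minimal sketch of the midpoint method for two calibrated cameras; the ray origins and directions below are illustrative, and in practice they come from the camera calibration:

```python
import numpy as np

def triangulate_midpoint(o1, d1, o2, d2):
    """Midpoint triangulation of two 3D viewing rays o + s*d.
    Assumes the rays are not parallel."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    b = o2 - o1
    c = np.dot(d1, d2)
    denom = 1.0 - c * c  # zero only for parallel rays
    # Parameters of the closest points on each ray (least-squares).
    s = (np.dot(b, d1) - c * np.dot(b, d2)) / denom
    t = (c * np.dot(b, d1) - np.dot(b, d2)) / denom
    # Midpoint between the two closest points.
    return ((o1 + s * d1) + (o2 + t * d2)) / 2.0

# Two rays that intersect at (1, 1, 0):
p = triangulate_midpoint(np.array([0.0, 0, 0]), np.array([1.0, 1, 0]),
                         np.array([2.0, 0, 0]), np.array([-1.0, 1, 0]))
```

With more than two cameras, the same least-squares idea extends to all rays at once, which is what makes tracking unmodified instruments robust to occlusion.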

    Application of virtual reality, augmented reality, and mixed reality in endourology and urolithiasis: An update by YAU endourology and Urolithiasis Working Group

    The integration of virtual reality (VR), augmented reality (AR), and mixed reality (MR) in urological practice and medical education has led to modern training systems that are cost-effective and carry increased expectations toward surgical performance and outcomes. VR aids the user in interacting with the virtual environment realistically by providing a three-dimensional (3D) view of the structures inside the body with high-level precision. AR enhances the real environment around users by integrating experience with virtual information over physical models and objects, which in turn has improved understanding of physiological mechanisms and anatomical structures. MR is an immersive technology that provides virtual content to interact with real elements. The field of urolithiasis has adapted to technological advancements, newer instruments, and methods to perform endourologic treatment procedures. This mini-review discusses the applications of VR, AR, and MR in endourology and urolithiasis.

    Optical versus video see-through head-mounted displays in medical visualization

    We compare two technological approaches to augmented reality for 3-D medical visualization: optical and video see-through devices. We provide context for discussing the technology by reviewing several medical applications of augmented-reality research driven by real needs in the medical field, both in the United States and in Europe. We then discuss the issues for each approach, optical versus video, from both a technology and a human-factors point of view. Finally, we point to potentially promising future developments of such devices, including eye tracking and multifocal-plane capabilities, as well as hybrid optical/video technology.

    Augmented Reality Ultrasound Guidance in Anesthesiology

    Real-time ultrasound has become a mainstay in many image-guided interventions and is increasingly popular in several percutaneous procedures in anesthesiology. One of the main constraints of ultrasound-guided needle interventions is identifying and distinguishing the needle tip from the needle shaft in the image. Augmented reality (AR) environments have been employed to address challenges surrounding surgical tool visualization, navigation, and positioning in many image-guided interventions. The motivation behind this work was to explore the feasibility and utility of such visualization techniques in anesthesiology to address some of the specific limitations of ultrasound-guided needle interventions. This thesis brings together the goals, guidelines, and best development practices for functional AR ultrasound image guidance (AR-UIG) systems, examines the general structure of such systems suitable for applications in anesthesiology, and provides a series of recommendations for their development. The main components of such systems, including ultrasound calibration and system interface design, as well as applications of AR-UIG systems for quantitative skill assessment, were also examined in this thesis. The effects of ultrasound image reconstruction techniques, as well as phantom material and geometry, on ultrasound calibration were investigated. Ultrasound calibration error was reduced by 10% with synthetic transmit aperture imaging compared with B-mode ultrasound. Phantom properties were shown to have a significant effect on calibration error, which varies with the ultrasound beamforming technique. This finding has the potential to alter how calibration phantoms are designed, taking the ultrasound imaging technique into account. The performance of an AR-UIG guidance system tailored to central line insertions was evaluated in novice and expert user studies. While the system outperformed ultrasound-only guidance with novice users, it did not significantly affect the performance of experienced operators. Although the users' extensive experience with ultrasound may have affected the results, certain aspects of the AR-UIG system contributed to the lackluster outcomes, which were analyzed via a thorough critique of the design decisions. The application of an AR-UIG system in quantitative skill assessment was investigated, and the first quantitative analysis of needle-tip localization error in ultrasound in a simulated central line procedure, performed by experienced operators, is presented. Most participants did not closely follow the needle tip in ultrasound, resulting in 42% unsuccessful needle placements and a 33% complication rate. Compared to successful trials, unsuccessful procedures featured a significantly greater (p = 0.04) needle-tip to image-plane distance. Professional experience with ultrasound does not necessarily lead to expert-level performance. Along with deliberate practice, quantitative skill assessment may reinforce clinical best practices in ultrasound-guided needle insertions. Based on the development guidelines, an AR-UIG system was developed to address the challenges in ultrasound-guided epidural injections. For improved needle positioning, this system integrated the A-mode ultrasound signal obtained from a transducer housed at the tip of the needle. Improved needle navigation was achieved via enhanced visualization of the needle in an AR environment, in which B-mode and A-mode ultrasound data were incorporated. The technical feasibility of the AR-UIG system was evaluated in a preliminary user study. The results suggested that the AR-UIG system has the potential to outperform ultrasound-only guidance.
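    The needle-tip to image-plane distance used as a skill metric above is, geometrically, a point-to-plane distance. A minimal sketch, assuming the tracked tip position and the ultrasound image plane (a point on the plane plus its normal) are already expressed in a common tracker coordinate frame:

```python
import numpy as np

def tip_to_plane_distance(tip: np.ndarray, plane_point: np.ndarray,
                          plane_normal: np.ndarray) -> float:
    """Perpendicular distance from a tracked needle tip to the ultrasound
    image plane, defined by a point on the plane and its normal."""
    n = plane_normal / np.linalg.norm(plane_normal)
    return float(abs(np.dot(tip - plane_point, n)))

# Hypothetical example: image plane is z = 0, tip 3 mm out of plane.
dist = tip_to_plane_distance(np.array([5.0, 2.0, 3.0]),
                             np.array([0.0, 0.0, 0.0]),
                             np.array([0.0, 0.0, 1.0]))   # 3.0
```

Logging this distance over a procedure is one way to quantify how closely an operator keeps the tip in the imaging plane.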

    Virtual and Augmented Reality in Medical Education

    Virtual reality (VR) and augmented reality (AR) are two contemporary simulation models that are currently transforming medical education. VR provides a 3D, dynamic view of structures and the ability of the user to interact with them. Recent technological advances in haptics, display systems, and motion detection allow the user to have a realistic and interactive experience, making VR ideal for training in hands-on procedures. Consequently, surgical and other interventional procedures are the main fields of application of VR. AR provides the ability to project virtual information and structures over physical objects, thus enhancing or altering the real environment. The integration of AR applications into the teaching of anatomical structures and physiological mechanisms appears beneficial. Studies have tried to demonstrate the validity and educational effect of many VR and AR applications, in many different areas, employed via various hardware platforms. Some of them even propose a curriculum that integrates these methods. This chapter provides a brief history of VR and AR in medicine, as well as the principles and standards of their function. Finally, the studies that show the effect of implementing these methods in different fields of medical training are summarized and presented.