1,195 research outputs found

    Visual servoing-based augmented reality

    The notion of Augmented Reality (AR) is to mix computer-generated, synthetic elements (3D/2D graphics, 3D audio) with the real world in such a way that the synthetic elements appear to be part of the real world. There are various techniques to accomplish this, including magnetic tracking of position and orientation, and video-based tracking. This paper focuses on the video-based AR…
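    The abstract is truncated above, but the core of video-based AR is estimating the camera pose from image measurements so that graphics can be rendered in registration with the live view. The sketch below is only an illustration of that idea (not this paper's visual-servoing method): it recovers pose from four known marker corners with OpenCV and projects a virtual 3D point into the image. The marker size, camera intrinsics, and detected corner pixels are assumed placeholder values.

```python
import numpy as np
import cv2

# Assumed pinhole intrinsics (fx, fy, cx, cy) and zero lens distortion -- placeholders.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)

# 3D corners of a 10 cm square marker expressed in its own coordinate frame (metres).
marker = np.array([[-0.05, -0.05, 0.0],
                   [ 0.05, -0.05, 0.0],
                   [ 0.05,  0.05, 0.0],
                   [-0.05,  0.05, 0.0]])

def overlay_point(corners_px, virtual_pt):
    """Estimate camera pose from detected marker corners (pixels) and
    project a virtual 3D point (marker frame) into the image."""
    ok, rvec, tvec = cv2.solvePnP(marker, corners_px, K, dist)
    if not ok:
        return None
    img_pt, _ = cv2.projectPoints(virtual_pt.reshape(1, 3), rvec, tvec, K, dist)
    return img_pt.ravel()  # pixel location where the synthetic element should be drawn

# Example: corner positions as they might be detected in one video frame (pixels).
detected = np.array([[300., 200.], [380., 205.], [375., 285.], [295., 280.]])
print(overlay_point(detected, np.array([0.0, 0.0, 0.10])))  # a point 10 cm above the marker
```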

    Augmented Reality Ultrasound Guidance in Anesthesiology

    Real-time ultrasound has become a mainstay in many image-guided interventions and is increasingly popular in several percutaneous procedures in anesthesiology. One of the main constraints of ultrasound-guided needle interventions is identifying and distinguishing the needle tip from the needle shaft in the image. Augmented reality (AR) environments have been employed to address challenges surrounding surgical tool visualization, navigation, and positioning in many image-guided interventions. The motivation behind this work was to explore the feasibility and utility of such visualization techniques in anesthesiology to address some of the specific limitations of ultrasound-guided needle interventions. This thesis brings together the goals, guidelines, and best development practices of functional AR ultrasound image guidance (AR-UIG) systems, examines the general structure of such systems suitable for applications in anesthesiology, and provides a series of recommendations for their development. The main components of such systems, including ultrasound calibration and system interface design, as well as applications of AR-UIG systems for quantitative skill assessment, were also examined.

    The effects of ultrasound image reconstruction techniques, as well as phantom material and geometry, on ultrasound calibration were investigated. Ultrasound calibration error was reduced by 10% with synthetic transmit aperture imaging compared with B-mode ultrasound. Phantom properties were shown to have a significant effect on calibration error, and that effect varies with the ultrasound beamforming technique. This finding has the potential to change how calibration phantoms are designed, taking the ultrasound imaging technique into account.

    The performance of an AR-UIG guidance system tailored to central line insertions was evaluated in novice and expert user studies. While the system outperformed ultrasound-only guidance with novice users, it did not significantly affect the performance of experienced operators. Although the users' extensive experience with ultrasound may have affected the results, certain aspects of the AR-UIG system contributed to the lackluster outcomes, which were analyzed via a thorough critique of the design decisions.

    The application of an AR-UIG system in quantitative skill assessment was investigated, and the first quantitative analysis of needle-tip localization error in ultrasound in a simulated central line procedure, performed by experienced operators, is presented. Most participants did not closely follow the needle tip in ultrasound, resulting in 42% unsuccessful needle placements and a 33% complication rate. Compared to successful trials, unsuccessful procedures featured a significantly greater (p=0.04) needle-tip to image-plane distance. Professional experience with ultrasound does not necessarily lead to expert-level performance. Along with deliberate practice, quantitative skill assessment may reinforce clinical best practices in ultrasound-guided needle insertions.

    Based on the development guidelines, an AR-UIG system was developed to address the challenges in ultrasound-guided epidural injections. For improved needle positioning, this system integrated the A-mode ultrasound signal obtained from a transducer housed at the tip of the needle. Improved needle navigation was achieved via enhanced visualization of the needle in an AR environment in which B-mode and A-mode ultrasound data were incorporated. The technical feasibility of the AR-UIG system was evaluated in a preliminary user study. The results suggested that the AR-UIG system has the potential to outperform ultrasound-only guidance.
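    The skill-assessment analysis above hinges on the needle-tip to image-plane distance. As a hedged illustration (not the thesis's implementation), that metric can be computed as a point-to-plane distance between the tracked tip position and the calibrated ultrasound image plane; the coordinate frames and numbers below are assumed examples only.

```python
import numpy as np

def tip_to_plane_distance(tip_xyz, plane_origin, plane_normal):
    """Perpendicular distance from a tracked needle tip to the ultrasound
    image plane, defined by a point on the plane and its normal.
    All coordinates are assumed to be expressed in the same tracker frame."""
    n = np.asarray(plane_normal, dtype=float)
    n /= np.linalg.norm(n)
    diff = np.asarray(tip_xyz, dtype=float) - np.asarray(plane_origin, dtype=float)
    return abs(np.dot(diff, n))

# Example with made-up tracker coordinates in millimetres: the tip sits a few mm out of plane.
print(tip_to_plane_distance([12.0, 3.5, 40.2], [10.0, 0.0, 38.0], [0.0, 1.0, 0.2]))
```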

    Virtual and Augmented Reality Techniques for Minimally Invasive Cardiac Interventions: Concept, Design, Evaluation and Pre-clinical Implementation

    While less invasive techniques have been employed for some procedures, most intracardiac interventions are still performed under cardiopulmonary bypass, on the drained, arrested heart. Progress toward off-pump intracardiac interventions has been hampered by the lack of adequate visualization inside the beating heart. This thesis describes the development, assessment, and pre-clinical implementation of a mixed reality environment that integrates pre-operative imaging and modeling with surgical tracking technologies and real-time ultrasound imaging. The intra-operative echo images are augmented with pre-operative representations of the cardiac anatomy and virtual models of the delivery instruments tracked in real time using magnetic tracking technologies. As a result, the otherwise context-less images can be interpreted within the anatomical context provided by the anatomical models. The virtual models assist the user with tool-to-target navigation, while real-time ultrasound ensures accurate positioning of the tool on target, providing the surgeon with sufficient information to "see" and manipulate instruments in the absence of direct vision. Several pre-clinical acute evaluation studies were conducted in vivo on swine models to assess the feasibility of the proposed environment in a clinical context. Following direct access inside the beating heart using the UCI, the proposed mixed reality environment was used to provide the necessary visualization and navigation to position a prosthetic mitral valve on the native annulus, or to place a repair patch on a created septal defect in vivo in porcine models. Following further development and seamless integration into the clinical workflow, we hope that the proposed mixed reality guidance environment may become a significant milestone toward enabling minimally invasive therapy on the beating heart.
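    The augmentation described above amounts to expressing the magnetically tracked instrument in the coordinate frame of the real-time echo image by chaining tracking and calibration transforms. The following is a minimal sketch under a standard T_A_B homogeneous-transform naming convention; the frame names and calibration terms are illustrative assumptions, not the thesis's actual software.

```python
import numpy as np

def compose(*transforms):
    """Compose 4x4 homogeneous transforms left to right."""
    out = np.eye(4)
    for T in transforms:
        out = out @ T
    return out

# Assumed frames (names are illustrative):
#   T_tracker_tool  : tool sensor pose reported by the magnetic tracker
#   T_tool_tip      : fixed calibration from the sensor to the instrument tip
#   T_tracker_probe : ultrasound probe sensor pose from the tracker
#   T_probe_image   : probe-to-image (ultrasound) calibration
def tip_in_image(T_tracker_tool, T_tool_tip, T_tracker_probe, T_probe_image):
    """Express the instrument tip in ultrasound image coordinates so the
    virtual tool model can be rendered in the echo image's frame."""
    T_image_tracker = np.linalg.inv(compose(T_tracker_probe, T_probe_image))
    T_tracker_tip = compose(T_tracker_tool, T_tool_tip)
    return compose(T_image_tracker, T_tracker_tip)

# With identity placeholders the tip simply maps to the image origin.
I = np.eye(4)
print(tip_in_image(I, I, I, I))
```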

    A Concept for 3D Damage Mapping with Augmented Reality Technologies


    Evaluating Human Performance for Image-Guided Surgical Tasks

    The following work focuses on the objective evaluation of human performance for two different interventional tasks: targeted prostate biopsy using a tracked biopsy device, and external ventricular drain placement using a mobile augmented reality device for visualization and guidance. In both tasks, a human performance methodology was applied that respects the trade-off between speed and accuracy for users conducting a series of targeting tasks with each device. This work outlines the development and application of performance evaluation methods using these devices, as well as details regarding the implementation of the mobile AR application. It was determined that the Fitts' Law methodology can be applied to evaluate tasks performed in each surgical scenario, and it was sensitive enough to differentiate performance across a range spanning experienced and novice users. This methodology is valuable for future development of training modules for these and other medical devices, and can provide details about the underlying characteristics of the devices and how they can be optimized with respect to human performance.
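    Fitts' Law, referenced above, quantifies the speed-accuracy trade-off by assigning each targeting trial an index of difficulty and measuring throughput. The sketch below uses the standard Shannon formulation; the target distance, width, and movement time are made-up example values, not data from this work.

```python
import math

def index_of_difficulty(distance, width):
    """Shannon formulation of Fitts' index of difficulty (bits): log2(D/W + 1)."""
    return math.log2(distance / width + 1.0)

def throughput(distance, width, movement_time_s):
    """Throughput (bits/s) for one targeting trial: ID divided by movement time."""
    return index_of_difficulty(distance, width) / movement_time_s

# Example: a 60 mm reach to a 10 mm target completed in 1.4 s.
ID = index_of_difficulty(60.0, 10.0)     # ~2.81 bits
print(ID, throughput(60.0, 10.0, 1.4))   # ~2.0 bits/s
```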

    A comparative study of the sense of presence and anxiety in an invisible marker versus a marker Augmented Reality system for the treatment of phobia towards small animals

    Phobia towards small animals has been treated using in vivo exposure and virtual reality. Recently, augmented reality (AR) has also been presented as a suitable tool. The first AR system developed for this purpose used visible markers for tracking; the presence of visible markers warns the user that animals are about to appear. To avoid this warning, this paper presents a second version in which the markers are invisible. First, the technical characteristics of a prototype are described. Second, a comparative study of the sense of presence and anxiety in a non-phobic population using the visible marker-tracking system and the invisible marker-tracking system is presented. Twenty-four participants used the two systems. The participants were asked to rate their anxiety level (from 0 to 10) at 8 different moments. Immediately after their experience, the participants were given the SUS questionnaire to assess their subjective sense of presence. The results indicate that the invisible marker-tracking system induces a similar or higher sense of presence than the visible marker-tracking system, and it also provokes a similar or higher level of anxiety at steps that are important for therapy. Moreover, 83.33% of the participants reported that they did not have the same sensations/surprise using the two systems, and they scored the advantage of using the invisible marker-tracking system (IMARS) at 5.19 ± 2.25 (on a scale from 1 to 10). However, if only the group with higher fear levels is considered, 100% of the participants reported that they did not have the same sensations/surprise with the two systems, scoring the advantage of using IMARS at 6.38 ± 1.60 (on a scale from 1 to 10). Juan, M.; Joele, D. (2011). A comparative study of the sense of presence and anxiety in an invisible marker versus a marker Augmented Reality system for the treatment of phobia towards small animals. International Journal of Human-Computer Studies, 69(6):440-453. doi:10.1016/j.ijhcs.2011.03.002
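    The presence and anxiety comparison described above is a within-subject design, so each participant's ratings under the two tracking systems are compared in a paired fashion. As a hedged sketch only (the ratings below are invented, and the study's actual statistical tests are not specified here), a paired non-parametric test such as the Wilcoxon signed-rank test is one common way to run such a comparison.

```python
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical anxiety ratings (0-10) at one therapy step, one value per
# participant for each system -- illustrative data only, not the study's data.
visible   = np.array([4, 5, 3, 6, 5, 4, 7, 5, 4, 6, 5, 3])
invisible = np.array([5, 7, 4, 7, 6, 6, 8, 6, 5, 8, 6, 4])

# Paired, non-parametric comparison of the two tracking systems.
stat, p = wilcoxon(visible, invisible)
print(f"Wilcoxon W={stat:.1f}, p={p:.3f}")
print(f"visible: {visible.mean():.2f} ± {visible.std(ddof=1):.2f}, "
      f"invisible: {invisible.mean():.2f} ± {invisible.std(ddof=1):.2f}")
```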