
    DYNAMIC MEASUREMENT OF THREE-DIMENSIONAL MOTION FROM SINGLE-PERSPECTIVE TWO-DIMENSIONAL RADIOGRAPHIC PROJECTIONS

    The digital evolution of the x-ray imaging modality has spurred the development of numerous clinical and research tools. This work focuses on the design, development, and validation of dynamic radiographic imaging and registration techniques to address two distinct medical applications: tracking during image-guided interventions, and the measurement of musculoskeletal joint kinematics. Fluoroscopy is widely employed to provide intra-procedural image guidance. However, its planar images provide limited information about the location of surgical tools and targets in three-dimensional space. To address this limitation, registration techniques, which extract three-dimensional tracking and image-guidance information from planar images, were developed and validated in vitro. The ability to accurately measure joint kinematics in vivo is an important tool in studying both normal joint function and the pathologies associated with injury and disease; however, it remains a clinical challenge. A technique to measure joint kinematics from single-perspective x-ray projections was developed and validated in vitro, using clinically available radiography equipment.
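    As a rough sketch of the registration idea described above, a single rotational degree of freedom can be recovered from one planar projection by searching for the pose whose simulated projection best matches the measured one. The point-source geometry, marker coordinates, and grid search below are illustrative assumptions, not the dissertation's actual method.

    ```python
    import math

    SID = 1000.0  # assumed source-to-detector distance (mm)

    def project(points, theta):
        """Rotate 3D points about the y-axis by theta, then perspective-project."""
        c, s = math.cos(theta), math.sin(theta)
        proj = []
        for x, y, z in points:
            xr, zr = c * x + s * z, -s * x + c * z
            mag = SID / (SID - zr)  # point-source magnification
            proj.append((mag * xr, mag * y))
        return proj

    def register(points, measured):
        """Grid-search the rotation minimizing the 2D reprojection error."""
        best_err, best_theta = float("inf"), 0.0
        for i in range(-900, 901):  # +/-90 degrees in 0.1-degree steps
            theta = math.radians(i / 10)
            err = sum((u - mu) ** 2 + (v - mv) ** 2
                      for (u, v), (mu, mv) in zip(project(points, theta), measured))
            if err < best_err:
                best_err, best_theta = err, theta
        return best_theta

    # Synthetic check: three hypothetical markers rotated by a known 12-degree angle
    markers = [(10.0, 0.0, 5.0), (-8.0, 12.0, -3.0), (0.0, -15.0, 9.0)]
    measured = project(markers, math.radians(12.0))
    estimate = register(markers, measured)  # close to 12 degrees
    ```

    A real 2D/3D registration optimizes all six pose parameters against image intensities or segmented features; the one-parameter search only shows the simulate-compare-update loop.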

    AUGMENTED REALITY AND INTRAOPERATIVE C-ARM CONE-BEAM COMPUTED TOMOGRAPHY FOR IMAGE-GUIDED ROBOTIC SURGERY

    Minimally invasive robotic-assisted surgery is a rapidly growing alternative to traditional open and laparoscopic procedures; nevertheless, challenges remain. The standard of care derives surgical strategies from preoperative volumetric data (i.e., computed tomography (CT) and magnetic resonance (MR) images) that benefit from the ability of multiple modalities to delineate different anatomical boundaries. However, preoperative images may not reflect a possibly highly deformed perioperative setup or intraoperative deformation. Additionally, in current clinical practice, the correspondence of preoperative plans to the surgical scene is established as a mental exercise; the accuracy of this practice is therefore highly dependent on the surgeon's experience and subject to inconsistencies. To address these fundamental limitations in minimally invasive robotic surgery, this dissertation combines a high-end robotic C-arm imaging system and a modern robotic surgical platform into an integrated intraoperative image-guided system. We performed deformable registration of preoperative plans to a perioperative cone-beam computed tomography (CBCT) scan, acquired after the patient is positioned for intervention. From the registered surgical plans, we overlaid critical information onto the primary intraoperative visual source, the robotic endoscope, using augmented reality. Guidance afforded by this system not only fuses virtual medical information through augmented reality, but also provides tool localization and other dynamically updated intraoperative behavior, presenting enhanced depth feedback and information to the surgeon. These techniques in guided robotic surgery required a streamlined approach to creating intuitive and effective human-machine interfaces, especially in visualization. Our software design principles create an inherently information-driven modular architecture incorporating robotics and intraoperative imaging through augmented reality.
    The system's performance is evaluated using phantoms and preclinical in-vivo experiments for multiple applications, including transoral robotic surgery, robot-assisted thoracic interventions, and cochleostomy for cochlear implantation. The resulting functionality, proposed architecture, and implemented methodologies can be further generalized to other C-arm-based image guidance and to additional extensions in robotic surgery.
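    The overlay step described above can be sketched as mapping a surgical target, already registered in CBCT coordinates, into endoscope pixel coordinates with a pinhole camera model. The intrinsics and pose below are invented for illustration, not calibrated values from the dissertation.

    ```python
    FX, FY = 800.0, 800.0   # assumed endoscope focal lengths (px)
    CX, CY = 320.0, 240.0   # assumed principal point (px)

    def overlay_pixel(p_cbct, R, t):
        """Apply the camera-from-CBCT pose, then pinhole-project to pixels."""
        x, y, z = (sum(R[i][j] * p_cbct[j] for j in range(3)) + t[i] for i in range(3))
        return FX * x / z + CX, FY * y / z + CY

    # Identity pose; target 100 mm ahead of the lens, offset 10 mm in x and y
    R = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
    t = [0.0, 0.0, 0.0]
    u, v = overlay_pixel([10.0, -10.0, 100.0], R, t)  # (400.0, 160.0)
    ```

    In the full system the pose comes from hand-eye calibration and the deformable CBCT registration; this fragment only shows how a registered 3D point lands on the endoscope image.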

    Advancements and Breakthroughs in Ultrasound Imaging

    Ultrasonic imaging is a powerful diagnostic tool available to medical practitioners, engineers, and researchers today. Due to its relative safety and non-invasive nature, ultrasonic imaging has become one of the most rapidly advancing technologies. These rapid advances are directly related to parallel advancements in electronics, computing, and transducer technology, together with sophisticated signal processing techniques. This book focuses on state-of-the-art developments in ultrasonic imaging applications and underlying technologies, presented by leading practitioners and researchers from many parts of the world.

    Navigating in Patient Space Using Camera Pose Estimation Relative to the External Anatomy

    Ultrasound probe localization is essential for volumetric imaging with a 2D ultrasound probe, for establishing a recorded anatomical context for ultrasound-guided surgery, and for longitudinal studies. The existing techniques for probe localization, however, require external tracking devices, making them inconvenient for clinical use. In addition, the probe pose is typically measured with respect to a fixed coordinate system independent of the patient’s anatomy, making it difficult to correlate ultrasound studies across time. This dissertation concerns the development and evaluation of a novel self-contained ultrasound probe tracking system, which navigates the probe in patient space using camera pose estimation relative to the anatomical context. As the probe moves in patient space, a video camera on the probe is used to automatically identify natural skin features and subdermal cues, and match them with a pre-acquired high-resolution 3D surface map that serves as an atlas of the anatomy. We have addressed the problem of distinguishing rotation from translation by including an inertial navigation system (INS) to accurately measure rotation. Experiments on both a phantom containing an image of human skin (palm) and an actual human upper extremity (fingers, palm, and wrist) validate the effectiveness of our approach. We have also developed a real-time 3D interactive visualization system that superimposes the ultrasound data within the anatomical context of the patient's exterior, permitting accurate anatomic localization of ultrasound data. The combination of the proposed tracking approach and the visualization system may have broad implications for ultrasound imaging, permitting the compilation of volumetric ultrasound data as the 2D probe is moved, as well as comparison of real-time ultrasound scans registered with previous scans from the same anatomical location.
    In a broader sense, tools that self-locate by viewing the patient’s exterior could have a broad beneficial impact on clinical medicine.
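    The feature-based alignment at the heart of such a tracker can be illustrated with a 2D stand-in: a closed-form least-squares rigid fit of observed skin-feature positions to their atlas positions. A real probe-tracking system solves the full 6-DOF problem (with INS-aided rotation, as above); the coordinates below are invented for the sketch.

    ```python
    import math

    def estimate_pose_2d(atlas_pts, observed_pts):
        """Closed-form 2D rigid fit: observed ~= R(theta) @ atlas + t."""
        n = len(atlas_pts)
        ax = sum(p[0] for p in atlas_pts) / n
        ay = sum(p[1] for p in atlas_pts) / n
        bx = sum(p[0] for p in observed_pts) / n
        by = sum(p[1] for p in observed_pts) / n
        s_cross = s_dot = 0.0  # cross/dot sums of centered correspondences
        for (px, py), (qx, qy) in zip(atlas_pts, observed_pts):
            ux, uy = px - ax, py - ay
            vx, vy = qx - bx, qy - by
            s_cross += ux * vy - uy * vx
            s_dot += ux * vx + uy * vy
        theta = math.atan2(s_cross, s_dot)
        c, s = math.cos(theta), math.sin(theta)
        return theta, (bx - (c * ax - s * ay), by - (s * ax + c * ay))

    # Synthetic check: features rotated 30 degrees and shifted by (4, -2)
    atlas = [(0.0, 0.0), (10.0, 0.0), (0.0, 5.0)]
    c, s = math.cos(math.radians(30.0)), math.sin(math.radians(30.0))
    observed = [(c * x - s * y + 4.0, s * x + c * y - 2.0) for x, y in atlas]
    theta, shift = estimate_pose_2d(atlas, observed)
    ```

    With noisy real correspondences the same least-squares structure applies, typically wrapped in an outlier-rejection loop such as RANSAC.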

    Light-driven micro-robotics for contemporary biophotonics.


    Development of augmented reality technologies in medical education with simulators

    In this thesis we present what is, to the best of our knowledge, the first framework for training and assessment of fundamental psychomotor and procedural laparoscopic skills in an interactive Augmented Reality (AR) environment. The proposed system is a fully featured laparoscopic training platform, allowing surgeons to practice by manipulating real instruments while interacting with virtual objects within a real environment. It consists of a standard laparoscopic box-trainer, real instruments, a camera, and a set of sensory devices for real-time tracking of the surgeon's actions. The proposed framework has been used for the implementation of AR-based training scenarios similar to the drills of the FLS® program, focusing on fundamental laparoscopic skills such as depth perception, hand-eye coordination, and bimanual operation. Moreover, this framework allowed the implementation of a proof-of-concept procedural skills training scenario, which involved clipping and cutting of a virtual artery within an AR environment. Comparison studies between experienced and novice surgeons, conducted for the evaluation of the presented framework, indicated high content and face validity. In addition, significant conclusions were drawn regarding the potential of introducing AR into laparoscopic simulation training and assessment. This technology provides an advanced sense of visual realism combined with great flexibility in training-task prototyping, with minimal hardware requirements compared to commercially available platforms. It can therefore be safely stated that AR is a promising technology that can provide a valuable alternative to the training modalities currently used in minimally invasive surgery (MIS).
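    The basic interaction check behind such AR drills, deciding whether the tracked instrument tip touches a virtual object, can be sketched as a point-in-sphere test. The coordinates and the 15 mm target radius here are invented for illustration, not values from the thesis.

    ```python
    import math

    def tip_hits_target(tip_mm, center_mm, radius_mm):
        """True when the tracked instrument tip lies inside the virtual target sphere."""
        return math.dist(tip_mm, center_mm) <= radius_mm

    # tracked tip about 3 mm from a virtual target centered at (100, 50, 30) mm
    hit = tip_hits_target((102.0, 48.0, 31.0), (100.0, 50.0, 30.0), 15.0)   # True
    miss = tip_hits_target((200.0, 50.0, 30.0), (100.0, 50.0, 30.0), 15.0)  # False
    ```

    A complete trainer would run this test per frame against the sensor-tracked tip pose and score timing and accuracy; the fragment only shows the collision predicate.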