
    BTLD+: A BAYESIAN APPROACH TO TRACKING LEARNING DETECTION BY PARTS

    The contribution proposed in this thesis focuses on this particular instance of the visual tracking problem, referred to as Adaptive Appearance Tracking. We proposed different approaches based on the Tracking Learning Detection (TLD) decomposition proposed in [55]. TLD decomposes visual tracking into three components, namely the tracker, the learner and the detector. The tracker and the detector are two competitive processes for target localization based on complementary sources of information. The former searches for local features between consecutive frames in order to localize the target; the latter exploits an on-line appearance model to detect confident hypotheses over the entire image. The learner selects the final solution among the provided hypotheses. It updates the target appearance model, reinitializes the tracker if necessary, and bootstraps the detector's appearance model. In particular, we investigated different approaches to enforce the stability of TLD. First, we replaced the tracker component with a novel one based on MCMC particle filtering; afterwards, we proposed a robust appearance modeling component able to characterize deformable objects in static images; finally, we integrated a component that incorporates local visual feature learning into the whole approach, leading to a coupled, layered representation of the target appearance.
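    To make the TLD decomposition concrete, the following minimal Python sketch shows how a tracker, a detector and a learner could interact in one processing step. It is illustrative only: the component interfaces (track, detect, fuse, update_model, reinit) and the thresholds are hypothetical assumptions, not the thesis implementation or the method of [55].

    def tld_step(frame, prev_frame, prev_box, tracker, detector, learner):
        """One iteration of a TLD-style pipeline (hypothetical sketch)."""
        # 1. Tracker: propagate the previous box using frame-to-frame cues
        #    (e.g. optical flow or a particle filter).
        tracked_box, tracked_conf = tracker.track(prev_frame, frame, prev_box)

        # 2. Detector: scan the whole frame with the on-line appearance model.
        detections = detector.detect(frame)  # list of (box, confidence)

        # 3. Learner: select the final hypothesis among tracker and detector output.
        box, conf = learner.fuse(tracked_box, tracked_conf, detections)

        # 4. Learner: update the appearance model with the accepted sample,
        #    bootstrap the detector, and re-initialise the tracker if it drifted.
        if conf > learner.update_threshold:
            detector.update_model(frame, box)
            if tracked_conf < learner.reinit_threshold:
                tracker.reinit(frame, box)

        return box, conf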

    AUGMENTED REALITY AND INTRAOPERATIVE C-ARM CONE-BEAM COMPUTED TOMOGRAPHY FOR IMAGE-GUIDED ROBOTIC SURGERY

    Minimally invasive robotic-assisted surgery is a rapidly growing alternative to traditional open and laparoscopic procedures; nevertheless, challenges remain. Standard of care derives surgical strategies from preoperative volumetric data (i.e., computed tomography (CT) and magnetic resonance (MR) images) that benefit from the ability of multiple modalities to delineate different anatomical boundaries. However, preoperative images may not reflect a possibly highly deformed perioperative setup or intraoperative deformation. Additionally, in current clinical practice, the correspondence of preoperative plans to the surgical scene is conducted as a mental exercise; thus, the accuracy of this practice is highly dependent on the surgeon's experience and therefore subject to inconsistencies. In order to address these fundamental limitations in minimally invasive robotic surgery, this dissertation combines a high-end robotic C-arm imaging system and a modern robotic surgical platform as an integrated intraoperative image-guided system. We performed deformable registration of preoperative plans to a perioperative cone-beam computed tomography (CBCT), acquired after the patient is positioned for intervention. From the registered surgical plans, we overlaid critical information onto the primary intraoperative visual source, the robotic endoscope, by using augmented reality. Guidance afforded by this system not only uses augmented reality to fuse virtual medical information, but also provides tool localization and other dynamically updated intraoperative information in order to present enhanced depth feedback and information to the surgeon. These techniques in guided robotic surgery required a streamlined approach to creating intuitive and effective human-machine interfaces, especially in visualization. Our software design principles create an inherently information-driven modular architecture incorporating robotics and intraoperative imaging through augmented reality. The system's performance is evaluated using phantoms and preclinical in-vivo experiments for multiple applications, including transoral robotic surgery, robot-assisted thoracic interventions, and cochleostomy for cochlear implantation. The resulting functionality, proposed architecture, and implemented methodologies can be further generalized to other C-arm-based image guidance for additional extensions in robotic surgery.
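    As an illustration of the overlay step described above, the sketch below projects plan geometry that has already been registered into the perioperative CBCT frame into the endoscopic camera view using a standard pinhole model. The names (plan_pts_cbct, T_endo_from_cbct, K) and the assumption of a known rigid CBCT-to-endoscope transform are illustrative assumptions, not details taken from the dissertation.

    import numpy as np

    def overlay_plan_points(plan_pts_cbct, T_endo_from_cbct, K):
        """Project registered plan points into the endoscope image (sketch).

        plan_pts_cbct    : (N, 3) plan points in the perioperative CBCT frame
        T_endo_from_cbct : 4x4 rigid transform from CBCT frame to camera frame
                           (assumed known from calibration/tracking)
        K                : 3x3 endoscope camera intrinsics
        Returns (N, 2) pixel coordinates for rendering the AR overlay.
        """
        n = plan_pts_cbct.shape[0]
        pts_h = np.hstack([plan_pts_cbct, np.ones((n, 1))])   # homogeneous coords
        pts_cam = (T_endo_from_cbct @ pts_h.T)[:3, :]          # into camera frame
        pix = K @ pts_cam                                      # pinhole projection
        return (pix[:2] / pix[2]).T                            # perspective divide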

    Re-localisation of microscopic lesions in their macroscopic context for surgical instrument guidance

    Optical biopsies interrogate microscopic structure in vivo with a 2 mm diameter miniprobe placed in contact with the tissue for detection of lesions and assessment of disease progression. After detection, instruments are guided to the lesion location for a new optical interrogation, for treatment, or for tissue excision during the same or a future examination. As the optical measurement can be considered a point source of information at the surface of the tissue of interest, accurate guidance can be difficult. A method for re-localisation of the sampling point is, therefore, needed. The method presented in this thesis has been developed for biopsy site re-localisation during a surveillance examination of Barrett's Oesophagus. The biopsy site, invisible macroscopically during conventional endoscopy, is re-localised in the target endoscopic image using epipolar lines derived from its locations, given by the tip of the miniprobe visible in a series of reference endoscopic images. A confidence region can be drawn around the re-localised biopsy site from its uncertainty, which is derived analytically. This thesis also presents a method to improve the accuracy of the epipolar lines derived for the biopsy site re-localisation using an electromagnetic tracking system. Simulations and tests on patient data identified the cases in which the analytical uncertainty is a good approximation of the confidence region and showed that biopsy sites can be re-localised with accuracies better than 1 mm. Studies on phantom and on excised porcine tissue demonstrated that an electromagnetic tracking system contributes to more accurate epipolar lines and re-localised biopsy sites for an endoscope displacement greater than 5 mm. The re-localisation method can be applied to images acquired during different endoscopic examinations. It may also be useful for pulmonary applications. Finally, it can be combined with a Magnetic Resonance scanner which can steer cells to the biopsy site for tissue treatment.
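    The epipolar-line construction described above can be sketched as follows: each reference image in which the miniprobe tip is visible induces a line in the target image via a fundamental matrix, and the biopsy site is recovered where those lines approximately intersect. The function names, the least-squares intersection and the variables are illustrative assumptions, not the thesis' actual estimator or its analytical uncertainty derivation.

    import numpy as np

    def epipolar_line(F, probe_tip_ref):
        """Epipolar line in the target image induced by the probe tip seen in
        one reference image (hypothetical sketch).

        F             : 3x3 fundamental matrix mapping reference -> target image
        probe_tip_ref : (x, y) pixel position of the miniprobe tip in the
                        reference endoscopic image
        Returns (a, b, c) with a*x + b*y + c = 0 in the target image.
        """
        x = np.array([probe_tip_ref[0], probe_tip_ref[1], 1.0])
        l = F @ x
        return l / np.linalg.norm(l[:2])   # normalise so (a, b) has unit length

    def relocalise(F_list, probe_tips_ref):
        """Least-squares intersection of the epipolar lines from several
        reference images, giving the re-localised biopsy site."""
        lines = np.array([epipolar_line(F, p)
                          for F, p in zip(F_list, probe_tips_ref)])
        A, c = lines[:, :2], -lines[:, 2]
        site, *_ = np.linalg.lstsq(A, c, rcond=None)
        return site                         # (x, y) estimate in the target image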