    EVD Surgical Guidance with Retro-Reflective Tool Tracking and Spatial Reconstruction using Head-Mounted Augmented Reality Device

    Augmented Reality (AR) has been used to provide surgical guidance during External Ventricular Drain (EVD) placement, reducing the risk of misplacement inherent in freehand operation. The pivotal challenge in this procedure is accurately estimating the spatial relationship between pre-operative images and the actual patient anatomy in the AR environment. In this research, we propose a novel framework that uses the Time-of-Flight (ToF) depth sensors integrated in commercially available AR head-mounted displays (HMDs) for precise EVD surgical guidance. Since previous studies have documented depth errors in ToF sensors, we first conducted a comprehensive assessment of the properties of this error on AR HMDs. We then introduce a depth error model, together with a patient-specific parameter identification method, to obtain accurate surface information. Next, a tracking procedure combining retro-reflective markers and point clouds is proposed for accurate head tracking, in which the head surface is reconstructed from corrected ToF data for spatial registration, avoiding the need to fix tracking targets rigidly to the patient's cranium. A ToF depth error of 7.580 ± 1.488 mm was measured on human skin, underscoring the need for depth correction. The proposed correction method reduced the ToF depth error by over 85% on head phantoms of different materials, and the head surface reconstructed from corrected depth data achieved sub-millimeter accuracy; an experiment on a sheep head showed a 0.79 mm reconstruction error. Finally, a user study evaluated the framework in simulated EVD surgery, in which 5 surgeons performed 9 k-wire insertions on a head phantom under virtual guidance, yielding 2.09 ± 0.16 mm translational accuracy and 2.97 ± 0.91° orientational accuracy.
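
    The abstract does not give the form of the depth error model or the registration algorithm, so the sketch below is only illustrative: it assumes a simple linear error model fitted by least squares and a plain SVD (Kabsch) rigid alignment between corresponding points. All function names are hypothetical; the paper's patient-specific identification and marker-plus-point-cloud tracking are more involved.

```python
# Illustrative sketch of the two numerical steps the abstract describes:
# (1) correcting systematic ToF depth error, (2) rigidly registering the
# corrected head surface to pre-operative data. The linear model below is
# an assumption; the paper's actual model is not stated in the abstract.
import numpy as np

def fit_linear_depth_model(d_measured, d_reference):
    """Least-squares fit of d_reference ~ a * d_measured + b."""
    A = np.column_stack([d_measured, np.ones_like(d_measured)])
    (a, b), *_ = np.linalg.lstsq(A, d_reference, rcond=None)
    return a, b

def correct_depth(d_measured, a, b):
    """Apply the fitted correction to raw ToF depth values."""
    return a * d_measured + b

def rigid_registration(source, target):
    """SVD (Kabsch) alignment of corresponding Nx3 point sets.
    Returns R, t such that target ~ R @ source_point + t."""
    src_c, tgt_c = source.mean(axis=0), target.mean(axis=0)
    H = (source - src_c).T @ (target - tgt_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:   # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = tgt_c - R @ src_c
    return R, t
```

    In the paper's setting, the reference depths would presumably come from a calibration object, and the alignment would run inside an iterative closest point loop rather than on known correspondences as assumed here.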

    Spatial Augmented Reality Using Structured Light Illumination

    Spatial augmented reality is a kind of augmented reality that uses a projector to blend real objects with virtual content. Coincidentally, structured light illumination, a means of 3D shape measurement, also uses a projector as part of its system: the projector generates the cues needed to establish the correspondence between the 2D image coordinate system and the 3D world coordinate system. It is therefore appealing to build a system that can carry out the functionality of both spatial augmented reality and structured light illumination. In this dissertation, we present the hardware platforms we developed and their applications in spatial augmented reality and structured light illumination. The first is a dual-projector structured light 3D scanning system in which two synchronized projectors operate simultaneously; it outperforms the traditional single-projector structured light 3D scanning system in the quality of its 3D reconstructions. The second is a modified dual-projector structured light 3D scanning system aimed at detecting and resolving multi-path interference. The third is an augmented reality face-paint system that detects a human face in a scene and paints the face in arbitrary colors by projection; the system also incorporates a second camera to track 3D position using the principle of structured light illumination. Finally, a structured light 3D scanning system with its own built-in machine vision camera is presented as future work. So far, the standalone camera has been built from a bare CMOS sensor; with this customized camera we can achieve high-dynamic-range imaging and better synchronization between the camera and projector. The full system, which includes an HDMI transmitter, a structured light pattern generator, and synchronization logic, has yet to be completed pending a well-designed high-speed PCB.
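
    To make the projector-generated "cues" concrete: a standard way structured light illumination establishes the 2D-to-3D correspondence (which the dissertation may or may not use) is to project Gray-code stripe patterns so that each projector column is uniquely encoded; decoding the captured image stack then gives the illuminating projector column for every camera pixel. A minimal sketch under that assumption, with illustrative names and a simple fixed threshold:

```python
# Gray-code structured light: encode each projector column in the stripe
# patterns, then recover the column index per camera pixel by decoding.
import numpy as np

def gray_code_patterns(width, n_bits):
    """One vertical stripe pattern per bit: bit k of each column's Gray
    code. Shape (n_bits, width), values 0/1."""
    cols = np.arange(width)
    gray = cols ^ (cols >> 1)            # binary column index -> Gray code
    return ((gray[None, :] >> np.arange(n_bits)[:, None]) & 1).astype(np.uint8)

def decode_projector_columns(captured, threshold):
    """captured: (n_bits, H, W) camera images of the patterns.
    Returns an (H, W) map of the projector column seen by each pixel."""
    bits = (captured > threshold).astype(np.uint32)
    n_bits = bits.shape[0]
    gray = np.zeros(bits.shape[1:], dtype=np.uint32)
    for k in range(n_bits):
        gray |= bits[k] << k             # reassemble the Gray code per pixel
    binary = gray.copy()
    shift = 1
    while shift < n_bits:                # Gray -> binary by XOR suffix scan
        binary ^= binary >> shift
        shift *= 2
    return binary
```

    With the column map in hand, each camera pixel defines a ray and each projector column a plane, and their intersection yields the 3D surface point.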

    Bio-Inspired Multi-Spectral Image Sensor and Augmented Reality Display for Near-Infrared Fluorescence Image-Guided Surgery

    Background: Cancer remains a major public health problem worldwide and poses a huge economic burden. Near-infrared (NIR) fluorescence image-guided surgery (IGS) uses molecular markers and imaging instruments to identify and locate tumors during surgical resection. Unfortunately, current state-of-the-art NIR fluorescence imaging systems are bulky and costly, and they lack both fluorescence sensitivity under surgical illumination and co-registration accuracy between multimodal images. Additionally, monitor-based display units disrupt the surgical workflow and are suboptimal at indicating the 3-dimensional position of labeled tumors. These obstacles have prevented the wide acceptance of NIR fluorescence imaging as the standard of care for cancer surgery. The goal of this dissertation is to enhance cancer treatment by developing novel image sensors and presenting their output to the physician through a holographic augmented reality (AR) display in intraoperative settings.
    Method: By mimicking the visual system of the Morpho butterfly, several single-chip color-NIR fluorescence image sensors and systems were developed with CMOS technologies and pixelated interference filters. An NIR fluorescence IGS display system was developed on a holographic AR goggle platform. Optoelectronic evaluation was performed on the prototypes to assess each component, and small and large animal models were used to verify the overall effectiveness of the integrated systems at cancer detection.
    Results: The single-chip bio-inspired multispectral logarithmic image sensor I developed outperforms state-of-the-art NIR fluorescence imaging instruments on the main performance indicators. The image sensors achieve up to 140 dB dynamic range. The sensitivity under surgical illumination reaches 6,108 V/(mW/cm²), up to 25 times higher than existing instruments, and the signal-to-noise ratio is up to 56 dB, 11 dB greater. Together these enable high-sensitivity fluorescence imaging under surgical illumination, while the pixelated interference filters provide temperature-independent co-registration accuracy between multimodal images. Pre-clinical trials with small animal models demonstrate that the sensor can achieve up to 95% sensitivity and 94% specificity with tumor-targeted NIR molecular probes. The holographic AR goggle provides the physician with a non-disruptive 3-dimensional display in the clinical setting; it is the first display system that co-registers a virtual image with the human eye and supports video-rate image transmission. The imaging system was tested in a veterinary operating room on canine patients with naturally occurring cancers. In addition, a time-domain pulse-width-modulation address-event-representation multispectral image sensor and a handheld multispectral camera prototype were developed.
    Conclusion: The major problems of current state-of-the-art NIR fluorescence imaging systems are successfully addressed. With the enhanced performance and user experience, the bio-inspired sensors and augmented reality display system will give medical care providers much-needed technology for more accurate, value-based healthcare.
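
    As a sanity check on the decibel figures quoted above (the conversions are the standard 20·log10 convention for signal ratios; the printed numbers are interpretations, not new measurements from the dissertation):

```python
# Interpreting the abstract's dB figures as signal ratios.
import math

def to_db(ratio: float) -> float:
    """Amplitude ratio -> decibels."""
    return 20.0 * math.log10(ratio)

def from_db(decibels: float) -> float:
    """Decibels -> amplitude ratio."""
    return 10.0 ** (decibels / 20.0)

print(from_db(140.0))  # 1e7: 140 dB dynamic range spans a 10^7:1 signal ratio
print(from_db(56.0))   # ~631: 56 dB SNR means signal ~631x the noise level
print(to_db(from_db(56.0) / from_db(45.0)))  # 11.0: "11 dB greater" ~ 3.5x ratio
```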