
    The Boston University Photonics Center annual report 2016-2017

    This repository item contains an annual report that summarizes activities of the Boston University Photonics Center in the 2016-2017 academic year. The report provides quantitative and descriptive information regarding photonics programs in education, interdisciplinary research, business innovation, and technology development. The Boston University Photonics Center (BUPC) is an interdisciplinary hub for education, research, scholarship, innovation, and technology development associated with practical uses of light.

    This has undoubtedly been the Photonics Center’s best year since I became Director 10 years ago. In the following pages, you will see highlights of the Center’s activities in the past year, including more than 100 notable scholarly publications in the leading journals in our field, and the attraction of more than 22 million dollars in new research grants/contracts. Last year I had the honor to lead an international search for the first recipient of the Moustakas Endowed Professorship in Optics and Photonics, in collaboration with ECE Department Chair Clem Karl. This professorship honors the Center’s most impactful scholar and one of the Center’s founding visionaries, Professor Theodore Moustakas. We are delighted to have awarded this professorship to Professor Ji-Xin Cheng, who joined our faculty this year.

    The past year also marked the launch of Boston University’s Neurophotonics Center, which will be allied closely with the Photonics Center. Leading that Center will be a distinguished new faculty member, Professor David Boas. David and I are together leading a new Neurophotonics NSF Research Traineeship Program that will provide $3M to promote graduate traineeships in this emerging field. We had a busy summer hosting NSF Sites for Research Experiences for Undergraduates, Research Experiences for Teachers, and the BU Student Satellite Program. As a community, we emphasized the theme of “Optics of Cancer Imaging” at our annual symposium, hosted by Darren Roblyer. We entered a five-year second phase of NSF funding in our Industry/University Collaborative Research Center on Biophotonic Sensors and Systems, which has become the centerpiece of our translational biophotonics program. That I/UCRC continues to focus on advancing the health care and medical device industries.

    Optical imaging for breast cancer prescreening

    Breast cancer prescreening is carried out prior to the gold standard screening using X-ray mammography and/or ultrasound. Prescreening is typically carried out using clinical breast examination (CBE) or self-breast examination (SBE). Since CBE and SBE have high false-positive rates, there is a need for a low-cost, noninvasive, radiation-free, and portable imaging modality that can be used as a prescreening tool to complement CBE/SBE. This review focuses on the various hand-held optical imaging devices that have been developed and applied toward early-stage breast cancer detection or as a prescreening tool via phantom, in vivo, and breast cancer imaging studies. Apart from the various optical devices developed by different research groups, a wide-field fiber-free near-infrared optical scanner has been developed for transillumination-based breast imaging in our Optical Imaging Laboratory. Preliminary in vivo studies on normal breast tissues, with absorption-contrasted targets placed in the intramammary fold, detected targets as deep as 8.8 cm. Future work involves in vivo imaging studies on breast cancer subjects and comparison with the gold standard X-ray mammography approach.

    Binocular Goggle Augmented Imaging and Navigation System provides real-time fluorescence image guidance for tumor resection and sentinel lymph node mapping

    The inability to identify microscopic tumors and assess surgical margins in real time during oncologic surgery leads to incomplete tumor removal, increases the chances of tumor recurrence, and necessitates costly repeat surgery. To overcome these challenges, we have developed a wearable goggle augmented imaging and navigation system (GAINS) that can provide accurate intraoperative visualization of tumors and sentinel lymph nodes in real time without disrupting normal surgical workflow. GAINS projects both near-infrared fluorescence from tumors and the natural color images of tissue onto a head-mounted display without latency. Aided by tumor-targeted contrast agents, the system detected tumors in subcutaneous and metastatic mouse models with high accuracy (sensitivity = 100%, specificity = 98% ± 5% standard deviation). Human pilot studies in breast cancer and melanoma patients using a near-infrared dye showed that GAINS detected sentinel lymph nodes with 100% sensitivity. Clinical use of GAINS to guide tumor resection and sentinel lymph node mapping promises to improve surgical outcomes, reduce rates of repeat surgery, and improve the accuracy of cancer staging.
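
    The sensitivity and specificity figures quoted above follow the standard confusion-matrix definitions. The following minimal Python sketch (with hypothetical detection counts, not data from the GAINS study) shows how such values are computed.

        # Minimal sketch of sensitivity/specificity from detection counts.
        # The counts below are hypothetical placeholders, not GAINS data.

        def sensitivity(true_pos, false_neg):
            """Fraction of actual tumors/nodes that were detected."""
            return true_pos / (true_pos + false_neg)

        def specificity(true_neg, false_pos):
            """Fraction of tumor-free sites correctly called negative."""
            return true_neg / (true_neg + false_pos)

        # Hypothetical example: 40 tumors, all detected; 50 healthy sites, 1 false positive.
        print(f"sensitivity = {sensitivity(40, 0):.0%}")   # 100%
        print(f"specificity = {specificity(49, 1):.0%}")   # 98%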

    Bio-Inspired Multi-Spectral Image Sensor and Augmented Reality Display for Near-Infrared Fluorescence Image-Guided Surgery

    Background: Cancer remains a major public health problem worldwide and poses a huge economic burden. Near-infrared (NIR) fluorescence image-guided surgery (IGS) utilizes molecular markers and imaging instruments to identify and locate tumors during surgical resection. Unfortunately, current state-of-the-art NIR fluorescence imaging systems are bulky, costly, and lack both fluorescence sensitivity under surgical illumination and co-registration accuracy between multimodal images. Additionally, the monitor-based display units are disruptive to the surgical workflow and are suboptimal at indicating the 3-dimensional position of labeled tumors. These major obstacles have prevented the wide acceptance of NIR fluorescence imaging as the standard of care for cancer surgery. The goal of this dissertation is to enhance cancer treatment by developing novel image sensors and presenting the information to the physician through a holographic augmented reality (AR) display in intraoperative settings. Method: By mimicking the visual system of the Morpho butterfly, several single-chip, color-NIR fluorescence image sensors and systems were developed with CMOS technologies and pixelated interference filters. Using a holographic AR goggle platform, an NIR fluorescence IGS display system was developed. Optoelectronic evaluation was performed on the prototypes to evaluate the performance of each component, and small and large animal models were used to verify the overall effectiveness of the integrated systems at cancer detection. Result: The single-chip bio-inspired multispectral logarithmic image sensor I developed outperforms state-of-the-art NIR fluorescence imaging instruments on its main performance indicators. The image sensors achieve up to 140 dB dynamic range. The sensitivity under surgical illumination reaches 6108 V/(mW/cm²), up to 25 times higher than existing instruments. The signal-to-noise ratio is up to 56 dB, 11 dB greater than existing instruments. These enable high-sensitivity fluorescence imaging under surgical illumination. The pixelated interference filters enable temperature-independent co-registration accuracy between multimodal images. Preclinical trials with small animal models demonstrate that the sensor can achieve up to 95% sensitivity and 94% specificity with tumor-targeted NIR molecular probes. The holographic AR goggle provides the physician with a non-disruptive 3-dimensional display in the clinical setup. This is the first display system that co-registers a virtual image with human eyes and allows video-rate image transmission. The imaging system is tested in the veterinary science operating room on canine patients with naturally occurring cancers. In addition, a time-domain pulse-width-modulation address-event-representation multispectral image sensor and a handheld multispectral camera prototype are developed. Conclusion: The major problems of current state-of-the-art NIR fluorescence imaging systems are successfully solved. Due to enhanced performance and user experience, the bio-inspired sensors and augmented reality display system will give medical care providers much-needed technology to enable more accurate, value-based healthcare.
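
    For context on the decibel figures above, an image sensor's dynamic range and signal-to-noise ratio are conventionally expressed as 20·log10 of a signal (voltage) ratio. The short Python sketch below is illustrative only (it is not code from the dissertation) and converts between decibels and linear ratios.

        import math

        # Illustrative conversion between linear signal ratios and decibels
        # for image-sensor figures of merit; not code from the dissertation.

        def ratio_to_db(ratio):
            # Voltage/signal quantities use 20*log10(ratio).
            return 20.0 * math.log10(ratio)

        def db_to_ratio(db):
            return 10.0 ** (db / 20.0)

        print(db_to_ratio(140))        # 140 dB dynamic range -> max/min signal ratio of 1e7
        print(round(db_to_ratio(56)))  # 56 dB SNR -> signal roughly 631x the noise floor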

    A Comprehensive Review on Design and Development of Human Breast Phantoms for Ultra-Wide Band Breast Cancer Imaging Systems

    Microwave ultra-wide band (UWB) imaging is a contemporary biomedical imaging technology for early detection of breast cancers. This imaging system requires the development of breast phantoms for experimental data analysis. In order to obtain realistic results, it is very important that these phantoms mimic the characteristics of real biological breast tissue as closely as possible. For this purpose, scientists and engineers make use of the dielectric properties of the human breast. This paper surveys the mathematical formulations used to determine biological dielectric properties and then reviews current breast phantoms being used in UWB imaging systems with reference to those analytical dielectric measurements. At present, breast phantoms are made both manually in the laboratory, utilizing different chemicals, and numerically, using computational electromagnetic algorithms to introduce better heterogeneity; the latter can then easily be tested through computer simulations. This review places particular emphasis on phantoms fabricated in the laboratory for hardware experimentation.
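
    A common choice for the mathematical formulations mentioned above is the single-pole Debye dispersion model for tissue permittivity. The Python sketch below (with placeholder parameter values rather than values taken from the paper) evaluates the complex relative permittivity across the UWB band under that assumption.

        import numpy as np

        # Single-pole Debye model for complex relative permittivity, one common
        # formulation for dispersive breast-tissue/phantom dielectric properties:
        #   eps(omega) = eps_inf + delta_eps/(1 + j*omega*tau) + sigma_s/(j*omega*eps0)
        # Parameter values below are illustrative placeholders, not from the paper.

        EPS0 = 8.854e-12  # vacuum permittivity, F/m

        def debye_permittivity(freq_hz, eps_inf, delta_eps, tau_s, sigma_s):
            omega = 2.0 * np.pi * freq_hz
            return eps_inf + delta_eps / (1.0 + 1j * omega * tau_s) + sigma_s / (1j * omega * EPS0)

        freqs = np.linspace(3e9, 10e9, 8)  # a representative UWB band, 3-10 GHz
        eps = debye_permittivity(freqs, eps_inf=7.0, delta_eps=40.0, tau_s=10e-12, sigma_s=0.7)
        for f, e in zip(freqs, eps):
            print(f"{f/1e9:5.2f} GHz  eps' = {e.real:6.2f}  eps'' = {-e.imag:6.2f}")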

    Goggle Augmented Imaging and Navigation System for Fluorescence-Guided Surgery

    Surgery remains the only curative option for most solid tumors. The standard of care usually involves tumor resection and sentinel lymph node biopsy for cancer staging. Surgeons rely on their vision and touch to distinguish healthy from cancerous tissue during surgery, often leading to incomplete tumor resection that necessitates repeat surgery. Sentinel lymph node biopsy by conventional radioactive tracking exposes patients and caregivers to ionizing radiation, while blue dye tracking stains the tissue, highlighting only superficial lymph nodes. Improper identification of sentinel lymph nodes may lead to misdiagnosis of the stage of the cancer. Therefore, there is a clinical need for accurate intraoperative tumor and sentinel lymph node visualization. Conventional imaging modalities such as X-ray computed tomography, positron emission tomography, magnetic resonance imaging, and ultrasound are excellent for preoperative cancer diagnosis and surgical planning. However, they are not suitable for intraoperative use due to bulky, complicated hardware, high cost, non-real-time imaging, severe restrictions to the surgical workflow, and lack of sufficient resolution for tumor boundary assessment. This has propelled interest in fluorescence-guided surgery, owing to the availability of simple hardware that can achieve real-time, high-resolution, and sensitive imaging. Near-infrared fluorescence imaging is of particular interest due to low background absorbance by photoactive biomolecules, enabling thick-tissue assessment. As a result, several near-infrared fluorescence-guided surgery systems have been developed. However, they are limited by bulky hardware, disruptive information display, and a field of view that is not matched to the user's. To address these limitations we have developed a compact, light-weight, wearable goggle augmented imaging and navigation system (GAINS). It detects the near-infrared fluorescence from a tumor-accumulated contrast agent along with the normal color view, and displays accurately aligned color-fluorescence images in real time via a head-mounted display worn by the surgeon. GAINS is a platform technology capable of very sensitive fluorescence detection. Image display options include both video see-through and optical see-through head-mounted displays, for high-contrast image guidance as well as direct visual access to the surgical bed. Image capture options, from a large field-of-view camera as well as a high-magnification handheld microscope, ensure macroscopic as well as microscopic assessment of the tumor bed. Aided by tumor-targeted near-infrared contrast agents, GAINS guided complete tumor resection in subcutaneous, metastatic, and spontaneous mouse models of cancer with high sensitivity and specificity, in real time. Using a clinically approved near-infrared contrast agent, GAINS provided real-time image guidance for accurate visualization of lymph nodes in a porcine model and sentinel lymph nodes in human breast cancer and melanoma patients with high sensitivity. This work has addressed issues that have limited clinical adoption of fluorescence-guided surgery and paves the way for research into developing this approach toward standard-of-care practice that can potentially improve surgical outcomes in cancer.

    Innovative Device for Indocyanine Green Navigational Surgery

    Dynamic reality has been integrated into developing surgical techniques, with the goals of providing increased intraoperative accuracy, easier detection of critical anatomical landmarks, and better overall results for the patient. Enhancement of reality in surgical theaters using single- or multi-sensorial augmenters (haptic, thermal, and visual) has been reported with varying degrees of success. This paper presents a novel device for navigational surgery and ancillary clinical applications based on the fluorescent properties of Indocyanine Green (ICG), a safe, FDA-approved dye that emits fluorescence at higher wavelengths than endogenous proteins. The latest technological developments and the aforementioned convenient quantum behavior of ICG allow for its effective identification in tissues by means of a complementary metal-oxide-semiconductor (CMOS) infrared camera. Following fundamental research on the fluorophore in different biological suspensions and at various concentrations, our team has built a device that casts a beam of excitation light at 780 nm and collects emission light at 810-830 nm, filtering out ambient light and endogenous autofluorescence. Unlike visible light, this near-infrared fluorescence emission can penetrate tissue up to 1.6 cm in depth, providing, after digitization into conventional images, anatomical and functional data of immense intraoperative value.
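
    The quoted 1.6 cm penetration depth reflects how strongly tissue attenuates near-infrared light. A rough Beer-Lambert-style estimate, sketched below in Python with an assumed effective attenuation coefficient (not a value reported in the paper), illustrates the kind of calculation involved.

        import math

        # Rough Beer-Lambert-style estimate of how much NIR light survives a
        # one-way pass through tissue. The effective attenuation coefficient is
        # an assumed, illustrative value, not a measurement from the paper.

        MU_EFF_PER_CM = 2.0  # assumed effective attenuation coefficient, 1/cm

        def remaining_fraction(depth_cm, mu_eff=MU_EFF_PER_CM):
            """Fraction of light remaining after traversing depth_cm of tissue."""
            return math.exp(-mu_eff * depth_cm)

        for depth in (0.5, 1.0, 1.6):
            print(f"{depth:.1f} cm: {remaining_fraction(depth):.1%} of the light remains")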

    Fluorescence-guided surgical system using holographic display: From phantom studies to canine patients

    SIGNIFICANCE: Holographic display technology is a promising area of research that can lead to significant advancements in cancer surgery. We present the benefits of combining bioinspired multispectral imaging technology with holographic goggles for fluorescence-guided cancer surgery. Through a series of experiments with 3D-printed phantoms, small animal models of cancer, and surgeries on canine patients with head and neck cancer, we showcase the advantages of this holistic approach. AIM: The aim of our study is to demonstrate the feasibility and potential benefits of utilizing a holographic display for fluorescence-guided surgery through a series of experiments involving 3D-printed phantoms and canine patients with head and neck cancer. APPROACH: We explore the integration of a bioinspired camera with a mixed reality headset to project fluorescent images as holograms onto a see-through display, and we demonstrate the potential benefits of this technology through benchtop and animal studies. RESULTS: Our complete imaging and holographic display system showed improved delineation of fluorescent targets in phantoms compared with the 2D monitor display approach and easy integration into the veterinary surgical workflow. CONCLUSIONS: Based on our findings, it is evident that our comprehensive approach, which combines a bioinspired multispectral imaging sensor with holographic goggles, holds promise for enhancing the presentation of fluorescent information to surgeons during intraoperative scenarios while minimizing disruptions.