
    On the use of deep learning for phase recovery

    Phase recovery (PR) refers to calculating the phase of a light field from its intensity measurements. From quantitative phase imaging and coherent diffraction imaging to adaptive optics, PR is essential for reconstructing the refractive index distribution or topography of an object and for correcting the aberrations of an imaging system. In recent years, deep learning (DL), often implemented through deep neural networks, has provided unprecedented support for computational imaging, leading to more efficient solutions for various PR problems. In this review, we first briefly introduce conventional methods for PR. Then, we review how DL supports PR at three stages, namely pre-processing, in-processing, and post-processing. We also review how DL is used in phase image processing. Finally, we summarize the work in DL for PR and offer an outlook on how to better use DL to improve the reliability and efficiency of PR. Furthermore, we present a live-updating resource (https://github.com/kqwang/phase-recovery) for readers to learn more about PR. (82 pages, 32 figures)
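The conventional iterative methods the review alludes to can be illustrated with a minimal Gerchberg-Saxton-style sketch (an illustrative toy, not code from the review): given measured amplitudes in the object plane and the Fourier plane, alternate between the two planes while enforcing each measured magnitude and keeping the current phase estimate.

```python
import numpy as np

def gerchberg_saxton(obj_amp, fourier_amp, n_iter=100):
    """Alternating-projection phase retrieval from two amplitude measurements."""
    field = obj_amp.astype(complex)  # start from a zero-phase guess
    for _ in range(n_iter):
        F = np.fft.fft2(field)
        F = fourier_amp * np.exp(1j * np.angle(F))      # enforce measured |F|
        field = np.fft.ifft2(F)
        field = obj_amp * np.exp(1j * np.angle(field))  # enforce measured |field|
    return field

# Synthetic check: simulate the two intensities of a known phase object.
rng = np.random.default_rng(0)
true_phase = 0.8 * rng.standard_normal((32, 32))
truth = np.exp(1j * true_phase)                 # unit-amplitude object
obj_amp = np.abs(truth)
fourier_amp = np.abs(np.fft.fft2(truth))
recovered = gerchberg_saxton(obj_amp, fourier_amp)
```

A useful property of this scheme is that the Fourier-magnitude error is non-increasing over iterations, which is why it serves as the baseline against which learned PR methods are often compared.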

    Topics in Adaptive Optics

    Advances in adaptive optics technology and applications move forward at a rapid pace. The basic idea of real-time wavefront compensation has been around since the mid-1970s. The first widely used application of adaptive optics was compensating atmospheric turbulence effects in astronomical imaging and laser beam propagation. While some topics have been researched and reported for years, even decades, new applications and advances in the supporting technologies occur almost daily. This book brings together 11 original chapters related to adaptive optics, written by an international group of invited authors. Topics include atmospheric turbulence characterization, astronomy with large telescopes, image post-processing, high power laser distortion compensation, adaptive optics and the human eye, wavefront sensors, and deformable mirrors.

    Adaptive Optics Progress

    For over four decades there has been continuous progress in adaptive optics technology, theory, and systems development. Recently there has also been an explosion of applications of adaptive optics throughout the fields of communications and medicine, in addition to its original uses in astronomy and beam propagation. This volume is a compilation of research and tutorials from a variety of international authors with expertise in theory, engineering, and technology. Eight chapters include discussion of retinal imaging, solar astronomy, wavefront-sensorless adaptive optics systems, liquid crystal wavefront correctors, membrane deformable mirrors, digital adaptive optics, optical vortices, and coupled anisoplanatism.

    Biometric measurements in the crystalline lens: applications in cataract surgery

    In this thesis, we developed a new methodology for measuring the misalignment of the IOL implanted in cataract-surgery patients from en-face OCT images of the anterior segment. We also quantified the 3D shape and topography of the crystalline lens of the eye in vivo, its optical properties, changes in its anterior and posterior surfaces, and structural changes with age. The quantitative OCT methodology and the patient-specific eye models were validated in patients who had undergone cataract surgery, by comparing aberrations simulated and measured in the same patients, and made it possible to understand the relative contribution of the geometric, optical, and surgical factors related to image quality, such as identifying the optimal centration of the IOL. These are key to advanced IOL power calculations and to optimizing the individual selection or design of customized IOLs that can provide an optimal visual solution for the patient. Departamento de Cirugía, Oftalmología, Otorrinolaringología y Fisioterapia. Doctorado en Ciencias de la Visión.

    Computational Imaging for Phase Retrieval and Biomedical Applications

    In conventional imaging, hardware is optimized to enhance image quality directly, and digital signal processing is viewed as supplementary. Computational imaging instead intentionally distorts images through modulation schemes in illumination or sensing; its reconstruction algorithms then extract the desired object information from the raw data. Co-designing hardware and algorithms reduces demands on the hardware while achieving the same or even better image quality. Algorithm design is at the heart of computational imaging, with model-based inverse-problem methods and data-driven deep learning as the two main approaches. This thesis presents research from both perspectives, with a primary focus on the phase retrieval problem in computational microscopy and on deep learning techniques for biomedical imaging challenges. The first half of the thesis begins with Fourier ptychography, which was employed to overcome chromatic aberration in multispectral imaging. We then proposed a novel computational coherent imaging modality based on the Kramers-Kronig relations, aiming to replace Fourier ptychography with a non-iterative method. While this approach showed promise, it lacks certain essential characteristics of the original Fourier ptychography. To address this limitation, we introduced two additional algorithms to form a complete scheme. Through comprehensive evaluation, we demonstrated that the combined scheme outperforms Fourier ptychography in achieving high-resolution, large field-of-view, aberration-free coherent imaging. The second half of the thesis shifts focus to deep-learning-based methods. In one project, we optimized the scanning strategy and image-processing pipeline of an epifluorescence microscope to address focus issues, and leveraged deep-learning-based object detection models to automate cell analysis tasks. In another project, we predicted the polarity status of mouse embryos from bright-field images using adapted deep learning models. These findings highlight the capability of computational imaging to automate labor-intensive processes and even outperform humans in challenging tasks.
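The Kramers-Kronig idea mentioned above can be illustrated with a toy 1-D sketch (hypothetical and simplified, not the thesis's actual algorithm): when a field can be written as the exponential of a signal with a one-sided spectrum, its log-amplitude and phase form a Hilbert-transform pair, so the phase follows non-iteratively from the measured amplitude alone.

```python
import numpy as np

def discrete_hilbert(x):
    """Imaginary part of the discrete analytic signal of a real sequence x."""
    n = x.size
    X = np.fft.fft(x)
    X[1:n // 2] *= 2.0        # double positive frequencies
    X[n // 2 + 1:] = 0.0      # zero negative frequencies
    return np.fft.ifft(X).imag

rng = np.random.default_rng(1)
n = 256
# Build a zero-free field s = exp(g), where g has a strictly one-sided
# spectrum; then log s = g exactly and Im(g) = Hilbert(Re(g)).
G = np.zeros(n, dtype=complex)
G[1:n // 2] = rng.standard_normal(n // 2 - 1) + 1j * rng.standard_normal(n // 2 - 1)
g = np.fft.ifft(G)
s = np.exp(g)

# Phase recovered from the measured amplitude alone, with no iterations.
recovered_phase = discrete_hilbert(np.log(np.abs(s)))
```

The one-sided-spectrum condition plays the role of the "minimum phase" assumption that makes such non-iterative recovery possible; when it is violated, iterative methods such as Fourier ptychography are still needed.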

    Explainable Artificial Intelligence for Image Segmentation and for Estimation of Optical Aberrations

    State-of-the-art machine learning methods such as convolutional neural networks (CNNs) are frequently employed in computer vision. Despite their high performance on unseen data, CNNs are often criticized for lacking transparency, that is, for providing very limited, if any, information about their internal decision-making process. In some applications, especially in healthcare, such transparency of algorithms is crucial for end users, as trust in diagnosis and prognosis is important not only for the satisfaction and potential adherence of patients, but also for their health. Explainable artificial intelligence (XAI) aims to open up this “black box,” often perceived as a cryptic and inconceivable algorithm, to increase understanding of the machines’ reasoning. XAI is an emerging field, and techniques for making machine learning explainable are becoming increasingly available. XAI for computer vision mainly focuses on image classification, whereas interpretability in other tasks remains challenging. Here, I examine explainability in computer vision beyond image classification, namely in semantic segmentation and 3D multitarget image regression. This thesis consists of five chapters. In Chapter 1 (Introduction), the background of artificial intelligence (AI), XAI, computer vision, and optics is presented, and definitions of the terminology for XAI are proposed. Chapter 2 focuses on explaining the predictions of U-Net, a CNN commonly used for semantic image segmentation, and variations of this architecture. To this end, I propose the gradient-weighted class activation mapping for segmentation (Seg-Grad-CAM) method, based on the well-known Grad-CAM method for explainable image classification. In Chapter 3, I present the application of deep learning to the estimation of optical aberrations in microscopy biodata, by identifying the Zernike aberration modes present and their amplitudes.
A CNN-based approach, PhaseNet, can accurately estimate monochromatic aberrations in images of point light sources; I extend this method to objects of complex shapes. In Chapter 4, an approach for explainable 3D multitarget image regression is reported. First, I visualize how the model differentiates the aberration modes using the local interpretable model-agnostic explanations (LIME) method adapted for 3D image classification. Then I “explain,” using LIME modified for multitarget 3D image regression (Image-Reg-LIME), the outputs of the regression model for estimation of the amplitudes. In Chapter 5, the results are discussed in a broader context. The contribution of this thesis is the development of explainability methods for semantic segmentation and 3D multitarget image regression of optical aberrations. The research opens the door for further enhancement of AI’s transparency.
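The Zernike modes and amplitudes estimated above can be made concrete with a small sketch (illustrative only; the Noll normalization and the two chosen modes are assumptions, not taken from the thesis). It evaluates two low-order modes on the unit disk, builds a synthetic aberrated wavefront, and recovers the modal amplitudes by least squares, which are the quantities a PhaseNet-style regressor learns to predict directly from images.

```python
import numpy as np

def defocus(rho, theta):
    """Zernike Z(2,0), Noll-normalized defocus."""
    return np.sqrt(3.0) * (2.0 * rho**2 - 1.0)

def astigmatism(rho, theta):
    """Zernike Z(2,2), Noll-normalized vertical astigmatism."""
    return np.sqrt(6.0) * rho**2 * np.cos(2.0 * theta)

# Sample the unit pupil on a pixel grid.
n = 512
y, x = np.mgrid[-1:1:1j * n, -1:1:1j * n]
rho = np.hypot(x, y)
theta = np.arctan2(y, x)
mask = rho <= 1.0

# A synthetic aberrated wavefront: 0.3 units defocus + 0.1 units astigmatism.
wavefront = 0.3 * defocus(rho, theta) + 0.1 * astigmatism(rho, theta)

# Least-squares modal fit over the pupil recovers the amplitudes.
A = np.stack([defocus(rho, theta)[mask], astigmatism(rho, theta)[mask]], axis=1)
coeffs, *_ = np.linalg.lstsq(A, wavefront[mask], rcond=None)
```

Because the Zernike modes are orthogonal over the unit disk, the fitted coefficients are well conditioned and directly interpretable as per-mode aberration strengths.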

    Optimising the NAOMI adaptive optics real-time control system

    This thesis describes the author's research in the field of Real-Time Control (RTC) for Adaptive Optics (AO) instrumentation. The research encompasses experience and knowledge gained working in the area of RTC on astronomical instrumentation projects at the Optical Science Laboratories (OSL), University College London (UCL), the Isaac Newton Group of Telescopes (ING), and the Centre for Advanced Instrumentation (CfAI), Durham University. It begins by providing an extensive introduction to the field of astronomical adaptive optics, covering image correction theory, atmospheric theory, control theory, and adaptive optics component theory. The following chapter contains a review of the current state of AO instruments and facilities worldwide. The Nasmyth Adaptive Optics Multi-purpose Instrument (NAOMI), the common-user AO facility at the 4.2 m William Herschel Telescope (WHT), is subsequently described. Results of NAOMI component characterisation experiments are detailed to provide a system-level understanding of the improvement that optimisation could offer. The final chapter investigates how upgrading the RTCS could increase NAOMI's spatial and temporal performance and examines the RTCS in the context of Extremely Large Telescope (ELT) class telescopes.
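The control loop such an RTCS optimises rests on a simple idea: reconstruct wavefront modes from wavefront-sensor slopes and drive the corrector with an integrator. A toy closed-loop sketch (illustrative only; the interaction matrix, gain, and dimensions are invented and bear no relation to NAOMI's actual system):

```python
import numpy as np

rng = np.random.default_rng(2)
n_modes, n_slopes = 5, 12
M = rng.standard_normal((n_slopes, n_modes))  # mode-to-slope interaction matrix
R = np.linalg.pinv(M)                         # least-squares reconstructor
gain = 0.5                                    # integrator loop gain

disturbance = rng.standard_normal(n_modes)    # static aberration, modal space
command = np.zeros(n_modes)                   # corrector (DM) command
residuals = []
for _ in range(30):
    residual = disturbance - command          # wavefront left after correction
    slopes = M @ residual                     # what the wavefront sensor measures
    command = command + gain * (R @ slopes)   # integrator control law
    residuals.append(np.linalg.norm(residual))
```

In a real RTCS, the interaction matrix is calibrated on sky or on an internal source, and the gain trades off bandwidth against noise propagation; here the residual simply shrinks geometrically with the loop gain.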

    Numerical aberrations compensation and polarization imaging in digital holographic microscopy

    In this thesis, we describe a method for the numerical reconstruction of the complete wavefront properties from a single digital hologram: the amplitude, the phase, and the polarization state. For this purpose, we present the principle of digital holographic microscopy (DHM) and the numerical reconstruction process, which consists of numerically propagating a wavefront from the hologram plane to the reconstruction plane. We then define the different parameters of a Numerical Parametric Lens (NPL) introduced in the reconstruction plane, which should be precisely adjusted to achieve a correct reconstruction. We demonstrate that automatic procedures not only adjust these parameters but also completely compensate for the phase aberrations. The method consists of computing, directly from the hologram, an NPL defined by standard or Zernike polynomials without prior knowledge of physical setup values (microscope objective focal length, distance between the object and the objective...). This method enables the reconstruction of correct and accurate phase distributions, even in the presence of strong and high-order aberrations. Furthermore, we show that this method can compensate for the curvature of the specimen. The NPL parameters obtained by a Zernike polynomial fit give quantitative measurements of micro-optics aberrations, and the reconstructed images reveal their surface defects and roughness. Examples with micro-lenses and a metallic sphere are presented. This NPL is then introduced in the hologram plane and, like a system of optical lenses, allows numerical magnification, complete aberration compensation in DHM (correction of image distortions and phase aberrations), and shifting. The NPL can be computed automatically by polynomial fit, but it can also be defined by a calibration method called the Reference Conjugated Hologram (RCH).
We demonstrate the power of the method by reconstructing non-aberrated wavefronts from holograms recorded with high-order aberrations deliberately introduced by a tilted thick plate, a cylindrical lens, or a ball lens used in place of the microscope objective. Finally, we present a modified digital holographic microscope permitting the reconstruction of the polarization state of a wavefront. The principle consists of using two orthogonally polarized reference waves that interfere with an object wave. The two wavefronts are then reconstructed separately from the same hologram and processed to image the polarization state in terms of Jones vector components. Simulated and experimental data are compared to a theoretical model in order to evaluate the precision limit of the method for different polarization states of the object wave. We apply this technique to image the birefringence and dichroism induced in a stressed polymethylmethacrylate (PMMA) sample, in a bent optical fiber, and in a thin concrete specimen. To evaluate the precision of the phase-difference measurement in the DHM design, the birefringence induced by internal stress in an optical fiber is measured and compared to the birefringence profile captured by a standard method developed to obtain high-resolution birefringence profiles of optical fibers. A phase-difference resolution of 6 degrees is obtained, comparable with a standard imaging polariscope, but with the advantage of a single acquisition allowing real-time reconstruction.
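The numerical propagation from the hologram plane to the reconstruction plane can be sketched with the angular spectrum method (a minimal free-space illustration under assumed sampling parameters, not the authors' actual reconstruction code):

```python
import numpy as np

def angular_spectrum(field, wavelength, dx, z):
    """Propagate a sampled complex field a distance z in free space."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)                 # spatial frequencies
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 - (wavelength * FX)**2 - (wavelength * FY)**2
    kz = 2.0 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z) * (arg > 0)          # drop evanescent components
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Hypothetical sampling: 128 x 128 grid, 1 um pixels, 633 nm illumination.
n, dx, wl = 128, 1e-6, 633e-9
x = (np.arange(n) - n // 2) * dx
X, Y = np.meshgrid(x, x)
hologram_plane = np.exp(-(X**2 + Y**2) / (20e-6)**2).astype(complex)

# "Refocus" the field 100 um away from the hologram plane.
reconstructed = angular_spectrum(hologram_plane, wl, dx, 100e-6)
```

Because the transfer function has unit modulus for propagating components, propagation is energy-preserving and exactly invertible by propagating back with -z, which is the basis for numerically refocusing a hologram to any plane.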