423 research outputs found

    Enhanced hyperspectral tomography for bioimaging by spatiospectral reconstruction.

    From Europe PMC via Jisc Publications Router. History: ppub 2021-10-01, epub 2021-10-21. Publication status: Published.

    Here we apply hyperspectral bright field imaging to collect computed tomographic images with excellent energy resolution (~1 keV), applying it for the first time to map the distribution of stain in a fixed biological sample through its characteristic K-edge. Conventionally, because the photons detected at each pixel are distributed across as many as 200 energy channels, energy-selective images are characterised by low count rates and poor signal-to-noise ratio. This means high X-ray exposures, long scan times and high doses are required to image unique spectral markers. Here, we achieve high-quality energy-dispersive tomograms from low-dose, noisy datasets using a dedicated iterative reconstruction algorithm. This exploits the spatial smoothness and inter-channel structural correlation in the spectral domain using two carefully chosen regularisation terms. For a multi-phase phantom, a 36-fold reduction in scan time is demonstrated. Spectral analysis methods including K-edge subtraction and absorption step-size fitting are evaluated for an ex vivo, single (iodine)-stained biological sample, where low chemical concentration and inhomogeneous distribution can affect soft tissue segmentation and visualisation. The reconstruction algorithms are available through the open-source Core Imaging Library. Taken together, these tools offer new capabilities for visualisation and elemental mapping, with promising applications for multiply-stained biological specimens.
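The K-edge subtraction evaluated in the paper can be sketched in a few lines: energy-selective images just below and above the element's absorption edge are averaged and differenced, so only material whose attenuation jumps at that energy survives. The array shapes, edge energy and window width below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def k_edge_subtraction(channels, energies, edge_kev, window=2.0):
    """Isolate an element's distribution by differencing energy-selective
    images just above and below its K-edge (illustrative sketch).

    channels : (n_energy, H, W) stack of energy-selective images
    energies : (n_energy,) channel energies in keV
    edge_kev : K-edge energy of the stain (~33.2 keV for iodine)
    """
    energies = np.asarray(energies)
    below = channels[(energies >= edge_kev - window) & (energies < edge_kev)]
    above = channels[(energies > edge_kev) & (energies <= edge_kev + window)]
    # Averaging several channels on each side of the edge mitigates the
    # low count rate of any single energy bin.
    return above.mean(axis=0) - below.mean(axis=0)

# Toy example: attenuation steps up above the edge only in "stained" pixels.
energies = np.array([31.0, 32.0, 34.0, 35.0])
stack = np.zeros((4, 4, 4))
stack[2:, 1:3, 1:3] = 1.0  # channels above the edge, stained region only
edge_map = k_edge_subtraction(stack, energies, edge_kev=33.2)
```

The resulting `edge_map` is non-zero only where the attenuation steps across the edge, i.e. where the stain is.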

    3D exemplar-based image inpainting in electron microscopy

    In electron microscopy (EM) a common problem is the non-availability of data, which causes artefacts in reconstructions. In this thesis the goal is to generate artificial data where it is missing in EM by using exemplar-based inpainting (EBI). We implement an accelerated 3D version tailored to applications in EM, which reduces reconstruction times from days to minutes. We develop intelligent sampling strategies to find optimal data as input for reconstruction methods. Further, we investigate approaches to reduce electron dose and acquisition time. Sparse sampling followed by inpainting is the most promising approach. As common evaluation measures may lead to misinterpretation of results in EM and falsify a subsequent analysis, we propose to use application-driven metrics and demonstrate this in a segmentation task. A further application of our technique is the artificial generation of projections in tilt-based EM. EBI is used to generate missing projections, such that the full angular range is covered. Subsequent reconstructions are significantly enhanced in terms of resolution, which facilitates further analysis of samples. In conclusion, EBI proves promising when used as an additional data generation step to tackle the non-availability of data in EM, which is evaluated in selected applications. Enhancing adaptive sampling methods and refining EBI, especially considering their mutual influence, promotes higher throughput in EM using less electron dose while not lessening quality.
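Exemplar-based inpainting fills missing data by copying from the best-matching known patch. A minimal 2D sketch follows; the thesis implements an accelerated 3D version, so the brute-force search, onion-peel fill order and patch size here are simplified assumptions for illustration only.

```python
import numpy as np

def exemplar_inpaint(image, mask, patch=3):
    """Greedy exemplar-based inpainting sketch (2D).

    image : 2D float array; mask : True where data is missing.
    Each missing pixel is filled from the centre of the known patch whose
    surroundings best match the pixel's known neighbourhood (SSD).
    """
    img = image.copy()
    known = ~mask
    r = patch // 2
    H, W = img.shape
    # Candidate exemplars: fully known interior windows.
    sources = [(y, x) for y in range(r, H - r) for x in range(r, W - r)
               if known[y - r:y + r + 1, x - r:x + r + 1].all()]
    while not known.all():
        ys, xs = np.where(~known)
        # Fill the missing pixel with the most known neighbours first.
        def support(p):
            y, x = p
            return known[max(0, y - r):y + r + 1,
                         max(0, x - r):x + r + 1].sum()
        y, x = max(zip(ys, xs), key=support)
        best, best_cost = None, np.inf
        for sy, sx in sources:
            cost = 0.0
            for dy in range(-r, r + 1):
                for dx in range(-r, r + 1):
                    ty, tx = y + dy, x + dx
                    if 0 <= ty < H and 0 <= tx < W and known[ty, tx]:
                        cost += (img[ty, tx] - img[sy + dy, sx + dx]) ** 2
            if cost < best_cost:
                best, best_cost = (sy, sx), cost
        img[y, x] = img[best]
        known[y, x] = True
    return img
```

On a regular texture (e.g. stripes), the search finds an exemplar whose neighbourhood matches exactly, so the missing pixel is restored with the correct value.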

    The Second Hungarian Workshop on Image Analysis : Budapest, June 7-9, 1988.


    Holistic improvement of image acquisition and reconstruction in fluorescence microscopy

    Recent developments in microscopic imaging have led to a better understanding of intra- and intercellular metabolic processes and have made it possible, for example, to visualize structural properties of viral pathogens. In this thesis, the imaging process of widefield and confocal scanning microscopy techniques is treated holistically to highlight general strategies and maximise their information content. Poisson or shot noise is assumed to be the fundamental noise process for the given measurements. A stable focus position is a basic requirement, e.g. for long-term measurements, in order to provide reliable information about potential changes inside the field of view. While newer microscopy systems can be equipped with hardware autofocus, this is not yet the widespread standard. For image-based focus analysis, different metrics are presented for ideal, noisy and aberrated measurements (in the cases of spherical aberration and astigmatism). A stable focus position is also relevant for 2-photon confocal imaging, where the situation is aggravated in the example considered here: imaging the retina of a living mouse. In addition to the natural drift of the focal position, which can be evaluated with the previously introduced metrics, there are the rhythmic heartbeat, respiration, and arrhythmic muscle twitching and movement of the mouse kept in artificial sleep. A dejittering algorithm is presented for the measurement data obtained under these circumstances. Using the additional information about the sample distribution in ISM, a method for reconstructing 3D from 2D image data is presented in the form of thick slice unmixing. This method can further be used for suppression of light generated outside the focal layer of 3D data stacks and is compared to selective layer multi-view deconvolution. To reduce phototoxicity and save valuable measurement time for a 3D stack, the method of zLEAP is presented, by which omitted Z-planes are subsequently calculated and inserted.
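Image-based focus analysis of the kind described above scores each frame with a sharpness metric and tracks that score over time. Two standard, generic measures are sketched below; they illustrate the idea only and are not necessarily the metrics derived in the thesis.

```python
import numpy as np

def brenner_gradient(img):
    """Brenner focus measure: sum of squared differences between pixels
    two columns apart. Sharper images yield larger values."""
    img = np.asarray(img, dtype=float)
    d = img[:, 2:] - img[:, :-2]
    return float((d ** 2).sum())

def normalized_variance(img):
    """Intensity variance normalised by the mean, which partly compensates
    for the intensity-dependent level of Poisson (shot) noise."""
    img = np.asarray(img, dtype=float)
    m = img.mean()
    return float(((img - m) ** 2).mean() / m)
```

Scanning such a metric over a Z-stack and taking the maximum gives an image-based estimate of the in-focus plane; the same score tracked over time reveals focus drift.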

    Astronomical use of television-type image sensors

    Conference on using TV-type image sensors in astronomical photometry.

    Fractal Analysis of Microstructural and Fractographic Images for Evaluation of Materials

    Materials have hierarchically organized complex structures at different length scales. Quantitative description of material behaviour depends on four fundamental length scales [1], which are of concern to materials scientists. These are (1) the nano scale, 1–10³ nm; (2) the micro scale, 1–10³ μm; (3) the macro scale, 1–10³ mm; and (4) the global size scale, 1–10⁶ m. While the nano scale often corresponds to highly ordered atomic structures, the global size scale relates to geophysical phenomena and large man-made engineering structures. The micro and macro scales correspond to the sizes of material samples used in laboratories, for design and for the fabrication of miniature to small machinery.
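Fractal analysis of microstructural or fractographic images is commonly done with the box-counting dimension: count the boxes N(s) of size s that contain structure, then fit log N(s) = −D·log s. A minimal sketch (box sizes and fitting scheme are illustrative, not taken from the paper):

```python
import numpy as np

def box_counting_dimension(binary, sizes=(1, 2, 4, 8)):
    """Estimate the box-counting (fractal) dimension of a 2D binary image.

    For each box size s, the image is tiled into s-by-s boxes and the
    occupied boxes are counted; the dimension is the negated slope of
    log N(s) versus log s, fitted by least squares."""
    binary = np.asarray(binary, dtype=bool)
    counts = []
    for s in sizes:
        h = binary.shape[0] // s * s
        w = binary.shape[1] // s * s
        boxes = binary[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(boxes.any(axis=(1, 3)).sum())
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope
```

As a sanity check, a filled region gives a dimension near 2 and a straight line near 1; irregular fracture surfaces fall in between.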

    Mapping and Deep Analysis of Image Dehazing: Coherent Taxonomy, Datasets, Open Challenges, Motivations, and Recommendations

    Our study aims to review and analyze the most relevant studies in the image dehazing field. Many aspects have been deemed necessary to provide a broad understanding of the various studies examined in the existing literature. These aspects are as follows: datasets that have been used in the literature, challenges that other researchers have faced, motivations, and recommendations for diminishing the obstacles in the reported literature. A systematic protocol is employed to search all relevant articles on image dehazing, with variations in keywords, in addition to searching for evaluation and benchmark studies. The search process covers three online databases, namely, IEEE Xplore, Web of Science (WOS), and ScienceDirect (SD), from 2008 to 2021. These indices are selected because they are sufficient in terms of coverage. After applying the inclusion and exclusion criteria, we include 152 articles in the final set. A total of 55 out of 152 articles focused on various studies that conducted image dehazing, and 13 out of 152 studies covered most of the review papers based on scenarios and general overviews. Finally, most of the included articles centered on the development of image dehazing algorithms based on real-time scenarios (84/152 articles). Image dehazing removes unwanted visual effects and is often considered an image enhancement technique, which requires a fully automated algorithm that works in real-time outdoor applications, a reliable evaluation method, and datasets based on different weather conditions. Many relevant studies have been conducted to meet these critical requirements. We conducted an experimental comparison of various image dehazing algorithms using objective image quality assessment.
    In conclusion, unlike other review papers, our study distinctly reflects different observations on image dehazing areas. We believe that the results of this study can serve as a useful guideline for practitioners who are looking for a comprehensive view of image dehazing.
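The objective image quality assessment mentioned above typically relies on full-reference metrics that compare a dehazed result against a haze-free reference. A minimal PSNR sketch follows; the metric choice is a generic assumption, not taken from the survey.

```python
import numpy as np

def psnr(reference, test, max_val=255.0):
    """Peak signal-to-noise ratio in dB between a haze-free reference and
    a dehazed result. Higher values mean the result is closer to the
    reference; identical images give infinity."""
    reference = np.asarray(reference, dtype=float)
    test = np.asarray(test, dtype=float)
    mse = np.mean((reference - test) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(max_val ** 2 / mse)
```

Ranking several dehazing algorithms by PSNR (often alongside structural metrics such as SSIM) is the usual form such an experimental comparison takes.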