
    Spatial resolution improved fluorescence lifetime imaging via deep learning

    We present a deep learning approach to obtain high-resolution (HR) fluorescence lifetime images from low-resolution (LR) images acquired by fluorescence lifetime imaging (FLIM) systems. We first proposed a theoretical method to generate massive semi-synthetic FLIM data with various cellular morphologies, a sizeable dynamic lifetime range, and complex decay components for training neural networks. We then developed a degradation model to obtain LR-HR pairs and created a hybrid neural network, the Spatial Resolution Improved FLIM net (SRI-FLIMnet), to simultaneously estimate fluorescence lifetimes and realize the nonlinear transformation from LR to HR images. The evaluation results demonstrate SRI-FLIMnet's superior performance in reconstructing spatial information from limited pixel resolution. We also verified SRI-FLIMnet using experimental images of bacteria-infected mouse RAW macrophage cells. The results show that the proposed data generation method and SRI-FLIMnet efficiently achieve superior spatial resolution for FLIM applications. Our study provides a solution for rapidly obtaining HR FLIM images.
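    The data-generation step can be pictured with a minimal sketch: each pixel's decay is modelled as a bi-exponential convolved with an instrument response and corrupted by Poisson (shot) noise. The function below is illustrative only; the parameter values, IRF model, and function name are assumptions, not the authors' generation pipeline.

```python
import numpy as np

def synth_flim_decay(tau1, tau2, frac1, n_photons, n_bins=256,
                     bin_width_ns=0.05, irf_fwhm_ns=0.2, rng=None):
    """Simulate one bi-exponential FLIM decay histogram with Poisson noise."""
    rng = np.random.default_rng() if rng is None else rng
    t = np.arange(n_bins) * bin_width_ns
    # Ideal amplitude-weighted bi-exponential decay
    decay = frac1 * np.exp(-t / tau1) + (1.0 - frac1) * np.exp(-t / tau2)
    # Gaussian instrument response function (IRF) centred early in the window
    sigma = irf_fwhm_ns / 2.355
    irf = np.exp(-0.5 * ((t - 5.0 * sigma) / sigma) ** 2)
    # Convolve with the IRF and scale to the target photon count
    signal = np.convolve(decay, irf)[:n_bins]
    signal = signal / signal.sum() * n_photons
    # Shot noise: each time bin is Poisson-distributed
    return rng.poisson(signal)

# Example: a 1 ns / 3 ns mixture recorded with 500 photons
hist = synth_flim_decay(tau1=1.0, tau2=3.0, frac1=0.6, n_photons=500)
```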

    Compact and robust deep learning architecture for fluorescence lifetime imaging and FPGA implementation

    This paper reports a bespoke adder-based deep learning network for time-domain fluorescence lifetime imaging (FLIM). By leveraging the l1-norm extraction method, we propose a 1-D Fluorescence Lifetime AdderNet (FLAN) without multiplication-based convolutions to reduce the computational complexity. Further, we compressed fluorescence decays in the temporal dimension using a log-scale merging technique to discard redundant temporal information, yielding the log-scaling FLAN (FLAN+LS). FLAN+LS achieves compression ratios of 0.11 and 0.23 compared with FLAN and a conventional 1-D convolutional neural network (1-D CNN), respectively, while maintaining high accuracy in retrieving lifetimes. We extensively evaluated FLAN and FLAN+LS using synthetic and real data. For synthetic data, a traditional fitting method and other high-accuracy non-fitting algorithms were compared with our networks; our networks attained minor reconstruction errors in different photon-count scenarios. For real data, we used fluorescent beads' data acquired by a confocal microscope to validate the effectiveness on real fluorophores, and our networks can differentiate beads with different lifetimes. Additionally, we implemented the network architecture on a field-programmable gate array (FPGA) with a post-quantization technique to shorten the bit-width, thereby improving computing efficiency. FLAN+LS on hardware achieves the highest computing efficiency compared with 1-D CNN and FLAN. We also discuss the applicability of our network and hardware architecture to other time-resolved biomedical applications using photon-efficient, time-resolved sensors.
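    To make the two core ideas concrete, the sketch below shows an AdderNet-style 1-D layer that replaces the convolution's multiply-accumulate with a sum of absolute differences (an l1 distance), and a merge of linear time bins onto a logarithmic grid. Both are hedged approximations under assumed shapes and bin counts; the paper's exact layer definition and merging scheme are not reproduced here.

```python
import numpy as np

def adder_conv1d(x, weights):
    """AdderNet-style 1-D layer: each output is the negative l1 distance
    between an input window and a filter, so no multiplications are needed.
    x: (length,) decay signal; weights: (n_filters, kernel_size)."""
    n_filters, k = weights.shape
    out = np.empty((n_filters, len(x) - k + 1))
    for i in range(out.shape[1]):
        window = x[i:i + k]
        out[:, i] = -np.abs(window[None, :] - weights).sum(axis=1)
    return out

def log_scale_merge(hist, n_out=32):
    """Merge linear TCSPC bins onto a log-spaced grid to compress the decay
    (one plausible reading of log-scale merging; the exact scheme may differ)."""
    edges = np.unique(np.rint(np.geomspace(1, len(hist), n_out + 1)).astype(int))
    starts = np.concatenate(([0], edges[:-1]))
    return np.add.reduceat(hist, starts)

# Example: compress a 256-bin histogram, then apply a random adder layer
hist = np.random.poisson(5.0, 256)
merged = log_scale_merge(hist, n_out=32)
features = adder_conv1d(merged.astype(float), np.random.randn(4, 5))
```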

    Fast analysis of time‐domain fluorescence lifetime imaging via extreme learning machine

    We present a fast and accurate analytical method for fluorescence lifetime imaging microscopy (FLIM) using the extreme learning machine (ELM). We used extensive metrics to evaluate ELM and existing algorithms. First, we compared these algorithms using synthetic datasets. The results indicate that ELM can obtain higher fidelity, even in low-photon conditions. Afterwards, we used ELM to retrieve lifetime components from human prostate cancer cells loaded with gold nanosensors, showing that ELM also outperforms iterative fitting and non-fitting algorithms. Comparing ELM with a computationally efficient neural network, ELM achieves comparable accuracy with less training and inference time. As there is no back-propagation process for ELM during the training phase, the training speed is much higher than that of existing neural network approaches. The proposed strategy is promising for edge computing with online training.
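    The contrast with back-propagation-trained networks can be made concrete: an ELM fixes a random hidden layer and solves for the output weights in a single least-squares step. The sketch below is a generic ELM regressor for mapping decay histograms to lifetimes; the hidden size, activation, and solver are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

class ELMRegressor:
    """Minimal extreme learning machine: a fixed random hidden layer followed
    by a closed-form least-squares readout, so no back-propagation is needed."""

    def __init__(self, n_hidden=512, rng=None):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng() if rng is None else rng

    def fit(self, X, y):
        # Random input weights/biases stay fixed; only the readout is solved for.
        self.W = self.rng.standard_normal((X.shape[1], self.n_hidden))
        self.b = self.rng.standard_normal(self.n_hidden)
        H = np.tanh(X @ self.W + self.b)
        self.beta, *_ = np.linalg.lstsq(H, y, rcond=None)
        return self

    def predict(self, X):
        return np.tanh(X @ self.W + self.b) @ self.beta

# Example: map decay histograms (rows of X) to lifetimes (y), e.g.
#   elm = ELMRegressor(n_hidden=256).fit(X_train, y_train)
#   tau_pred = elm.predict(X_test)
```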

    Simple and robust deep learning approach for fast fluorescence lifetime imaging

    Fluorescence lifetime imaging (FLIM) is a powerful tool that provides unique quantitative information for biomedical research. In this study, we propose a multi-layer-perceptron-based mixer (MLP-Mixer) deep learning (DL) algorithm named FLIM-MLP-Mixer for fast and robust FLIM analysis. The FLIM-MLP-Mixer has a simple network architecture yet a powerful ability to learn from data. Compared with traditional fitting and previously reported DL methods, the FLIM-MLP-Mixer shows superior performance in terms of accuracy and calculation speed, which has been validated using both synthetic and experimental data. All results indicate that our proposed method is well suited for accurately estimating lifetime parameters from measured fluorescence histograms and has great potential in various real-time FLIM applications.
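    A generic MLP-Mixer applied to a 1-D decay histogram can be sketched as follows: the histogram is split into patches (tokens), a token-mixing MLP operates across patches and a channel-mixing MLP across features, and a small head regresses lifetime parameters. The layer sizes, depth, and class names below are assumptions for illustration, not the published FLIM-MLP-Mixer design.

```python
import torch
import torch.nn as nn

class MixerBlock(nn.Module):
    """One mixer block: a token-mixing MLP across time patches, then a
    channel-mixing MLP across features (generic MLP-Mixer layout)."""
    def __init__(self, n_tokens, dim, token_hidden=64, channel_hidden=128):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)
        self.token_mlp = nn.Sequential(
            nn.Linear(n_tokens, token_hidden), nn.GELU(),
            nn.Linear(token_hidden, n_tokens))
        self.norm2 = nn.LayerNorm(dim)
        self.channel_mlp = nn.Sequential(
            nn.Linear(dim, channel_hidden), nn.GELU(),
            nn.Linear(channel_hidden, dim))

    def forward(self, x):                      # x: (batch, n_tokens, dim)
        y = self.norm1(x).transpose(1, 2)      # mix across tokens
        x = x + self.token_mlp(y).transpose(1, 2)
        return x + self.channel_mlp(self.norm2(x))

class FLIMMixer(nn.Module):
    """Split a decay histogram into patches, embed, mix, and regress
    lifetime parameters (e.g. two lifetimes and a fraction)."""
    def __init__(self, n_bins=256, patch=16, dim=32, depth=4, n_out=3):
        super().__init__()
        self.patch = patch
        self.embed = nn.Linear(patch, dim)
        self.blocks = nn.Sequential(*[MixerBlock(n_bins // patch, dim)
                                      for _ in range(depth)])
        self.head = nn.Linear(dim, n_out)

    def forward(self, hist):                   # hist: (batch, n_bins)
        tokens = hist.unfold(1, self.patch, self.patch)
        x = self.blocks(self.embed(tokens))
        return self.head(x.mean(dim=1))

# Example forward pass with a batch of 256-bin histograms
# params = FLIMMixer()(torch.rand(8, 256))
```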

    Dynamic fluorescence lifetime sensing with CMOS single-photon avalanche diode arrays and deep learning processors

    Measuring the fluorescence lifetimes of fast-moving cells or particles has broad applications in biomedical sciences. This paper presents a dynamic fluorescence lifetime sensing (DFLS) system based on the time-correlated single-photon counting (TCSPC) principle. It integrates a CMOS 192 × 128 single-photon avalanche diode (SPAD) array, offering an enormous photon-counting throughput without pile-up effects. We also proposed a quantized convolutional neural network (QCNN) algorithm and designed a field-programmable gate array (FPGA) embedded processor for fluorescence lifetime determination. The processor uses a simple architecture, showing unparalleled advantages in accuracy, analysis speed, and power consumption, and it can resolve fluorescence lifetimes against disturbing noise. We evaluated the DFLS system using fluorescence dyes and fluorophore-tagged microspheres. The system can effectively measure fluorescence lifetimes within a single exposure period of the SPAD sensor, paving the way for portable time-resolved devices and showing potential in various applications.
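    As a schematic of the TCSPC data flow, the snippet below accumulates per-pixel decay histograms from a stream of photon timestamps and SPAD pixel indices recorded within one exposure. The record format, bin width, and function name are assumptions for illustration, not the sensor's actual readout interface.

```python
import numpy as np

def build_tcspc_histograms(timestamps_ps, pixel_ids, n_pixels,
                           n_bins=64, bin_ps=200):
    """Accumulate per-pixel TCSPC histograms from photon arrival times (ps)
    and SPAD pixel indices recorded within a single exposure."""
    bins = np.minimum(timestamps_ps // bin_ps, n_bins - 1).astype(int)
    hist = np.zeros((n_pixels, n_bins), dtype=np.uint32)
    np.add.at(hist, (pixel_ids, bins), 1)   # scatter-add one count per photon
    return hist

# Example: a 192 x 128 array flattened to 24576 pixels
# hists = build_tcspc_histograms(t_ps, pix, n_pixels=192 * 128)
```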

    Hardware inspired neural network for efficient time-resolved biomedical imaging

    Convolutional neural networks (CNNs) have shown exceptional performance for fluorescence lifetime imaging (FLIM). However, redundant parameters and complicated topologies make it challenging to implement such networks on embedded hardware for real-time processing. We report a lightweight, quantized neural architecture that offers fast FLIM. Forward propagation is significantly simplified by replacing the matrix multiplications in each convolution layer with additions and by quantizing data to a low bit-width. We first used synthetic 3-D lifetime data with given lifetime ranges and photon counts to ensure that correct average lifetimes can be obtained. Afterwards, human prostatic cancer cells incubated with gold nanoprobes were used to validate the feasibility of the network on real-world data. The quantized network yielded a 37.8% compression ratio without performance degradation. Clinical relevance - This neural network can be applied to diagnose cancer early, based on fluorescence lifetime, in a non-invasive way. The approach brings high accuracy and accelerates diagnostic processes for clinicians who are not experts in biomedical signal processing.
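    The low-bit-width side of this design can be illustrated with a post-training weight quantizer. The uniform symmetric scheme below is a generic example under assumed bit-widths; the paper's actual quantizer and its 37.8% compression figure are not derived from it.

```python
import numpy as np

def quantize_weights(w, n_bits=8):
    """Uniform symmetric post-training quantization: map float weights to
    n_bits signed integers plus a single float scale factor."""
    qmax = 2 ** (n_bits - 1) - 1
    scale = max(float(np.abs(w).max()), 1e-12) / qmax
    q = np.clip(np.rint(w / scale), -qmax - 1, qmax)
    return q.astype(np.int8 if n_bits <= 8 else np.int16), scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

# Example: quantize a 1-D convolution kernel and check the rounding error
w = np.random.randn(16, 1, 5).astype(np.float32)
q, s = quantize_weights(w, n_bits=8)
max_err = np.abs(dequantize(q, s) - w).max()
```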

    Smart wide-field fluorescence lifetime imaging system with CMOS single-photon avalanche diode arrays

    Wide-field fluorescence lifetime imaging (FLIM) is a promising technique for biomedical and clinical applications. Integration with CMOS single-photon avalanche diode (SPAD) sensor arrays can lead to cheaper, portable, real-time FLIM systems. However, the FLIM data obtained by such sensor systems often have sophisticated noise features, and there is still a lack of fast tools to efficiently recover lifetime parameters from highly noise-corrupted fluorescence signals. This paper proposes a smart wide-field FLIM system containing a 192 × 128 CMOS SPAD sensor and a field-programmable gate array (FPGA) embedded deep learning (DL) FLIM processor. The processor adopts a hardware-friendly, lightweight neural network for fluorescence lifetime analysis, offering high accuracy against noise, fast speed, and low power consumption. Experimental results demonstrate the proposed system's superior and robust performance, which is promising for many FLIM applications such as FLIM-guided clinical surgeries, cancer diagnosis, and biomedical imaging.
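    The kind of lightweight network such a processor might run can be sketched as a small 1-D CNN that regresses lifetime parameters from a noisy per-pixel histogram. This is an illustrative stand-in under assumed layer sizes; it is not the paper's hardware-friendly architecture.

```python
import torch
import torch.nn as nn

class LiteFLIMNet(nn.Module):
    """A small 1-D CNN that maps a noisy per-pixel decay histogram to
    lifetime parameters; an illustrative stand-in, not the published design."""
    def __init__(self, n_out=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 8, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(8, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1))           # histogram length is flexible
        self.head = nn.Linear(16, n_out)

    def forward(self, hist):                   # hist: (batch, n_bins)
        x = self.features(hist.unsqueeze(1))   # add a channel dimension
        return self.head(x.flatten(1))

# Example forward pass with a batch of 64-bin histograms
# tau = LiteFLIMNet()(torch.rand(4, 64))
```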

    Non-Hermitian electronics for wireless biomedical sensing

    Ph.D. thesis, Doctor of Philosophy (FOE)