    Recent Progress in Image Deblurring

    This paper comprehensively reviews recent developments in image deblurring, including non-blind/blind and spatially invariant/variant techniques. These techniques share the same objective of inferring a latent sharp image from one or several corresponding blurry images, while blind deblurring techniques must additionally estimate an accurate blur kernel. Given the critical role of image restoration in modern imaging systems, which must deliver high-quality images under complex conditions such as motion, undesirable lighting, and imperfect system components, image deblurring has attracted growing attention in recent years. From the viewpoint of how they handle the ill-posedness that is central to deblurring tasks, existing methods can be grouped into five categories: Bayesian inference frameworks, variational methods, sparse representation-based methods, homography-based modeling, and region-based methods. Despite considerable progress, image deblurring, especially the blind case, remains limited by complex application conditions that make the blur kernel hard to estimate and often spatially variant. We provide a holistic understanding of and deep insight into image deblurring in this review. An analysis of the empirical evidence for representative methods and practical issues, as well as a discussion of promising future directions, are also presented. (Comment: 53 pages, 17 figures)
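    For readers new to the area, a minimal sketch of one classical non-blind baseline in this literature, Wiener deconvolution, may help fix ideas. It assumes a known, spatially invariant blur kernel; the noise-to-signal ratio nsr is an illustrative regularization parameter, not a value from the paper.

        import numpy as np

        def wiener_deblur(blurry, kernel, nsr=0.01):
            """Non-blind deblurring via Wiener deconvolution in the Fourier domain."""
            # Zero-pad the kernel to the image size and center it so that
            # circular convolution matches the spatial blur.
            psf = np.zeros_like(blurry, dtype=float)
            kh, kw = kernel.shape
            psf[:kh, :kw] = kernel
            psf = np.roll(psf, (-(kh // 2), -(kw // 2)), axis=(0, 1))
            H = np.fft.fft2(psf)          # transfer function of the blur
            Y = np.fft.fft2(blurry)       # spectrum of the observed image
            # conj(H) / (|H|^2 + nsr) tames near-zero frequencies of H,
            # which is exactly where the inverse problem is ill-posed.
            X = np.conj(H) * Y / (np.abs(H) ** 2 + nsr)
            return np.real(np.fft.ifft2(X))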

    Maximum likelihood estimation of blood velocity using Doppler optical coherence tomography

    A recent trend in optical coherence tomography (OCT) hardware has been the move towards higher A-scan rates. However, the estimation of axial blood flow velocities is affected by the presence and type of noise, as well as by the estimation method; higher acquisition rates alone do not enable accurate quantification of axial blood velocity. Moreover, decorrelation is an unavoidable feature of OCT signals whenever there is motion relative to the OCT beam. For in vivo OCT measurements of blood flow, decorrelation noise affects Doppler frequency estimation by broadening the signal spectrum. Here we derive a maximum likelihood estimator (MLE) for Doppler frequency estimation that takes spectral broadening due to decorrelation into account, and we compare this estimator with existing techniques. Both theory and experiment show that the estimator is effective and outperforms the Kasai and additive white Gaussian noise (AWGN) ML estimators. We find that maximum likelihood estimation can be useful for estimating Doppler shifts for slow axial flow and near-transverse flow. Due to the inherent linear relationship between decorrelation and the Doppler shift of scatterers moving relative to an OCT beam, decorrelation itself may serve as a measure of flow speed.
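    As background for the comparison, minimal sketches of the two baseline estimators named in the abstract follow: the Kasai autocorrelation estimator, and the AWGN ML estimator, which for a single complex tone in white noise reduces to locating the periodogram peak. The paper's decorrelation-aware MLE is not reproduced here; the function names are illustrative.

        import numpy as np

        def kasai_frequency(z, dt):
            """Kasai autocorrelation estimate of the mean Doppler frequency.
            z: complex OCT signal at one depth; dt: A-scan interval (s)."""
            # The phase of the lag-1 autocorrelation encodes the mean shift.
            r1 = np.sum(z[1:] * np.conj(z[:-1]))
            return np.angle(r1) / (2 * np.pi * dt)

        def awgn_ml_frequency(z, dt, n_grid=4096):
            """ML frequency estimate for a single tone in AWGN, i.e. the
            periodogram maximizer, evaluated on a zero-padded FFT grid."""
            freqs = np.fft.fftfreq(n_grid, d=dt)
            power = np.abs(np.fft.fft(z, n=n_grid)) ** 2
            return freqs[np.argmax(power)]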

    Maximum Likelihood Estimation for Single Particle, Passive Microrheology Data with Drift

    Volume limitations and low yield thresholds of biological fluids have led to widespread use of passive microparticle rheology. The mean-squared-displacement (MSD) statistics of bead position time series (bead paths) are either applied directly to determine the creep compliance [Xu et al (1998)] or transformed to determine dynamic storage and loss moduli [Mason & Weitz (1995)]. A prevalent hurdle arises when there is non-diffusive experimental drift in the data. Commensurate with the magnitude of drift relative to diffusive mobility, quantified by a Péclet number, the MSD statistics are distorted, and thus the path data must be "corrected" for drift. The standard approach is to estimate and subtract the drift from particle paths, and then calculate MSD statistics. We present an alternative, parametric approach using maximum likelihood estimation that simultaneously fits drift and diffusive model parameters from the path data; the MSD statistics (and consequently the compliance and dynamic moduli) then follow directly from the best-fit model. We illustrate and compare both methods on simulated path data over a range of Péclet numbers, where exact answers are known. We choose fractional Brownian motion as the numerical model because it affords tunable, sub-diffusive MSD statistics consistent with typical 30-second-long experimental observations of microbeads in several biological fluids. Finally, we apply and compare both methods on data from human bronchial epithelial cell culture mucus. (Comment: 29 pages, 12 figures)
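    A minimal sketch of the parametric idea, not the authors' exact model or code: treat the observed path as Gaussian with a linear-drift mean and a fractional Brownian motion covariance, then maximize the likelihood jointly over drift, scale, and Hurst exponent. Observation times are assumed to start after t = 0 so the covariance matrix is nonsingular.

        import numpy as np
        from scipy.optimize import minimize

        def fbm_covariance(t, hurst):
            """Covariance of unit-scale fractional Brownian motion at times t."""
            t = np.asarray(t, dtype=float)
            return 0.5 * (t[:, None] ** (2 * hurst) + t[None, :] ** (2 * hurst)
                          - np.abs(t[:, None] - t[None, :]) ** (2 * hurst))

        def neg_log_likelihood(params, t, x):
            """-log L for the model x(t) = v*t + sqrt(s2) * B_H(t)."""
            v, log_s2, hurst = params
            if not 0.01 < hurst < 0.99:
                return np.inf
            resid = x - v * t                     # remove the drift mean
            cov = np.exp(log_s2) * fbm_covariance(t, hurst)
            try:
                L = np.linalg.cholesky(cov)       # stable quadratic form / log-det
            except np.linalg.LinAlgError:
                return np.inf
            alpha = np.linalg.solve(L, resid)
            return 0.5 * (alpha @ alpha) + np.log(np.diag(L)).sum()

        def fit_drift_fbm(t, x):
            """Jointly fit drift v, scale s2, and Hurst exponent H by MLE."""
            res = minimize(neg_log_likelihood, x0=[0.0, 0.0, 0.5],
                           args=(t, x), method="Nelder-Mead")
            v, log_s2, hurst = res.x
            return v, np.exp(log_s2), hurst

    Under this model the drift-free MSD at lag tau is s2 * tau^(2H), so the compliance and dynamic moduli follow from the fitted (s2, H) exactly as they would from empirical MSD statistics.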

    Comparison of Kasai autocorrelation and maximum likelihood estimators for Doppler optical coherence tomography


    CELLO: A fast algorithm for Covariance Estimation

    We present CELLO (Covariance Estimation and Learning through Likelihood Optimization), an algorithm for predicting the covariances of measurements based on any available informative features. This algorithm is intended to improve the accuracy and reliability of on-line state estimation by providing a principled way to extend the conventional fixed-covariance Gaussian measurement model. We show in experiments that CELLO learns to predict measurement covariances that agree with empirical covariances obtained by manually annotating sensor regimes. We also show that using the learned covariances during filtering provides substantial quantitative improvement to the overall state estimate. © 2013 IEEE.
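    The paper's own formulation is not reproduced here; as an illustration of the general idea, the sketch below learns a feature-dependent measurement variance by likelihood optimization under an assumed log-linear variance model (scalar case, whereas CELLO itself handles full covariances).

        import numpy as np
        from scipy.optimize import minimize

        def nll(w, features, residuals):
            """Negative log-likelihood of zero-mean Gaussian residuals whose
            variance exp(phi(x) . w) is a log-linear function of features."""
            log_var = features @ w            # predicted log-variance per sample
            return 0.5 * np.sum(log_var + residuals ** 2 * np.exp(-log_var))

        def learn_variance_model(features, residuals):
            """Fit w on training pairs (phi(x), measurement residual);
            features should include a constant column for the bias."""
            w0 = np.zeros(features.shape[1])
            return minimize(nll, w0, args=(features, residuals), method="BFGS").x

        def predict_variance(w, features):
            """Plug the predicted variance into the filter's measurement model."""
            return np.exp(features @ w)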

    Single Particle Tracking: Analysis Techniques for Live Cell Nanoscopy.

    Single molecule experiments are designed specifically to study the properties of individual molecules. Only in the last three decades have single molecule experiments been applied to the life sciences, where they have been successfully implemented in systems biology for probing the behavior of sub-cellular mechanisms. The advent and growth of super-resolution techniques in single molecule experiments has made the fundamental behavior of light and the associated nano-probes a necessary concern for life scientists wishing to advance the state of human knowledge in biology. This dissertation disseminates some of the practices learned in experimental live cell microscopy. The topic of single particle tracking is addressed here in a format designed for the physicist who embarks upon single molecule studies. Specifically, the focus is on the procedures needed to develop single particle tracking analysis techniques that can be implemented to answer biological questions. These analysis techniques range from designing and testing a particle tracking algorithm to inferring model parameters once an image has been processed. The intellectual contributions of the author include techniques in diffusion estimation, localization filtering, and trajectory association for tracking, all discussed in detail in later chapters. The author has also contributed to the software development of automated gain calibration, live cell particle simulations, and various single particle tracking packages. Future work includes further evaluation of this laboratory's single particle tracking software, entropy-based approaches towards hypothesis validation, and the uncertainty quantification of gain calibration.
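    Not the dissertation's algorithm; purely as an illustration of the trajectory-association step, here is a minimal greedy nearest-neighbor linker between detections in consecutive frames, with a gating radius to reject implausible jumps.

        import numpy as np

        def link_frames(prev_pts, next_pts, max_disp):
            """Greedily link (N, 2) localizations in one frame to (M, 2)
            localizations in the next, closest pairs first."""
            dists = np.linalg.norm(prev_pts[:, None, :] - next_pts[None, :, :],
                                   axis=2)
            order = np.unravel_index(np.argsort(dists, axis=None), dists.shape)
            links, used_i, used_j = [], set(), set()
            for i, j in zip(*order):
                if dists[i, j] > max_disp:
                    break                     # remaining pairs are farther still
                if i not in used_i and j not in used_j:
                    links.append((i, j))      # extend trajectory i with point j
                    used_i.add(i)
                    used_j.add(j)
            return links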

    Optimizing experimental parameters for tracking of diffusing particles

    We describe how a single-particle tracking experiment should be designed in order for its recorded trajectories to contain the most information about a tracked particle's diffusion coefficient. The precision of estimators for the diffusion coefficient is affected by motion blur, limited photon statistics, and the length of the recorded time series. We demonstrate for a particle undergoing free diffusion that precision is negligibly affected by motion blur in typical experiments, while optimizing photon counts and the number of recorded frames is key to precision. Building on these results, we describe for a wide range of experimental scenarios how to choose experimental parameters in order to optimize precision. Generally, one should choose quantity over quality: experiments should be designed to maximize the number of frames recorded in a time series, even if this means lower information content in individual frames.
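    One estimator from the single-particle-tracking literature that explicitly accounts for both motion blur and localization noise is the covariance-based estimator (CVE) of Vestergaard et al.; the abstract does not say which estimator the authors analyze, so the sketch below is illustrative background rather than the paper's method.

        import numpy as np

        def cve_diffusion(x, dt, R=1/6):
            """Covariance-based estimate of D for free diffusion observed with
            localization noise and motion blur.
            x: 1-D positions; dt: frame interval; R: blur coefficient
            (R = 1/6 for continuous full-frame illumination)."""
            d = np.diff(x)                      # frame-to-frame displacements
            mean_d2 = np.mean(d ** 2)           # <dx_n^2>
            mean_dd = np.mean(d[1:] * d[:-1])   # <dx_n dx_{n+1}>
            D = mean_d2 / (2 * dt) + mean_dd / dt
            sigma2 = R * mean_d2 + (2 * R - 1) * mean_dd
            return D, sigma2                    # D and noise-variance estimate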