
    Poisson noise reduction with non-local PCA

    Photon-limited imaging arises when the number of photons collected by a sensor array is small relative to the number of detector elements. Photon limitations are an important concern for many applications, such as spectral imaging, night vision, nuclear medicine, and astronomy. Typically a Poisson distribution is used to model these observations, and the inherent heteroscedasticity of the data combined with standard noise-removal methods yields significant artifacts. This paper introduces a novel denoising algorithm for photon-limited images which combines elements of dictionary learning and sparse patch-based representations of images. The method employs both an adaptation of Principal Component Analysis (PCA) for Poisson noise and recently developed sparsity-regularized convex optimization algorithms for photon-limited images. A comprehensive empirical evaluation of the proposed method helps characterize its performance relative to other state-of-the-art denoising methods. The results reveal that, despite its conceptual simplicity, Poisson PCA-based denoising appears to be highly competitive in very low-light regimes.
    Comment: erratum: image "man" is wrongly named "pepper" in the journal version
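The patch-PCA idea behind this abstract can be illustrated with a toy sketch. Note this is ordinary (Gaussian-style) PCA on non-overlapping patches, not the paper's Poisson-adapted PCA or its non-local patch grouping; the patch size, number of retained components, and test image are illustrative assumptions.

```python
import numpy as np

def patch_pca_denoise(img, patch=8, keep=4):
    """Denoise by projecting non-overlapping patches onto the top principal
    components. Assumes image dimensions are multiples of `patch`."""
    H, W = img.shape
    starts = [(i, j) for i in range(0, H, patch) for j in range(0, W, patch)]
    # Stack each patch as a row of a data matrix.
    X = np.stack([img[i:i + patch, j:j + patch].ravel() for i, j in starts])
    mu = X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
    # Low-rank projection: keep only the leading principal components.
    X_hat = (U[:, :keep] * s[:keep]) @ Vt[:keep] + mu
    out = np.empty((H, W))
    for row, (i, j) in zip(X_hat, starts):
        out[i:i + patch, j:j + patch] = row.reshape(patch, patch)
    return out

# Demo on a smooth intensity field corrupted by Poisson noise (toy data).
rng = np.random.default_rng(0)
xx = np.tile(np.arange(64.0), (64, 1))
clean = 40.0 + 20.0 * np.sin(xx / 10.0)
noisy = rng.poisson(clean).astype(float)
denoised = patch_pca_denoise(noisy)
```

Because the smooth patches lie near a low-dimensional subspace while the Poisson noise spreads over all components, the projection removes most of the noise.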

    Particle detection and tracking in fluorescence time-lapse imaging: a contrario approach

    This paper proposes a probabilistic approach for the detection and tracking of particles in fluorescence time-lapse imaging. In the presence of very noisy, poor-quality data, particles and trajectories can be characterized by an a contrario model, which estimates the probability of observing the structures of interest in random data. This approach, first introduced in the modeling of human visual perception and then successfully applied to many image-processing tasks, leads to algorithms that require neither a prior learning stage nor tedious parameter tuning, and that are very robust to noise. Comparative evaluations against a well-established baseline show that the proposed approach outperforms the state of the art.
    Comment: published in Machine Vision and Applications
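The a contrario principle can be sketched with a toy detection test (illustrative numbers, not the paper's particle and trajectory models): a candidate structure is kept when its expected number of occurrences in pure noise, often called the Number of False Alarms (NFA), falls below a small threshold such as 1.

```python
import math

def nfa(num_tests, k, n, p):
    """Expected number of tested windows in random data showing at least k
    active pixels out of n, i.e. num_tests * P(Binomial(n, p) >= k)."""
    tail = sum(math.comb(n, i) * p ** i * (1 - p) ** (n - i)
               for i in range(k, n + 1))
    return num_tests * tail

# 9 of 10 pixels above threshold is essentially impossible in noise where
# each pixel is active with probability 0.1, so NFA << 1 (detection) ...
nfa_particle = nfa(10_000, 9, 10, 0.1)
# ... while 2 of 10 is an ordinary fluctuation, so NFA >> 1 (rejected).
nfa_noise = nfa(10_000, 2, 10, 0.1)
```

The single NFA threshold replaces per-image parameter tuning, which is the practical appeal the abstract describes.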

    A proximal iteration for deconvolving Poisson noisy images using sparse representations

    We propose an image deconvolution algorithm for data contaminated by Poisson noise. The image to restore is assumed to be sparsely represented in a dictionary of waveforms such as the wavelet or curvelet transforms. Our key contributions are as follows. First, we handle the Poisson noise properly by using the Anscombe variance-stabilizing transform, leading to a non-linear degradation equation with additive Gaussian noise. Second, the deconvolution problem is formulated as the minimization of a convex functional with a data-fidelity term reflecting the noise properties and a non-smooth sparsity-promoting penalty over the image representation coefficients (e.g. the ℓ1-norm). Third, a fast iterative forward-backward splitting algorithm is proposed to solve the minimization problem. We derive existence and uniqueness conditions for the solution and establish convergence of the iterative algorithm. Finally, a GCV-based model selection procedure is proposed to objectively select the regularization parameter. Experimental results show the striking benefits gained from taking the Poisson statistics of the noise into account. These results also suggest that sparse-domain regularization may be tractable in many deconvolution applications with Poisson noise, such as astronomy and microscopy.
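A minimal sketch of the forward-backward (ISTA-style) iteration on the Gaussianized problem min_x 0.5·||y − Ax||² + λ·||x||₁. Here a generic random matrix A stands in for the paper's convolution operator composed with a wavelet/curvelet dictionary, and λ is fixed by hand rather than chosen by GCV; all specific values are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (componentwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, lam, n_iter=500):
    step = 1.0 / np.linalg.norm(A, 2) ** 2            # 1 / Lipschitz(grad)
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)                      # forward (gradient) step
        x = soft_threshold(x - step * grad, step * lam)  # backward (prox) step
    return x

# Toy sparse-recovery demo with additive Gaussian noise.
rng = np.random.default_rng(1)
A = rng.standard_normal((40, 100)) / np.sqrt(40)
x_true = np.zeros(100)
x_true[[5, 37, 80]] = [3.0, -2.0, 4.0]
y = A @ x_true + 0.05 * rng.standard_normal(40)
x_hat = ista(A, y, lam=0.1)
```

Each iteration alternates an explicit gradient step on the smooth data-fidelity term with the closed-form prox of the ℓ1 penalty, which is exactly the splitting structure the abstract refers to.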

    Undecimated Haar thresholding for Poisson intensity estimation


    Toward quantitative super-resolution microscopy: molecular maps with statistical guarantees

    Quantifying the number of molecules from fluorescence microscopy measurements is an important topic in cell biology and medical research. In this work, we present a consecutive algorithm for super-resolution (stimulated emission depletion, STED) scanning microscopy that provides molecule counts in automatically generated image segments and offers statistical guarantees in the form of asymptotic confidence intervals. To this end, we first apply a multiscale scanning procedure to STED microscopy measurements of the sample to obtain a system of significant regions, each of which contains at least one molecule with prescribed uniform probability. This system of regions will typically be highly redundant and consist of rectangular building blocks. To choose an informative but non-redundant subset of more naturally shaped regions, we hybridize our system with the result of a generic segmentation algorithm. The diameter of the segments can be of the order of the resolution of the microscope. Using multiple photon coincidence measurements of the same sample in confocal mode, we are then able to estimate the brightness and number of molecules and give uniform confidence intervals on the molecule counts for each previously constructed segment. In other words, we establish a so-called molecular map with uniform error control. The performance of the algorithm is investigated on simulated and real data.
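The first stage's logic (flag regions whose photon counts cannot be explained by background, with uniform error control over all tested regions) can be sketched in a toy one-dimensional form. This is a single-scale Bonferroni scan, not the paper's multiscale STED procedure; the rates, window size, and data are illustrative assumptions.

```python
import numpy as np
from math import exp

def poisson_sf(k, mu):
    """P(Poisson(mu) >= k), computed via the complementary cdf."""
    term, cdf = exp(-mu), 0.0
    for i in range(k):
        cdf += term
        term *= mu / (i + 1)
    return max(1.0 - cdf, 0.0)

def scan_significant(counts, window, background_rate, alpha=0.05):
    """Return windows whose count is significant at family-wise level alpha,
    so every flagged region holds signal with uniform probability >= 1-alpha."""
    n_windows = len(counts) - window + 1
    thresh = alpha / n_windows              # Bonferroni over all windows
    mu = background_rate * window           # expected background count
    return [(i, i + window)
            for i in range(n_windows)
            if poisson_sf(int(counts[i:i + window].sum()), mu) < thresh]

# Toy data: flat photon background with one bright emitter.
rng = np.random.default_rng(2)
counts = rng.poisson(2.0, size=200)
counts[100:105] += 6                        # emitter adds ~30 photons
hits = scan_significant(counts, window=5, background_rate=2.0)
```

The Bonferroni division by the number of tested windows is a crude stand-in for the multiscale calibration in the paper, but it conveys the same "uniform probability over all regions" guarantee.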