
    Restoration of Poissonian Images Using Alternating Direction Optimization

    Much research has been devoted to the problem of restoring Poissonian images, namely for medical and astronomical applications. However, the restoration of these images using state-of-the-art regularizers (such as those based on multiscale representations or total variation) is still an active research area, since the associated optimization problems are quite challenging. In this paper, we propose an approach to deconvolving Poissonian images, which is based on an alternating direction optimization method. The standard regularization (or maximum a posteriori) restoration criterion, which combines the Poisson log-likelihood with a (non-smooth) convex regularizer (log-prior), leads to hard optimization problems: the log-likelihood is non-quadratic and non-separable, the regularizer is non-smooth, and there is a non-negativity constraint. Using standard convex analysis tools, we present sufficient conditions for existence and uniqueness of solutions of these optimization problems, for several types of regularizers: total-variation, frame-based analysis, and frame-based synthesis. We attack these problems with an instance of the alternating direction method of multipliers (ADMM), which belongs to the family of augmented Lagrangian algorithms. We study sufficient conditions for convergence and show that these are satisfied under either total-variation or frame-based (analysis and synthesis) regularization. The resulting algorithms are shown to outperform alternative state-of-the-art methods, both in terms of speed and restoration accuracy. Comment: 12 pages, 12 figures, 2 tables. Submitted to the IEEE Transactions on Image Processing.
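
As a hedged illustration of the splitting described above, the following sketch applies ADMM to a toy 1-D Poisson deconvolution with an ℓ1 regularizer and a non-negativity constraint; the blur kernel, problem sizes, and parameter values are illustrative assumptions, not the paper's experimental setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D Poisson deconvolution problem (all sizes/values are illustrative).
n = 128
x_true = np.zeros(n); x_true[30:40] = 20.0; x_true[70] = 50.0
h = np.exp(-0.5 * (np.arange(-5, 6) / 2.0) ** 2); h /= h.sum()
H = np.fft.fft(np.roll(np.pad(h, (0, n - h.size)), -5))  # circulant blur spectrum
blur = lambda v: np.real(np.fft.ifft(H * np.fft.fft(v)))
blurT = lambda v: np.real(np.fft.ifft(np.conj(H) * np.fft.fft(v)))
y = rng.poisson(np.maximum(blur(x_true), 0.0)).astype(float)

tau, rho, iters = 0.05, 1.0, 200

def prox_poisson(v, y, rho):
    """argmin_z (z - y*log z) + (rho/2)(z - v)^2 -- closed-form positive root."""
    b = rho * v - 1.0
    return (b + np.sqrt(b * b + 4.0 * rho * y)) / (2.0 * rho)

soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = y.copy()
z1 = blur(x); z2 = x.copy()          # splits: z1 = Hx (data), z2 = x (prior)
u1 = np.zeros(n); u2 = np.zeros(n)   # scaled dual variables
denom = np.abs(H) ** 2 + 1.0         # spectrum of H^T H + I
for _ in range(iters):
    # x-update: (H^T H + I) x = H^T (z1 - u1) + (z2 - u2), solved via FFT
    rhs = blurT(z1 - u1) + (z2 - u2)
    x = np.real(np.fft.ifft(np.fft.fft(rhs) / denom))
    z1 = prox_poisson(blur(x) + u1, y, rho)          # Poisson log-likelihood prox
    z2 = np.maximum(soft(x + u2, tau / rho), 0.0)    # l1 prox + non-negativity
    u1 += blur(x) - z1
    u2 += x - z2
```

The key point mirrored from the abstract: the non-quadratic Poisson term, the non-smooth prior, and the non-negativity constraint are each handled in their own sub-problem, and the Poisson proximity step has a closed form (root of a quadratic).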

    A Deconvolution Framework with Applications in Medical and Biological Imaging

    A deconvolution framework is presented in this thesis and applied to several problems in medical and biological imaging. The framework is designed to contain state-of-the-art deconvolution methods, to be easily expandable, and to allow different components to be combined arbitrarily. Deconvolution is an inverse problem, and in order to cope with its ill-posed nature, suitable regularization techniques and additional restrictions are required. A main objective of deconvolution methods is to restore degraded images acquired by fluorescence microscopy, which has become an important tool in the biological and medical sciences. Fluorescence microscopy images are degraded by out-of-focus blurring and noise, and the deconvolution algorithms used to restore them are usually called deblurring methods. Many deblurring methods proposed over the last decade to restore such images are part of the deconvolution framework. In addition, existing deblurring techniques are improved and new components for the deconvolution framework are developed. A considerable improvement could be obtained by combining a state-of-the-art regularization technique with an additional non-negativity constraint. A real biological screen analysing a specific protein in human cells is presented and shows the need to analyse structural information in fluorescence images. Such an analysis requires good image quality, which the deblurring methods aim to provide when it is not given. For a reliable understanding of cells and cellular processes, high-resolution 3D images of the investigated cells are necessary. However, the ability of fluorescence microscopes to image a cell in 3D is limited, since the resolution along the optical axis is worse than the transversal resolution by a factor of three. Standard microscopy image deblurring techniques are able to improve the resolution, but the lower resolution along the optical axis remains.
It is, however, possible to overcome this problem using axial tomography, which provides tilted views of the object by rotating it under the microscope. The rotated images contain additional information about the object, which can be used to improve the resolution along the optical axis. In this thesis, a sophisticated method to reconstruct a high-resolution axial tomography image on the basis of the developed deblurring methods is presented. The deconvolution methods are also used to reconstruct the dose distribution in proton therapy on the basis of measured PET images. Positron emitters are activated by proton beams, but a PET image is not directly proportional to the delivered radiation dose distribution. A PET signal can be predicted by a convolution of the planned dose with specific filter functions. In this thesis, a dose reconstruction method based on PET images which reverses this convolution approach is presented, and the potential to reconstruct the actually delivered dose distribution from measured PET images is investigated. Last but not least, a new denoising method using higher-order statistics of a given Gaussian noise signal is presented and compared to state-of-the-art denoising methods.
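
The forward model stated above (PET signal ≈ planned dose convolved with filter functions) suggests that reversing it amounts to a deconvolution. The sketch below inverts a toy 1-D version of that model with simple Tikhonov-regularized Fourier division; the dose profile, filter shape, and regularization strength are all illustrative assumptions, not the thesis' method:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy forward model: PET signal = dose (x) filter + noise (1-D, illustrative).
n = 256
z = np.arange(n)
dose = np.where((z > 40) & (z < 160), 1.0, 0.0)          # idealized dose plateau
filt = np.exp(-np.abs(np.arange(-20, 21)) / 6.0); filt /= filt.sum()
F = np.fft.fft(np.roll(np.pad(filt, (0, n - filt.size)), -20))
pet = np.real(np.fft.ifft(F * np.fft.fft(dose))) + 0.01 * rng.standard_normal(n)

# Tikhonov-regularized inversion of the convolution in the Fourier domain.
eps = 1e-2                                               # regularization strength
dose_rec = np.real(np.fft.ifft(
    np.conj(F) * np.fft.fft(pet) / (np.abs(F) ** 2 + eps)))
```

Without the `eps` term, dividing by small filter-spectrum values would amplify noise, which is the ill-posedness the thesis' regularization techniques address.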

    Comparison of Computational Methods Developed to Address Depth-variant Imaging in Fluorescence Microscopy

    In three-dimensional fluorescence microscopy, the image formation process is inherently depth variant (DV) due to the refractive index mismatch between imaging layers, which causes depth-induced spherical aberration (SA). In this study, we present a quantitative comparison among different image restoration techniques developed on the basis of a DV imaging model for microscopy, in order to assess their ability to correct SA and their impact on restoration. The imaging models approximate DV imaging by stratifying either the object space or the image space. For reconstruction, we used regularized DV algorithms based on object stratification: expectation maximization (EM), conjugate gradient (CG), principal-component-analysis-based expectation maximization (PCA-EM), and inverse filtering (IF). Reconstructions from simulated and measured data show that better restoration results are achieved with the DV PCA-EM method than with the other DV algorithms, in terms of both execution time and restoration quality.
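
For orientation, the EM iteration in this family reduces, in the depth-invariant case, to the classical Richardson-Lucy multiplicative update. The sketch below shows that simplified 1-D form; the depth-variant variants compared in the study stratify the PSF by depth, which is omitted here, and all sizes and values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

# Depth-invariant Richardson-Lucy (EM) deconvolution, 1-D toy instance.
n = 100
obj = np.zeros(n); obj[40:45] = 30.0
psf = np.exp(-0.5 * (np.arange(-4, 5) / 1.5) ** 2); psf /= psf.sum()

def conv(v, k):
    return np.convolve(v, k, mode="same")

img = rng.poisson(conv(obj, psf)).astype(float)

est = np.full(n, img.mean())             # flat positive initialization
for _ in range(100):
    ratio = img / np.maximum(conv(est, psf), 1e-12)
    est *= conv(ratio, psf[::-1])        # multiplicative EM update
```

The multiplicative form keeps the estimate non-negative and (for a normalized PSF) approximately flux-preserving, which is why EM-type iterations are natural for Poisson-distributed microscopy data.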

    Sparse Poisson Noisy Image Deblurring

    Deblurring noisy Poisson images has recently been the subject of an increasing number of works in many areas, such as astronomy and biological imaging. In this paper, we focus on confocal microscopy, which is a very popular technique for 3D imaging of living biological specimens that gives images with very good resolution (several hundreds of nanometers), even though they are degraded by both blur and Poisson noise. Deconvolution methods have been proposed to reduce these degradations, and we focus in this paper on techniques which introduce an explicit prior on the solution. One difficulty of these techniques is setting the value of the parameter which weights the trade-off between the data term and the regularizing term. Only a few works have been devoted to the automatic selection of this regularizing parameter under Poisson noise, so it is often set manually to give the best visual results. We present here two recent methods to estimate this regularizing parameter, and we first propose an improvement of these estimators which takes advantage of confocal images. Building on these estimators, we secondly propose to express the problem of deconvolving Poisson noisy images as the minimization of a new constrained problem. The proposed constrained formulation is well suited to this application domain, since it is directly expressed using the anti-log-likelihood of the Poisson distribution and therefore does not require any approximation. We show how to solve the unconstrained and constrained problems using the recent Alternating Direction technique, and we present results on synthetic and real data using well-known priors such as Total Variation and wavelet transforms. Among these wavelet transforms, we especially focus on the Dual-Tree Complex Wavelet transform and on a dictionary composed of Curvelets and the undecimated wavelet transform.
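
The "anti-log-likelihood of the Poisson distribution" used as the data term above has a simple explicit form. The sketch below writes it out with its gradient and checks the gradient by finite differences; the function names and toy values are illustrative, not from the paper:

```python
import numpy as np

def poisson_data_term(hx, y, eps=1e-12):
    """Poisson anti-log-likelihood sum_i (Hx)_i - y_i*log((Hx)_i), up to a constant in x."""
    hx = np.maximum(hx, eps)
    return float(np.sum(hx - y * np.log(hx)))

def poisson_data_grad(hx, y, eps=1e-12):
    """Gradient with respect to Hx: 1 - y/(Hx)."""
    hx = np.maximum(hx, eps)
    return 1.0 - y / hx

# Quick finite-difference check of the gradient on toy values.
hx = np.array([2.0, 1.0, 4.0])
y = np.array([3.0, 0.0, 5.0])
step = 1e-6
num_grad = np.array([
    (poisson_data_term(hx + step * e, y) - poisson_data_term(hx - step * e, y))
    / (2 * step)
    for e in np.eye(3)
])
```

Because this term is convex and smooth on the positive orthant, it can be minimized directly, without the Gaussian or Anscombe approximations the abstract says the constrained formulation avoids.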

    Complex wavelet regularization for 3D confocal microscopy deconvolution

    Confocal microscopy is an increasingly popular technique for 3D imaging of biological specimens which gives images with very good resolution (several tenths of a micrometer), even though they are degraded by both blur and Poisson noise. Deconvolution methods have been proposed to reduce these degradations, some of them regularized with a Total Variation prior, which gives good results in image restoration but does not allow the thin details (including the textures) of the specimens to be retrieved. We first propose here to use instead a wavelet prior based on the Dual-Tree Complex Wavelet transform to retrieve the thin details of the object. As the efficiency of the regularizing prior also depends on the choice of its regularization parameter, we secondly propose a method to select this parameter following a discrepancy principle for Poisson noise. Finally, in order to implement the proposed deconvolution method, we introduce an algorithm based on the Alternating Direction technique which avoids the inherent stability problems of the Richardson-Lucy multiplicative algorithm widely used in 3D image restoration. We show results on real and synthetic data, and compare them to those obtained with the Total Variation and Curvelet priors. We also give preliminary results on a modification of the wavelet transform that handles the anisotropic sampling of 3D confocal images.
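
A discrepancy principle for Poisson noise, as invoked above, tunes the regularization so that the generalized (Kullback-type) discrepancy between data and estimate matches its expected value. The sketch below demonstrates the idea on a toy 1-D denoising stand-in, tuning a smoothing width by bisection rather than the paper's deconvolution parameter; the signal, estimator, and target value are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

n = 400
truth = 10.0 + 8.0 * np.sin(np.linspace(0, 4 * np.pi, n)) ** 2
y = rng.poisson(truth).astype(float)

def smooth(v, width):
    """Gaussian smoothing as a stand-in for a regularized estimator."""
    k = np.exp(-0.5 * (np.arange(-25, 26) / max(width, 1e-3)) ** 2)
    k /= k.sum()
    return np.convolve(v, k, mode="same")

def poisson_discrepancy(y, x):
    """Generalized Kullback discrepancy 2*sum(y*log(y/x) + x - y)."""
    x = np.maximum(x, 1e-12)
    with np.errstate(divide="ignore", invalid="ignore"):
        t = np.where(y > 0, y * np.log(y / x), 0.0)
    return 2.0 * np.sum(t + x - y)

# Bisection so the discrepancy matches its expected value (~ number of samples).
lo, hi = 0.1, 25.0
for _ in range(40):
    mid = 0.5 * (lo + hi)
    if poisson_discrepancy(y, smooth(y, mid)) < n:
        lo = mid          # under-smoothed: discrepancy too small, smooth more
    else:
        hi = mid
width = 0.5 * (lo + hi)
```

The same matching condition can drive the selection of a deconvolution regularization parameter, which is the setting the abstract actually addresses.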

    First order algorithms in variational image processing

    Variational methods in imaging are nowadays developing towards a quite universal and flexible tool, allowing for highly successful approaches on tasks like denoising, deblurring, inpainting, segmentation, super-resolution, disparity, and optical flow estimation. The overall structure of such approaches is of the form $\mathcal{D}(Ku) + \alpha \mathcal{R}(u) \rightarrow \min_u$, where the functional $\mathcal{D}$ is a data fidelity term, depending on some input data $f$ and measuring the deviation of $Ku$ from it, and $\mathcal{R}$ is a regularization functional. Moreover, $K$ is an (often linear) forward operator modeling the dependence of data on an underlying image, and $\alpha$ is a positive regularization parameter. While $\mathcal{D}$ is often smooth and (strictly) convex, the current practice almost exclusively uses nonsmooth regularization functionals. The majority of successful techniques use nonsmooth convex functionals like the total variation and generalizations thereof, or $\ell_1$-norms of coefficients arising from scalar products with some frame system. The efficient solution of such variational problems in imaging demands appropriate algorithms. Taking into account the specific structure as a sum of two very different terms to be minimized, splitting algorithms are a quite canonical choice. Consequently this field has revived the interest in techniques like operator splittings or augmented Lagrangians. Here we provide an overview of currently developed methods and recent results, as well as some computational studies comparing different methods and illustrating their success in applications. Comment: 60 pages, 33 figures.
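
The generic structure $\mathcal{D}(Ku) + \alpha \mathcal{R}(u)$ can be illustrated with the simplest first-order splitting, forward-backward (proximal gradient): a gradient step on the smooth data term followed by a proximal step on the nonsmooth regularizer. The instance below uses $\mathcal{D} = \frac{1}{2}\|Ku - f\|^2$ and $\mathcal{R} = \|u\|_1$ on a small random problem; all sizes, the operator $K$, and parameter values are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

# Forward-backward splitting for D(Ku) + alpha*R(u) with
# D = 0.5*||Ku - f||^2 and R = ||u||_1 (small synthetic instance).
m, n, alpha = 60, 80, 0.1
K = rng.standard_normal((m, n)) / np.sqrt(m)
u_true = np.zeros(n)
u_true[rng.choice(n, 5, replace=False)] = 3.0 * rng.standard_normal(5)
f = K @ u_true + 0.01 * rng.standard_normal(m)

step = 1.0 / np.linalg.norm(K, 2) ** 2       # 1/L, L = Lipschitz const. of grad D
soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

u = np.zeros(n)
for _ in range(500):
    grad = K.T @ (K @ u - f)                 # gradient of the smooth data term
    u = soft(u - step * grad, alpha * step)  # prox of alpha*step*||.||_1
```

The two terms are never touched jointly: the smooth term contributes only a gradient, the nonsmooth term only a proximal map, which is the "splitting" the survey is about; ADMM and augmented Lagrangian methods exploit the same separation differently.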

    Coupling Image Restoration and Segmentation: A Generalized Linear Model/Bregman Perspective

    We introduce a new class of data-fitting energies that couple image segmentation with image restoration. These functionals model the image intensity using the statistical framework of generalized linear models. By duality, we establish an information-theoretic interpretation using Bregman divergences. We demonstrate how this formulation couples, in a principled way, image restoration tasks such as denoising, deblurring (deconvolution), and inpainting with segmentation. We present an alternating minimization algorithm to solve the resulting composite photometric/geometric inverse problem. We use Fisher scoring to solve the photometric problem and to provide asymptotic uncertainty estimates. We derive the shape gradient of our data-fitting energy and investigate convex relaxation for the geometric problem. We introduce a new alternating split-Bregman strategy to solve the resulting convex problem and present experiments and comparisons on both synthetic and real-world images.
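
Fisher scoring for a generalized linear model, as used above for the photometric sub-problem, is equivalent to iteratively reweighted least squares. The sketch below runs it for a Poisson GLM with log link on synthetic data as a toy stand-in; the design matrix, sizes, and coefficients are illustrative assumptions, not the paper's imaging model:

```python
import numpy as np

rng = np.random.default_rng(4)

# Fisher scoring (IRLS) for a Poisson GLM with log link.
n, p = 500, 3
X = np.column_stack([np.ones(n), rng.standard_normal((n, p - 1))])
beta_true = np.array([1.0, 0.5, -0.3])
y = rng.poisson(np.exp(X @ beta_true)).astype(float)

beta = np.zeros(p)
for _ in range(25):
    mu = np.exp(X @ beta)                      # mean under the log link
    W = mu                                     # Fisher information weights
    z = X @ beta + (y - mu) / mu               # working response
    beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
```

The inverse of the final weighted normal matrix `X.T @ (W[:, None] * X)` estimates the asymptotic covariance of `beta`, which is how Fisher scoring also yields the uncertainty estimates mentioned in the abstract.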