    An algorithm for hybrid regularizers based image restoration with Poisson noise

    In this paper, a hybrid-regularizer model for Poissonian image restoration is introduced. We study the existence and uniqueness of the minimizer for this model. To solve the resulting minimization problem, we employ the alternating minimization method with a rigorous convergence guarantee. Numerical results demonstrate the efficiency and stability of the proposed method for suppressing Poisson noise.
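
    The paper's hybrid-regularizer functional is not reproduced in this summary, but the alternating minimization strategy it employs can be made concrete on a toy surrogate. The Python sketch below alternates exact block minimization over two variables; the quadratic objective, variable names, and parameter values are illustrative assumptions, not the paper's model.

        import numpy as np

        def alt_min(y, rho=1.0, lam=0.1, n_iter=50):
            # Toy surrogate objective (NOT the paper's hybrid-regularizer model):
            #   J(u, v) = 0.5||u - y||^2 + 0.5*rho*||u - v||^2 + 0.5*lam*||v||^2
            # Each block update below is the exact minimizer of J in that variable.
            u = y.astype(float)
            v = np.zeros_like(u)
            for _ in range(n_iter):
                u = (y + rho * v) / (1.0 + rho)   # minimize J over u, v fixed
                v = rho * u / (rho + lam)         # minimize J over v, u fixed
            return u

        y = np.random.poisson(lam=5.0, size=64)   # Poisson-corrupted observations
        restored = alt_min(y)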

    Generalized Forward-Backward Splitting

    This paper introduces the generalized forward-backward splitting algorithm for minimizing convex functions of the form $F + \sum_{i=1}^n G_i$, where $F$ has a Lipschitz-continuous gradient and the $G_i$'s are simple in the sense that their Moreau proximity operators are easy to compute. While the forward-backward algorithm cannot deal with more than $n = 1$ non-smooth function, our method generalizes it to the case of arbitrary $n$. Our method makes explicit use of the regularity of $F$ in the forward step, and the proximity operators of the $G_i$'s are applied in parallel in the backward step. This allows the generalized forward-backward algorithm to efficiently address an important class of convex problems. We prove its convergence in infinite dimension, and its robustness to errors in the computation of the proximity operators and of the gradient of $F$. Examples on inverse problems in imaging demonstrate the advantage of the proposed method in comparison to other splitting algorithms.
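
    The following Python sketch implements the generalized forward-backward iteration described above; the uniform weights $w_i = 1/n$ and the small example problem are assumptions made here for illustration, not taken from the paper.

        import numpy as np

        def gfb(grad_F, proxes, x0, gamma, lam=1.0, n_iter=300):
            # Generalized forward-backward splitting with uniform weights w_i = 1/n:
            #   z_i <- z_i + lam * ( prox_{(gamma/w_i) G_i}(2x - z_i - gamma*grad_F(x)) - x )
            #   x   <- sum_i w_i z_i
            # proxes[i](v, t) must return prox_{t G_i}(v); the step size gamma
            # should lie in (0, 2/L), with L the Lipschitz constant of grad_F.
            n = len(proxes)
            z = [x0.copy() for _ in range(n)]
            x = x0.copy()
            for _ in range(n_iter):
                g = grad_F(x)
                for i in range(n):
                    z[i] = z[i] + lam * (proxes[i](2*x - z[i] - gamma*g, n*gamma) - x)
                x = sum(z) / n
            return x

        # Example: min 0.5||x - y||^2 + 0.5||x||_1 + indicator(x >= 0), i.e. n = 2.
        soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)
        y = np.array([3.0, -1.0, 0.5])
        x_hat = gfb(grad_F=lambda x: x - y,
                    proxes=[lambda v, t: soft(v, 0.5*t),      # prox of 0.5||.||_1
                            lambda v, t: np.maximum(v, 0.0)], # projection onto x >= 0
                    x0=np.zeros(3), gamma=1.0)
        # Converges to the minimizer [2.5, 0.0, 0.0].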

    Deconvolution under Poisson noise using exact data fidelity and synthesis or analysis sparsity priors

    In this paper, we propose a Bayesian MAP estimator for solving deconvolution problems when the observations are corrupted by Poisson noise. Towards this goal, a proper data fidelity term (log-likelihood) is introduced to reflect the Poisson statistics of the noise. As a prior, the images to be restored are assumed to be positive and sparsely represented in a dictionary of waveforms such as wavelets or curvelets. Both analysis- and synthesis-type sparsity priors are considered. Piecing together the data fidelity and prior terms, the deconvolution problem boils down to the minimization of a non-smooth convex functional (one for each prior). We establish the well-posedness of each optimization problem, characterize the corresponding minimizers, and solve them by means of proximal splitting algorithms originating from the realm of non-smooth convex optimization theory. Experiments demonstrate the potential applicability of the proposed algorithms to astronomical imaging datasets.
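
    The Poisson data fidelity mentioned above has a standard form: up to constants that do not depend on the image, the negative log-likelihood of observations $y \sim \mathrm{Poisson}(Hx)$ under a blur operator $H$ is $f(x) = \sum_i [(Hx)_i - y_i \log (Hx)_i]$. A minimal Python sketch, using a dense matrix $H$ purely for brevity (an assumption for illustration, not the paper's implementation):

        import numpy as np

        def poisson_data_fidelity(x, H, y, eps=1e-12):
            # Negative Poisson log-likelihood, constants in y dropped:
            #   f(x) = sum_i [ (Hx)_i - y_i * log (Hx)_i ],  valid for (Hx)_i > 0.
            hx = H @ x
            return np.sum(hx - y * np.log(np.maximum(hx, eps)))

        rng = np.random.default_rng(0)
        H = np.abs(rng.normal(size=(32, 32)))
        x_true = np.abs(rng.normal(size=32))
        y = rng.poisson(H @ x_true)          # Poisson-corrupted measurements
        print(poisson_data_fidelity(x_true, H, y))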

    Proximity Operators of Discrete Information Divergences

    Information divergences allow one to assess how close two distributions are to each other. Among the large panel of available measures, special attention has been paid to convex $\varphi$-divergences, such as the Kullback-Leibler, Jeffreys-Kullback, Hellinger, Chi-square, Rényi, and $I_\alpha$ divergences. While $\varphi$-divergences have been extensively studied in convex analysis, their use in optimization problems often remains challenging. In this regard, one of the main shortcomings of existing methods is that the minimization of $\varphi$-divergences is usually performed with respect to only one of their arguments, possibly within alternating optimization techniques. In this paper, we overcome this limitation by deriving new closed-form expressions for the proximity operator of such two-variable functions. This makes it possible to employ standard proximal methods to efficiently solve a wide range of convex optimization problems involving $\varphi$-divergences. In addition, we show that these proximity operators are useful for computing the epigraphical projection of several functions of practical interest. The proposed proximal tools are numerically validated in the context of optimal query execution within database management systems, where the problem of selectivity estimation plays a central role. Experiments are carried out on small- to large-scale scenarios.
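
    The paper's closed-form expressions are not reproduced here, but the object they compute can be made concrete. For a two-variable function $f$, the proximity operator is $\mathrm{prox}_{\gamma f}(p) = \arg\min_x f(x) + \|x - p\|^2 / (2\gamma)$. The Python sketch below evaluates it numerically for the generalized Kullback-Leibler term; the brute-force solver and starting point are illustrative assumptions, whereas the paper derives closed forms.

        import numpy as np
        from scipy.optimize import minimize

        def kl(uv):
            # Generalized Kullback-Leibler term phi(u, v) = u*log(u/v) - u + v,
            # jointly convex on (0, inf)^2; +inf outside the domain.
            u, v = uv
            return u * np.log(u / v) - u + v if (u > 0 and v > 0) else np.inf

        def prox_numeric(f, p, gamma=1.0):
            # Generic (numerical) proximity operator:
            #   prox_{gamma f}(p) = argmin_x f(x) + ||x - p||^2 / (2*gamma).
            # Only a stand-in to make the definition concrete; the paper's
            # contribution is precisely to avoid this inner minimization.
            obj = lambda x: f(x) + np.sum((x - p) ** 2) / (2.0 * gamma)
            return minimize(obj, x0=np.maximum(p, 0.5), method="Nelder-Mead").x

        print(prox_numeric(kl, p=np.array([2.0, 0.5])))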

    Variable Splitting as a Key to Efficient Image Reconstruction

    The problem of reconstructing digital images from their degraded measurements has always been of central importance in numerous applications of imaging sciences. In real life, acquired imaging data are typically contaminated by various types of degradation, usually related to imperfections of the image acquisition devices and/or environmental effects. Accordingly, given the degraded measurements of an image of interest, the fundamental goal of image reconstruction is to recover a close approximation of that image, thereby "reversing" the effect of the degradation. Moreover, the massive production and proliferation of digital data across different fields of applied sciences creates the need for methods of image restoration that are both accurate and computationally efficient. Developing such methods, however, has never been a trivial task, as improving the accuracy of image reconstruction is generally achieved at the expense of an elevated computational burden. Accordingly, the main goal of this thesis has been to develop an analytical framework which allows one to tackle a wide scope of image reconstruction problems in a computationally efficient manner. To this end, we generalize the concept of variable splitting as a tool for simplifying complex reconstruction problems by replacing them with a sequence of simpler, and therefore easily solvable, ones. Moreover, we consider two different types of variable splitting and demonstrate their connection to a number of existing approaches currently used to solve various inverse problems. In particular, we refer to the first type of variable splitting as Bregman Type Splitting (BTS) and demonstrate its applicability to the solution of complex reconstruction problems with composite, cross-domain constraints. As specific applications of practical importance, we consider the problem of reconstruction of diffusion MRI signals from sub-critically sampled, incomplete data, as well as the problem of blind deconvolution of medical ultrasound images. Further, we refer to the second type of variable splitting as Fuzzy Clustering Splitting (FCS) and show its application to the problem of image denoising. Specifically, we demonstrate how this splitting technique allows us to generalize the concept of neighbourhood operation and to derive a unifying approach to denoising of imaging data under a variety of noise scenarios.
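
    The thesis's BTS and FCS schemes are not detailed in this abstract. As a generic illustration of variable splitting, the Python sketch below rewrites the lasso problem min_x 0.5||Ax - y||^2 + lam*||x||_1 as a constrained problem over two variables and solves it with scaled-form ADMM; the example problem and parameter values are assumptions for illustration, not the thesis's algorithms.

        import numpy as np

        def admm_lasso(A, y, lam, rho=1.0, n_iter=200):
            # Variable splitting: rewrite  min_x 0.5||Ax - y||^2 + lam*||x||_1
            # as  min_{x,z} 0.5||Ax - y||^2 + lam*||z||_1  s.t.  x = z,
            # then alternate the three standard (scaled-form) ADMM updates.
            n = A.shape[1]
            x = z = u = np.zeros(n)
            Q = np.linalg.inv(A.T @ A + rho * np.eye(n))   # cache the linear solve
            Aty = A.T @ y
            for _ in range(n_iter):
                x = Q @ (Aty + rho * (z - u))                                  # x-step
                z = np.sign(x + u) * np.maximum(np.abs(x + u) - lam / rho, 0)  # z-step
                u = u + x - z                                                  # dual step
            return z

        rng = np.random.default_rng(1)
        A = rng.normal(size=(40, 80))
        y = A[:, :5] @ np.ones(5) + 0.01 * rng.normal(size=40)  # sparse ground truth
        print(np.flatnonzero(np.abs(admm_lasso(A, y, lam=1.0)) > 1e-3))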

    AI for time-resolved imaging: from fluorescence lifetime to single-pixel time of flight

    Time-resolved imaging is a field of optics which measures the arrival time of light at the camera. This thesis looks at two time-resolved imaging modalities: fluorescence lifetime imaging and time-of-flight measurement for depth imaging and ranging. Both of these applications require temporal accuracy on the order of pico- to nanoseconds ($10^{-12}$ to $10^{-9}$ s). This demands special camera technology and optics that can sample light intensity extremely quickly, much faster than an ordinary video camera. However, such detectors can be very expensive compared to regular cameras while offering lower image quality. Further, the information of interest is often hidden (encoded) in the raw temporal data. Therefore, computational imaging algorithms are used to enhance, analyse and extract information from time-resolved images. "A picture is worth a thousand words." This describes a fundamental blessing and curse of image analysis: images contain extreme amounts of data. Consequently, it is very difficult to design algorithms that encompass all the possible pixel permutations and combinations that can encode this information. Fortunately, the rise of AI and machine learning (ML) allows us to instead create algorithms in a data-driven way. This thesis demonstrates the application of ML to time-resolved imaging tasks, ranging from parameter estimation in noisy data and decoding of overlapping information, through super-resolution, to inferring 3D information from 1D (temporal) data.
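
    As a point of reference for the parameter-estimation task mentioned above, the classical per-pixel baseline that data-driven methods aim to improve on is a least-squares fit of a mono-exponential decay. A minimal Python sketch, where the decay model, acquisition window, and noise level are illustrative assumptions rather than the thesis's setup:

        import numpy as np
        from scipy.optimize import curve_fit

        def decay(t, a, tau, b):
            # Mono-exponential fluorescence decay with amplitude a,
            # lifetime tau, and constant background b.
            return a * np.exp(-t / tau) + b

        t = np.linspace(0.0, 20e-9, 256)                  # 20 ns acquisition window
        truth = decay(t, a=1000.0, tau=3e-9, b=5.0)
        counts = np.random.default_rng(2).poisson(truth)  # photon (shot) noise
        popt, _ = curve_fit(decay, t, counts, p0=(500.0, 1e-9, 0.0))
        print(f"estimated lifetime: {popt[1]:.2e} s")     # close to 3e-9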

    Radio Astronomy Image Reconstruction in the Big Data Era

    Next-generation radio interferometric telescopes pave the way for the future of radio astronomy, with extremely wide fields of view and precision polarimetry not possible at other wavelengths, but at the cost of more demanding image reconstruction. These instruments will be used to map large-scale Galactic and extragalactic structures at higher resolution and fidelity than ever before. However, radio astronomy has entered the era of big data, where the sheer volume of data limits the expected sensitivity and fidelity of the instruments. New image reconstruction methods are critical to meet the data requirements needed to obtain new scientific discoveries in radio astronomy. To meet this need, this work takes traditional radio astronomical imaging and introduces new state-of-the-art image reconstruction frameworks built on sparse image reconstruction algorithms. The software package PURIFY, developed in this work, uses convex optimization algorithms (i.e. the alternating direction method of multipliers) to solve for the reconstructed image. We design, implement, and apply distributed radio interferometric image reconstruction methods for the message passing interface (MPI), showing that PURIFY scales to big-data image reconstruction on computing clusters. We design a distributed wide-field imaging algorithm for non-coplanar arrays, while providing new theoretical insights for wide-field imaging. It is shown that PURIFY’s methods provide higher dynamic range than traditional image reconstruction methods, providing a more accurate and detailed sky model for real observations. This sets the stage for state-of-the-art image reconstruction methods to be distributed and applied to next-generation interferometric telescopes, where they can be used to meet big data challenges and to make new scientific discoveries in radio astronomy and astrophysics.
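
    The abstract does not state the optimization problem explicitly; as a sketch, a common constrained formulation from the sparse radio-interferometric imaging literature, of the kind that ADMM-based solvers such as PURIFY address, is

        \min_{x \ge 0} \; \| \Psi^\dagger x \|_1 \quad \text{subject to} \quad \| y - \Phi x \|_2 \le \epsilon,

    where $y$ collects the measured visibilities, $\Phi$ is the measurement operator mapping the sky image $x$ to visibilities, $\Psi$ is a sparsifying dictionary, and $\epsilon$ is set from the noise level. The exact operators and formulation used by PURIFY may differ in detail.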