    Deconvolution under Poisson noise using exact data fidelity and synthesis or analysis sparsity priors

    In this paper, we propose a Bayesian MAP estimator for solving deconvolution problems when the observations are corrupted by Poisson noise. Towards this goal, a proper data fidelity term (log-likelihood) is introduced to reflect the Poisson statistics of the noise. As a prior, the images to restore are assumed to be positive and sparsely represented in a dictionary of waveforms such as wavelets or curvelets. Both analysis- and synthesis-type sparsity priors are considered. Piecing together the data fidelity and prior terms, the deconvolution problem boils down to the minimization of a non-smooth convex functional (one for each prior). We establish the well-posedness of each optimization problem, characterize the corresponding minimizers, and solve them by means of proximal splitting algorithms originating from the realm of non-smooth convex optimization theory. Experiments demonstrate the potential applicability of the proposed algorithms to astronomical imaging datasets.
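
    As an illustration only, the sketch below runs a forward-backward (ISTA-style) iteration on a synthesis-sparsity formulation: the gradient of the Poisson negative log-likelihood is back-projected through the blur, the intermediate image is expressed in an orthogonal wavelet basis, and the detail coefficients are soft-thresholded. The wavelet family, step size, threshold and background level are hypothetical choices; the paper's actual proximal splitting algorithms (and the curvelet/analysis variants) are not reproduced here.

        import numpy as np
        import pywt
        from scipy.signal import fftconvolve

        def poisson_deconv_synthesis(y, psf, wavelet="db4", lam=0.05, bkg=1e-2,
                                     step=0.5, n_iter=200):
            """ISTA-style sketch: Poisson data fidelity + l1 synthesis-sparsity prior."""
            H = lambda x: fftconvolve(x, psf, mode="same")               # blur operator
            Ht = lambda x: fftconvolve(x, psf[::-1, ::-1], mode="same")  # its adjoint

            x = np.maximum(y.astype(float), bkg)         # positive initial estimate
            for _ in range(n_iter):
                # gradient of the Poisson negative log-likelihood (Hx + b) - y*log(Hx + b)
                grad = Ht(1.0 - y / (H(x) + bkg))
                # forward step in image space, then move to wavelet coefficients
                coeffs = pywt.wavedec2(x - step * grad, wavelet)
                # backward (prox) step: soft-threshold the detail coefficients
                coeffs = [coeffs[0]] + [
                    tuple(pywt.threshold(d, step * lam, mode="soft") for d in level)
                    for level in coeffs[1:]
                ]
                x = pywt.waverec2(coeffs, wavelet)[: y.shape[0], : y.shape[1]]
                x = np.maximum(x, 0.0)                   # enforce positivity
            return x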

    Wavelets, ridgelets and curvelets on the sphere

    We present in this paper new multiscale transforms on the sphere, namely the isotropic undecimated wavelet transform, the pyramidal wavelet transform, the ridgelet transform and the curvelet transform. All of these transforms can be inverted, i.e., the original data can be exactly reconstructed from its coefficients in either representation. Several applications are described. We show how these transforms can be used in denoising, and especially in a Combined Filtering Method that uses both the wavelet and the curvelet transforms, thus benefiting from the advantages of both. An application to component separation from multichannel data mapped to the sphere is also described, in which we take advantage of moving to a wavelet representation. (Accepted for publication in A&A. The manuscript with all figures can be downloaded at http://jstarck.free.fr/aa_sphere05.pd)
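
    For intuition, a planar analogue of the isotropic undecimated wavelet transform (the classical à trous/starlet scheme with a B3-spline kernel) is sketched below. The spherical transforms of the paper operate on HEALPix-type pixelizations and are not reproduced here; the kernel and the number of scales are the usual defaults, and the function names are ours.

        import numpy as np
        from scipy.ndimage import convolve1d

        # B3-spline kernel used by the classical starlet (a trous) transform
        B3 = np.array([1, 4, 6, 4, 1], dtype=float) / 16.0

        def starlet_transform(image, n_scales=4):
            """Planar isotropic undecimated wavelet transform (sketch).
            Returns [w_1, ..., w_J, c_J] with image == sum(w_j) + c_J."""
            c = image.astype(float)
            coeffs = []
            for j in range(n_scales):
                # dilate the kernel by inserting 2**j - 1 zeros between taps ("a trous")
                k = np.zeros(4 * 2**j + 1)
                k[:: 2**j] = B3
                smooth = convolve1d(convolve1d(c, k, axis=0, mode="reflect"),
                                    k, axis=1, mode="reflect")
                coeffs.append(c - smooth)      # wavelet (detail) band at scale j
                c = smooth
            coeffs.append(c)                   # coarsest approximation
            return coeffs

        def starlet_reconstruct(coeffs):
            return np.sum(coeffs, axis=0)      # exact reconstruction by summation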

    Sparse image reconstruction for molecular imaging

    The application that motivates this paper is molecular imaging at the atomic level. When discretized at sub-atomic distances, the volume is inherently sparse. Noiseless measurements from an imaging technology can be modeled by convolution of the image with the system point spread function (psf). Such is the case with magnetic resonance force microscopy (MRFM), an emerging technology where imaging of an individual tobacco mosaic virus was recently demonstrated with nanometer resolution. We also consider additive white Gaussian noise (AWGN) in the measurements. Many prior works on sparse estimators have focused on the case where the system matrix H has low coherence; in our application, however, H is the convolution matrix of the system psf, and a typical convolution matrix has high coherence. The paper therefore does not assume a low-coherence H. A discrete-continuous form of the Laplacian and atom at zero (LAZE) p.d.f. used by Johnstone and Silverman is formulated, and two sparse estimators are derived by maximizing the joint p.d.f. of the observation and image conditioned on the hyperparameters. A thresholding rule that generalizes the hard and soft thresholding rules appears in the course of the derivation. This so-called hybrid thresholding rule, when used in the iterative thresholding framework, gives rise to the hybrid estimator, a generalization of the lasso. Unbiased estimates of the hyperparameters for the lasso and the hybrid estimator are obtained via Stein's unbiased risk estimate (SURE). A numerical study with a Gaussian psf and two sparse images shows that the hybrid estimator outperforms the lasso. (12 pages, 8 figures)
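
    The hybrid rule itself is not spelled out in the abstract, so the sketch below only shows, for reference, the two rules it generalizes (hard and soft thresholding) together with the classical closed-form SURE used to select a soft threshold in the unit-variance Gaussian denoising setting; the convolution matrix H, the LAZE prior and the paper's iterative framework are omitted.

        import numpy as np

        def soft_threshold(y, t):
            return np.sign(y) * np.maximum(np.abs(y) - t, 0.0)

        def hard_threshold(y, t):
            return y * (np.abs(y) > t)

        def sure_soft(y, t):
            """Stein's unbiased risk estimate of the MSE of soft thresholding,
            assuming y = x + n with unit-variance white Gaussian noise n."""
            n = y.size
            return (n - 2.0 * np.sum(np.abs(y) <= t)
                    + np.sum(np.minimum(np.abs(y), t) ** 2))

        def sure_best_threshold(y, grid=None):
            """Pick the soft threshold minimizing SURE over a candidate grid."""
            if grid is None:
                grid = np.sort(np.abs(y))
            risks = np.array([sure_soft(y, t) for t in grid])
            return grid[np.argmin(risks)]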

    Learning Wavefront Coding for Extended Depth of Field Imaging

    Depth of field is an important characteristic of imaging systems that strongly affects the quality of the acquired spatial information. Extended depth of field (EDoF) imaging is a challenging ill-posed problem and has been extensively addressed in the literature. We propose a computational imaging approach for EDoF in which we employ wavefront coding via a diffractive optical element (DOE) and achieve deblurring through a convolutional neural network. Thanks to the end-to-end differentiable modeling of optical image formation and computational post-processing, we jointly optimize the optical design, i.e., the DOE, and the deblurring through standard gradient descent methods. Based on the properties of the underlying refractive lens and the desired EDoF range, we provide an analytical expression for the search space of the DOE, which is instrumental in the convergence of the end-to-end network. We achieve superior EDoF imaging performance compared to the state of the art, demonstrating results with minimal artifacts in various scenarios, including deep 3D scenes and broadband imaging.
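
    A heavily simplified sketch of such an end-to-end pipeline is given below: a learnable DOE height map produces a PSF through a single-wavelength Fraunhofer propagation, the PSF blurs the image, and a small CNN deblurs it, with the height map and the network weights optimized jointly. The class, the architecture and all constants are illustrative assumptions, not the authors' design or their analytical DOE parameterization.

        import torch
        import torch.nn as nn
        import torch.fft as fft

        class WavefrontCodingEDoF(nn.Module):
            """Toy end-to-end model: learnable DOE height map -> PSF -> blur -> CNN deblur."""
            def __init__(self, n=64, wavelength=550e-9, refr_index=1.5):
                super().__init__()
                self.height = nn.Parameter(torch.zeros(n, n))    # DOE surface profile (learned)
                self.k_dn = 2 * torch.pi / wavelength * (refr_index - 1.0)
                self.cnn = nn.Sequential(                        # tiny deblurring network
                    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
                    nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
                    nn.Conv2d(16, 1, 3, padding=1),
                )

            def psf(self):
                phase = self.k_dn * self.height                  # phase delay of the DOE
                field = fft.fft2(torch.exp(1j * phase))          # Fraunhofer far field
                psf = field.abs() ** 2
                return psf / psf.sum()

            def forward(self, img):                              # img: (B, 1, n, n)
                otf = fft.fft2(self.psf())                       # optical transfer function
                blurred = fft.ifft2(fft.fft2(img) * otf).real    # circular convolution
                return self.cnn(blurred)

        # joint optimization of the optic (height map) and the CNN weights
        model = WavefrontCodingEDoF()
        opt = torch.optim.Adam(model.parameters(), lr=1e-3)
        sharp = torch.rand(4, 1, 64, 64)                         # stand-in training batch
        for _ in range(100):
            loss = nn.functional.mse_loss(model(sharp), sharp)
            opt.zero_grad(); loss.backward(); opt.step()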

    A hybrid algorithm for spatial and wavelet domain image restoration

    The recent ForWaRD algorithm, based on two steps, (i) Fourier-domain deblurring and (ii) wavelet-domain denoising, shows better restoration results than traditional image restoration methods. In this paper, we study other deblurring schemes within ForWaRD and demonstrate that such a two-step approach is effective for image restoration. (SPIE Conference on Visual Communications and Image Processing 2005, Beijing, China, 12-15 July 2005. In Proceedings of SPIE - The International Society for Optical Engineering, 2005, v. 5960, n. 4, p. 59605V-1 - 59605V-)
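
    A stripped-down version of the two-step idea (a regularized Fourier-domain inverse followed by wavelet shrinkage) can look like the sketch below; the regularization constant, wavelet and threshold are placeholders, and ForWaRD's actual per-subband, Wiener-type shrinkage in both domains is not reproduced.

        import numpy as np
        import pywt

        def fourier_wavelet_restore(y, psf, noise_sigma, reg=1e-2, wavelet="sym8"):
            """Two-step restoration sketch: (i) regularized Fourier deblurring,
            (ii) wavelet-domain soft-threshold denoising of the amplified noise."""
            # (i) Fourier-domain deblurring with a Tikhonov-regularized inverse filter
            H = np.fft.fft2(psf, s=y.shape)
            x_tilde = np.real(np.fft.ifft2(np.fft.fft2(y) * np.conj(H) /
                                           (np.abs(H) ** 2 + reg)))
            # (ii) wavelet-domain denoising of the colored, amplified noise
            coeffs = pywt.wavedec2(x_tilde, wavelet, level=4)
            thr = 3.0 * noise_sigma          # placeholder; ForWaRD tunes this per subband
            den = [coeffs[0]] + [
                tuple(pywt.threshold(d, thr, mode="soft") for d in level)
                for level in coeffs[1:]
            ]
            return pywt.waverec2(den, wavelet)[: y.shape[0], : y.shape[1]]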

    A SURE Approach for Digital Signal/Image Deconvolution Problems

    In this paper, we are interested in the classical problem of restoring data degraded by a convolution and the addition of white Gaussian noise. The originality of the proposed approach is two-fold. Firstly, we formulate the restoration problem as a nonlinear estimation problem leading to the minimization of a criterion derived from Stein's unbiased quadratic risk estimate. Secondly, the deconvolution procedure is performed using any analysis or synthesis frames, overcomplete or not. New theoretical results concerning the calculation of the variance of Stein's risk estimate are also provided in this work. Simulations carried out on natural images show the good performance of our method with respect to conventional wavelet-based restoration methods.
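
    As background, Stein's unbiased risk estimate for additive white Gaussian noise can be evaluated for an arbitrary (weakly differentiable) estimator f by approximating its divergence with a single random perturbation; a sketch of that generic recipe follows. It is not the paper's criterion, which additionally accounts for the convolution operator and the analysis/synthesis frames.

        import numpy as np

        def monte_carlo_sure(f, y, sigma, eps=1e-3, rng=None):
            """Estimate the MSE risk of estimator f at observation y = x + n,
            n ~ N(0, sigma^2 I), via SURE with a Monte Carlo divergence term:
                SURE = ||f(y) - y||^2 - N*sigma^2 + 2*sigma^2 * div f(y).
            """
            rng = np.random.default_rng(rng)
            n = y.size
            fy = f(y)
            b = rng.standard_normal(y.shape)                 # random probe direction
            div = np.vdot(b, (f(y + eps * b) - fy)) / eps    # finite-difference divergence
            return np.sum((fy - y) ** 2) - n * sigma**2 + 2.0 * sigma**2 * div

    Such an estimate can be used, for instance, to choose the regularization weight of a black-box restoration routine by minimizing the returned value over a grid, without access to the ground-truth image.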

    Enrichment of Turbulence Field Using Wavelets

    This thesis is composed of two parts. The first part presents a new turbulence generation method based on stochastic wavelets and tests various properties of the generated turbulence field in both the homogeneous and inhomogeneous cases. Numerical results indicate that turbulence fields can be generated with far fewer basis functions than synthetic Fourier methods require, while maintaining comparable accuracy. Adaptive generation of inhomogeneous turbulence is achieved by a scale-reduction algorithm, which greatly reduces the computational cost and introduces practically no error. The generating formula proposed in this research can be adjusted to generate fully inhomogeneous and anisotropic turbulence from given RANS data under a divergence-free constraint, which was not previously achieved in similar research. Numerical examples show that the generated homogeneous and inhomogeneous turbulence fields are in good agreement with the input data and theoretical results. The second part presents a framework for solving turbulence deconvolution problems using optimization techniques on Riemannian manifolds. A filtered velocity field is deconvolved without any information about the filter. The deconvolution results show high accuracy compared with the original velocity field. The computational cost of the optimization problem is greatly reduced by using a wavelet representation while still maintaining high accuracy. The use of divergence-free wavelets ensures the incompressibility of the deconvolution results, which was rarely achieved in previous research.
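
    As a toy illustration of the first part's idea, the sketch below draws independent wavelet coefficients whose per-scale variance follows an approximate Kolmogorov k**(-5/3) scaling and reconstructs a 1-D signal. The scaling relation used here is a textbook approximation, not the thesis' generating formula, and the divergence-free, inhomogeneous 3-D construction and the Riemannian deconvolution framework are not reproduced.

        import numpy as np
        import pywt

        def synth_turbulence_1d(n=16384, levels=8, wavelet="db4", alpha=5.0/3.0, rng=None):
            """Draw independent wavelet coefficients with scale-dependent variance
            chosen so the synthesized signal has an approximate k**(-alpha) spectrum:
            Var(detail at level l) ~ 2**(alpha * l), coarser levels carrying more energy."""
            rng = np.random.default_rng(rng)
            # a template decomposition only provides the correct coefficient shapes
            template = pywt.wavedec(np.zeros(n), wavelet, level=levels)
            coeffs = [np.zeros_like(template[0])]            # zero-mean large scales
            for lev, tpl in zip(range(levels, 0, -1), template[1:]):
                std = 2.0 ** (alpha * lev / 2.0)             # per-scale standard deviation
                coeffs.append(std * rng.standard_normal(tpl.shape))
            return pywt.waverec(coeffs, wavelet)[:n]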