Multiplicative Noise Removal Using L1 Fidelity on Frame Coefficients
We address the denoising of images contaminated with multiplicative noise,
e.g. speckle noise. Classical ways to solve such problems are filtering;
statistical (Bayesian) methods; variational methods; and methods that convert
the multiplicative noise into additive noise (using a logarithmic function),
shrink the coefficients of the log-image data in a wavelet basis or in a
frame, and transform the result back using an exponential function. We propose
a method composed of several stages: we use the log-image data and apply a
reasonable under-optimal hard-thresholding to its curvelet transform; we then
apply a variational method that minimizes a specialized criterion composed
of an L1 data-fitting term on the thresholded coefficients and a Total
Variation (TV) regularization term in the image domain; the restored image is
the exponential of the obtained minimizer, weighted so that the mean of
the original image is preserved. Our restored images combine the advantages of
shrinkage and variational methods while avoiding their main drawbacks. For the
minimization stage, we propose a properly adapted fast minimization scheme
based on Douglas-Rachford splitting. Having proven the existence of a minimizer
of our specialized criterion, we demonstrate the convergence of the
minimization scheme. The obtained numerical results outperform the main
alternative methods.
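The stages described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's method: a 2-D FFT stands in for the curvelet transform, the threshold rule is a simple fraction of the largest coefficient magnitude, and the intermediate L1/TV minimization stage is omitted. All function and parameter names are made up for the sketch.

```python
import numpy as np

def denoise_multiplicative(img, keep_frac=0.5):
    # Stage 1: a log transform turns multiplicative noise into additive noise.
    log_img = np.log(img)

    # Stage 2: hard-threshold the transform coefficients of the log-image.
    # The paper thresholds a curvelet transform; a 2-D FFT stands in here.
    coeffs = np.fft.fft2(log_img)
    mask = np.abs(coeffs) >= keep_frac * np.abs(coeffs).max()
    u = np.fft.ifft2(coeffs * mask).real

    # (The paper then refines u by minimizing an L1 data-fit to the
    # thresholded coefficients plus a TV term; that stage is omitted here.)

    # Stage 3: exponentiate, then rescale so the mean of the input image
    # is preserved, as described in the abstract.
    restored = np.exp(u)
    restored *= img.mean() / restored.mean()
    return restored

# Toy data: a constant image corrupted by mean-one gamma (speckle-like) noise.
rng = np.random.default_rng(0)
clean = np.full((32, 32), 2.0)
noisy = clean * rng.gamma(shape=10.0, scale=0.1, size=clean.shape)
out = denoise_multiplicative(noisy)
```

By construction the final rescaling makes the restored image share the mean of the input, which is the mean-preservation property the abstract refers to.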
Bregman Cost for Non-Gaussian Noise
One of the tasks in Bayesian inverse problems is to find a good point estimate
based on the posterior probability density. The most common point estimators
are the conditional mean (CM) and maximum a posteriori (MAP) estimates, which
correspond to the mean and the mode of the posterior, respectively. From a
theoretical point of view it has been argued that the MAP estimate is a Bayes
estimator for the uniform cost function only in an asymptotic sense, while the
CM estimate is a Bayes estimator for the mean squared cost function. Recently,
it has been proven that the MAP estimate is a proper Bayes estimator for the
Bregman cost if the image is corrupted by Gaussian noise. In this work we
extend this result to other noise models with log-concave likelihood density,
by introducing two related Bregman cost functions for which the CM and the MAP
estimates are proper Bayes estimators. Moreover, we also prove that the CM
estimate outperforms the MAP estimate when the error is measured in a certain
Bregman distance, a result previously unknown even in the case of additive
Gaussian noise.
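The Bregman distance central to this abstract, D_J(u, v) = J(u) - J(v) - ⟨∇J(v), u - v⟩ for a convex functional J, can be illustrated with a short sketch. This is a generic textbook definition, not the paper's specific cost functions; the names are invented for the example.

```python
import numpy as np

def bregman_distance(J, grad_J, u, v):
    # D_J(u, v) = J(u) - J(v) - <grad J(v), u - v>
    # Nonnegative whenever J is convex and differentiable at v.
    return J(u) - J(v) - np.dot(grad_J(v), u - v)

# For J(x) = ||x||^2 / 2, the Bregman distance reduces to ||u - v||^2 / 2,
# i.e. the squared-error cost associated with the CM estimate.
J = lambda x: 0.5 * np.dot(x, x)
grad_J = lambda x: x

u = np.array([1.0, 2.0])
v = np.array([0.0, 1.0])
d = bregman_distance(J, grad_J, u, v)  # 0.5 * ||u - v||^2 = 1.0
```

The quadratic case above recovers the mean squared cost; choosing other convex J (as the paper does) yields the asymmetric cost functions for which CM and MAP are proper Bayes estimators.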
Efficient parallelization on GPU of an image smoothing method based on a variational model
Medical imaging is fundamental for improvements in diagnostic accuracy. However, noise frequently corrupts the acquired images, which can lead to erroneous diagnoses. Fortunately, image preprocessing algorithms can enhance corrupted images, particularly through noise smoothing and removal. In the medical field, time is always a critical factor, so there is a need for implementations that are fast and, where possible, real-time. This study presents and discusses an implementation of a highly efficient algorithm for image noise smoothing based on general-purpose computing on graphics processing units (GPGPU) techniques. The use of these techniques facilitates the quick and efficient smoothing of images corrupted by noise, even on large-dimensional data sets. This is particularly relevant since GPU cards are becoming more affordable, powerful and common in medical environments.