Iterative algorithms based on decoupling of deblurring and denoising for image restoration
In this paper, we propose iterative algorithms for solving image restoration problems. The algorithms are based on decoupling the deblurring and denoising steps of the restoration process. In the deblurring step, an efficient deblurring method using fast transforms can be employed. In the denoising step, effective methods such as wavelet shrinkage or total variation denoising can be used. The main advantage of this proposal is that the resulting algorithms can be very efficient and can produce restored images of better visual quality and signal-to-noise ratio than those obtained by restoration methods combining a data-fitting term and a regularization term. The convergence of the proposed algorithms is shown in the paper. Numerical examples are also given to demonstrate the effectiveness of these algorithms. © 2008 Society for Industrial and Applied Mathematics.
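The decoupled structure described above can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the blur is assumed periodic (so the FFT diagonalizes it), the coupling parameter `mu` is a hypothetical choice, and a crude neighbor-averaging denoiser stands in for the wavelet-shrinkage or TV denoisers the paper actually uses.

```python
import numpy as np

def gaussian_blur_fft(shape, sigma=2.0):
    """Fourier multiplier of a periodic Gaussian blur (assumes periodic
    boundary conditions, so the blur is diagonalized by the 2-D FFT)."""
    h, w = shape
    y = np.minimum(np.arange(h), h - np.arange(h))
    x = np.minimum(np.arange(w), w - np.arange(w))
    k = np.exp(-(y[:, None] ** 2 + x[None, :] ** 2) / (2.0 * sigma ** 2))
    return np.fft.fft2(k / k.sum())

def neighbor_average(u):
    """Crude plug-in denoiser (periodic 4-neighbor mean); the paper's
    wavelet-shrinkage or TV denoisers would be substituted here."""
    return 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0)
                   + np.roll(u, 1, 1) + np.roll(u, -1, 1))

def decoupled_restore(f, H, mu=0.01, iters=10):
    """Alternate a fast-transform deblurring step with a denoising step;
    mu couples the two subproblems (hypothetical parameter values)."""
    F = np.fft.fft2(f)
    v = f.copy()
    for _ in range(iters):
        # Deblurring step: minimize ||Hu - f||^2 + mu*||u - v||^2, solved
        # exactly in the Fourier domain because H is diagonal there.
        U = (np.conj(H) * F + mu * np.fft.fft2(v)) / (np.abs(H) ** 2 + mu)
        u = np.real(np.fft.ifft2(U))
        # Denoising step applied to the deblurred estimate.
        v = neighbor_average(u)
    return u
```

The key efficiency point from the abstract is visible in the deblurring step: because the blur is diagonal in the Fourier domain, the quadratic subproblem is solved exactly with two FFTs rather than an iterative linear solve.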
Inexact Bregman iteration with an application to Poisson data reconstruction
This work deals with the solution of image restoration problems by an iterative regularization method based on the Bregman iteration. Each iteration of this scheme requires the exact computation of the minimizer of a function. However, in some image reconstruction applications it is either impossible or extremely expensive to obtain exact solutions of these subproblems. In this paper, we propose an inexact version of the iterative procedure, in which the inexactness of the inner subproblem solution is controlled by a criterion that preserves the convergence of the Bregman iteration and its features in image restoration problems. In particular, the method yields accurate reconstructions even when only an overestimate of the regularization parameter is known. Introducing inexactness into the iterative scheme makes it possible to address image reconstruction problems with data corrupted by Poisson noise, exploiting recent advances in specialized algorithms for the numerical minimization of the generalized Kullback–Leibler divergence combined with a regularization term. The results of several numerical experiments enable us to evaluate
A Total Fractional-Order Variation Model for Image Restoration with Non-homogeneous Boundary Conditions and its Numerical Solution
To overcome the weaknesses of total variation based models for image restoration, various high-order (typically second-order) regularization models have been proposed and studied recently. In this paper we analyze and test a fractional-order derivative based total α-order variation model, which can outperform the currently popular high-order regularization models. Several previous works use total α-order variations for image restoration; however, first, no analysis has yet been carried out, and second, all tested formulations, which differ from one another, use zero Dirichlet boundary conditions, which are not realistic (while non-zero boundary conditions violate the definitions of fractional-order derivatives). This paper first reviews some results on fractional-order derivatives and then rigorously analyzes the theoretical properties of the proposed total α-order variational model. It then develops four algorithms for solving the variational problem: one based on the variational Split-Bregman idea and three based on direct solution of the discretized optimization problem. Numerical experiments show that, in terms of restoration quality and solution efficiency, the proposed model can produce highly competitive results, for smooth images, compared with two established high-order models: the mean curvature and the total generalized variation.
Comment: 26 page
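For concreteness, discrete fractional-order derivatives in such models are commonly built from Grünwald-Letnikov coefficients. The sketch below is a generic 1-D illustration, not the paper's formulation; it also makes the boundary-condition issue visible, since samples outside the signal are implicitly taken as zero (precisely the zero Dirichlet assumption the abstract criticizes).

```python
import numpy as np

def gl_coeffs(alpha, n):
    """Gruenwald-Letnikov coefficients (-1)^j * C(alpha, j), generated by
    the recursion c_0 = 1, c_j = c_{j-1} * (j - 1 - alpha) / j."""
    c = np.empty(n)
    c[0] = 1.0
    for j in range(1, n):
        c[j] = c[j - 1] * (j - 1 - alpha) / j
    return c

def frac_diff(u, alpha):
    """Discrete fractional-order derivative of a 1-D signal. Values left of
    the signal are implicitly zero (the zero Dirichlet boundary condition)."""
    n = len(u)
    c = gl_coeffs(alpha, n)
    # Causal convolution: (D^alpha u)_i = sum_j c_j * u_{i-j}.
    return np.array([np.dot(c[:i + 1], u[i::-1]) for i in range(n)])
```

For alpha = 1 the coefficients reduce to [1, -1, 0, ...], recovering the ordinary first difference, while non-integer alpha gives a slowly decaying, nonlocal stencil; that nonlocality is why the boundary treatment matters so much more than in the integer-order case.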
An Iterative Shrinkage Approach to Total-Variation Image Restoration
The problem of restoration of digital images from their degraded measurements
plays a central role in a multitude of practically important applications. A
particularly challenging instance of this problem occurs in the case when the
degradation phenomenon is modeled by an ill-conditioned operator. In such a
case, the presence of noise makes it impossible to recover a valuable
approximation of the image of interest without using some a priori information
about its properties. Such a priori information is essential for image
restoration, rendering it stable and robust to noise. Particularly, if the
original image is known to be a piecewise smooth function, one of the standard
priors used in this case is defined by the Rudin-Osher-Fatemi model, which
results in total variation (TV) based image restoration. The current arsenal of
algorithms for TV-based image restoration is vast. In the present paper, a
different approach to the solution of the problem is proposed based on the
method of iterative shrinkage (aka iterated thresholding). In the proposed
method, the TV-based image restoration is performed through a recursive
application of two simple procedures, viz. linear filtering and soft
thresholding. Therefore, the method can be identified as belonging to the group
of first-order algorithms which are efficient in dealing with images of
relatively large sizes. Another valuable feature of the proposed method
is that it works directly with the TV functional, rather than with
smoothed versions of it. Moreover, the method provides a single solution for both
isotropic and anisotropic definitions of the TV functional, thereby
establishing a useful connection between the two formulae.
Comment: The paper was submitted to the IEEE Transactions on Image Processing on October 22nd, 200
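The two-step pattern the abstract describes, linear filtering followed by soft thresholding, is the generic iterative-shrinkage (ISTA) template. The sketch below shows that template for a plain l1 penalty, not the paper's adaptation of it to the TV functional, and the parameter values are illustrative.

```python
import numpy as np

def soft(x, t):
    """Soft thresholding: the proximal map of the l1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, f, lam, step, iters=200):
    """Generic iterative shrinkage for  min_u 0.5*||A u - f||^2 + lam*||u||_1:
    each pass is a linear-filtering (gradient) step followed by soft
    thresholding, the same two primitives the paper's TV method recurses on."""
    u = np.zeros(A.shape[1])
    for _ in range(iters):
        u = soft(u - step * A.T @ (A @ u - f), step * lam)
    return u
```

Because each iteration costs only a matrix-vector product and an elementwise shrinkage, the scheme is first-order in the sense the abstract uses: its per-iteration cost scales gently with image size.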
- …