Fast Krasnosel'skii-Mann algorithm with a convergence rate of the fixed point iteration of $o(1/k)$
The Krasnosel'skii-Mann (KM) algorithm is the most fundamental iterative
scheme designed to find a fixed point of an averaged operator in the framework
of a real Hilbert space, since it lies at the heart of various numerical
algorithms for solving monotone inclusions and convex optimization problems. We
enhance the Krasnosel'skii-Mann algorithm with Nesterov's momentum updates and
show that the resulting numerical method exhibits a convergence rate of
$o(1/k)$ for the fixed point residual, while preserving the weak convergence of the
iterates to a fixed point of the operator. Numerical experiments illustrate the
superiority of the resulting so-called Fast KM algorithm over various fixed
point iterative schemes, and also its oscillatory behavior, which is
characteristic of Nesterov's momentum-based optimization algorithms.
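The accelerated iteration can be sketched as follows; the momentum schedule $(k-1)/(k+2)$ and the relaxation parameter $1/2$ below are illustrative assumptions, not necessarily the choices analyzed in the paper.

```python
import numpy as np

def fast_km(T, x0, n_iters=500):
    """Krasnosel'skii-Mann iteration with Nesterov-style momentum (sketch).

    T  : an averaged (hence nonexpansive) operator on a Hilbert space
    x0 : starting point
    Returns an approximate fixed point of T.
    """
    x_prev = x0.copy()
    x = x0.copy()
    for k in range(1, n_iters + 1):
        beta = (k - 1) / (k + 2)             # assumed Nesterov-type momentum schedule
        y = x + beta * (x - x_prev)          # extrapolation (momentum) step
        x_prev, x = x, y + 0.5 * (T(y) - y)  # KM step with relaxation 1/2
    return x

# Toy averaged operator: T(x) = 0.5*x + b has the unique fixed point 2*b.
b = np.array([1.0, -2.0])
x_star = fast_km(lambda x: 0.5 * x + b, np.zeros(2))
```

The quantity `np.linalg.norm(x - T(x))` is the fixed point residual whose decay rate the acceleration improves.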
Application of Tikhonov Regularized Methods to Image Deblurring Problem
We consider monotone inclusion problems in real Hilbert spaces. Proximal
splitting algorithms are a very popular technique for solving them and
generally achieve weak convergence under mild assumptions. To prove strong
convergence of such algorithms, researchers typically impose strong
conditions, such as strong convexity or strong monotonicity, on the operators
involved. The Mann iteration method and the normal S-iteration method are
popular methods for solving fixed point problems. We propose a new common
fixed point algorithm, based on the normal S-iteration method with Tikhonov
regularization, to find a common fixed point of nonexpansive operators, and we
prove strong convergence of the generated sequence to the set of common fixed
points without assuming strong convexity
and strong monotonicity. Based on the proposed fixed point algorithm, we
propose a forward-backward-type algorithm and a Douglas-Rachford algorithm in
connection with Tikhonov regularization to find the solution of monotone
inclusion problems. Further, we consider structured monotone inclusion
problems, which have attracted considerable recent attention, and propose a
strongly convergent forward-backward-type primal-dual algorithm and a
Douglas-Rachford-type primal-dual algorithm to solve them. Finally, we conduct
numerical experiments on image deblurring problems.
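As an illustration of the Tikhonov-regularized forward-backward idea, here is a minimal sketch for the inclusion $0 \in \nabla f(x) + \partial g(x)$; the schedule $\varepsilon_n = 1/(n+1)$ and the anchoring toward the origin are assumptions for illustration and may differ from the scheme analyzed in the work.

```python
import numpy as np

def tikhonov_fb(grad_f, prox_g, x0, step=0.5, n_iters=2000):
    """Forward-backward splitting with Tikhonov regularization (sketch).

    Targets 0 in grad_f(x) + subdiff g(x). The vanishing parameters eps_n
    (eps_n -> 0, sum eps_n = inf) anchor the iterates toward the origin,
    which is the mechanism behind strong-convergence results of this type.
    """
    x = x0.copy()
    for n in range(1, n_iters + 1):
        eps = 1.0 / (n + 1)                 # assumed Tikhonov schedule
        z = (1.0 - eps) * x                 # Tikhonov shrinkage toward 0
        x = prox_g(z - step * grad_f(z))    # standard forward-backward step
    return x

# Toy problem: minimize 0.5*||x - c||^2 over the box [-1, 1]^2,
# i.e. grad_f(x) = x - c and prox_g = projection onto the box.
c = np.array([2.0, 0.3])
sol = tikhonov_fb(lambda x: x - c, lambda v: np.clip(v, -1.0, 1.0), np.zeros(2))
```

The plain forward-backward method (drop the `(1.0 - eps)` factor) would give the same limit here, but in general only weakly; the regularization is what upgrades this to strong convergence.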
First order algorithms in variational image processing
Variational methods in imaging are nowadays developing towards a quite
universal and flexible tool, allowing for highly successful approaches on tasks
like denoising, deblurring, inpainting, segmentation, super-resolution,
disparity, and optical flow estimation. The overall structure of such
approaches is of the form $\mathcal{D}(Ku) + \alpha \mathcal{R}(u) \to \min_u$,
where the functional $\mathcal{D}$ is a data fidelity term also depending on
some input data $f$ and measuring the deviation of $Ku$ from such, and
$\mathcal{R}$ is a regularization functional. Moreover, $K$ is an (often
linear) forward operator modeling the dependence of data on an underlying
image, and $\alpha$ is a positive regularization parameter. While $\mathcal{D}$ is often
smooth and (strictly) convex, the current practice almost exclusively uses
nonsmooth regularization functionals. The majority of successful techniques is
using nonsmooth and convex functionals like the total variation and
generalizations thereof or $\ell_1$-norms of coefficients arising from scalar
products with some frame system. The efficient solution of such variational
problems in imaging demands appropriate algorithms. Taking into account the
specific structure as a sum of two very different terms to be minimized,
splitting algorithms are a quite canonical choice. Consequently this field has
revived the interest in techniques like operator splittings or augmented
Lagrangians. Here we shall provide an overview of methods currently developed
and recent results as well as some computational studies providing a comparison
of different methods and also illustrating their success in applications.
Comment: 60 pages, 33 figures
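A concrete instance of the $\mathcal{D}(Ku) + \alpha\mathcal{R}(u)$ structure above, solved by forward-backward splitting, is the $\ell_1$-regularized least-squares model sketched below; $K$ is taken as a plain matrix, standing in for a blur or frame operator, and the step size choice is the standard $1/\|K\|^2$.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal map of t * ||.||_1 (componentwise soft shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(K, f, alpha, n_iters=500):
    """Forward-backward (ISTA) sketch for min_u 0.5*||K u - f||^2 + alpha*||u||_1.

    One instance of the D(Ku) + alpha*R(u) structure: D is the smooth quadratic
    fidelity, handled by a gradient step; R is the nonsmooth l1 regularizer,
    handled by its proximal map.
    """
    step = 1.0 / np.linalg.norm(K, 2) ** 2    # 1/L, L = Lipschitz constant of the gradient
    u = np.zeros(K.shape[1])
    for _ in range(n_iters):
        grad = K.T @ (K @ u - f)              # gradient of the fidelity term
        u = soft_threshold(u - step * grad, step * alpha)  # prox of alpha*R
    return u

# Sanity check: with K = I the minimizer is componentwise soft thresholding of f.
u_hat = ista(np.eye(3), np.array([3.0, 0.1, -2.0]), alpha=0.5)
```

Swapping the soft threshold for the proximal map of total variation (no longer closed-form, but computable) gives the TV-regularized deblurring setting the survey discusses.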