Convergence Rates with Inexact Non-expansive Operators
In this paper, we present a convergence rate analysis for the inexact
Krasnosel'skii-Mann iteration built from nonexpansive operators. Our results
include two main parts: we first establish global pointwise and ergodic
iteration-complexity bounds, and then, under a metric subregularity assumption,
we establish local linear convergence for the distance of the iterates to the
set of fixed points. The obtained iteration-complexity result can be applied to
analyze the convergence rate of various monotone operator splitting methods in
the literature, including the Forward-Backward, the Generalized
Forward-Backward, Douglas-Rachford, alternating direction method of multipliers
(ADMM) and Primal-Dual splitting methods. For these methods, we also develop
easily verifiable termination criteria for finding an approximate solution,
which can be seen as a generalization of the termination criterion for the
classical gradient descent method. We finally develop a parallel analysis for
the non-stationary Krasnosel'skii-Mann iteration. The usefulness of our results
is illustrated by applying them to a large class of structured monotone
inclusion and convex optimization problems. Experiments on some large-scale
inverse problems in signal and image processing are shown.

Comment: This is an extended version of the work presented in
http://arxiv.org/abs/1310.6636, and has been accepted by Mathematical
Programming.
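The Krasnosel'skii-Mann iteration analyzed above is simply an averaged fixed-point scheme, x_{k+1} = (1 - lam_k) x_k + lam_k T(x_k), for a nonexpansive operator T. A minimal sketch (the rotation operator and step size below are illustrative choices, not taken from the paper): a plane rotation is nonexpansive with unique fixed point 0, and while the plain iteration x_{k+1} = T(x_k) cycles forever, the KM-averaged iteration converges.

```python
import numpy as np

def km_iterate(T, x0, lam=0.5, n_iter=200):
    """Krasnosel'skii-Mann iteration: x <- (1 - lam) x + lam T(x)."""
    x = x0
    for _ in range(n_iter):
        x = (1 - lam) * x + lam * T(x)
    return x

# Example nonexpansive operator: a 90-degree rotation (an isometry).
# Its only fixed point is the origin.
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
T = lambda x: R @ x

x_star = km_iterate(T, np.array([1.0, 0.0]))
print(np.linalg.norm(x_star))  # distance to the fixed point 0, near machine zero
```

With lam = 0.5 the averaged map (I + R)/2 has spectral radius sqrt(2)/2 < 1, so the iterates contract geometrically toward the fixed point, mirroring the kind of linear rate the paper establishes under metric subregularity.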
Tight Global Linear Convergence Rate Bounds for Douglas-Rachford Splitting
Recently, several authors have shown local and global convergence rate
results for Douglas-Rachford splitting under strong monotonicity, Lipschitz
continuity, and cocoercivity assumptions. Most of these focus on the convex
optimization setting. In the more general monotone inclusion setting, Lions and
Mercier showed a linear convergence rate bound under the assumption that one of
the two operators is strongly monotone and Lipschitz continuous. We show that
this bound is not tight, meaning that no problem from the considered class
converges exactly with that rate. In this paper, we present tight global linear
convergence rate bounds for that class of problems. We also provide tight
linear convergence rate bounds under the assumptions that one of the operators
is strongly monotone and cocoercive, and that one of the operators is strongly
monotone and the other is cocoercive. All our linear convergence results are
obtained by proving the stronger property that the Douglas-Rachford operator is
contractive.
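To make the Douglas-Rachford iteration concrete, here is a minimal sketch on a two-set convex feasibility problem, where the two projections play the role of the resolvents of the two monotone operators. The specific sets (a line and a disk), the starting point, and the iteration count are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def proj_A(x):
    # Projection onto the line A = {x : x[1] = 0}.
    return np.array([x[0], 0.0])

center, radius = np.array([2.0, 1.0]), 2.0  # assumed disk B, for illustration

def proj_B(x):
    # Projection onto the disk B = {x : ||x - center|| <= radius}.
    d = x - center
    n = np.linalg.norm(d)
    return x if n <= radius else center + radius * d / n

def douglas_rachford(z, n_iter=500):
    """Douglas-Rachford splitting for finding a point in A ∩ B."""
    for _ in range(n_iter):
        x = proj_A(z)              # resolvent of the first operator
        y = proj_B(2 * x - z)      # reflected resolvent of the second
        z = z + y - x              # fixed-point update of the DR operator
    return proj_A(z)               # the "shadow" iterate lies near A ∩ B

x_sol = douglas_rachford(np.array([5.0, 5.0]))
print(x_sol)  # a point on the line A that also lies in the disk B
```

Since the line passes through the interior of the disk, the DR operator here behaves contractively near the solution set, which is the mechanism behind the tight linear rate bounds discussed above.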