Convergence Rates with Inexact Non-expansive Operators
In this paper, we present a convergence rate analysis for the inexact
Krasnosel'skii-Mann iteration built from nonexpansive operators. Our results
include two main parts: we first establish global pointwise and ergodic
iteration-complexity bounds, and then, under a metric subregularity assumption,
we establish local linear convergence for the distance of the iterates to the
set of fixed points. The obtained iteration-complexity result can be applied to
analyze the convergence rate of various monotone operator splitting methods in
the literature, including the Forward-Backward, the Generalized
Forward-Backward, Douglas-Rachford, alternating direction method of multipliers
(ADMM) and Primal-Dual splitting methods. For these methods, we also develop
easily verifiable termination criteria for finding an approximate solution,
which can be seen as a generalization of the termination criterion for the
classical gradient descent method. We finally develop a parallel analysis for
the non-stationary Krasnosel'skii-Mann iteration. The usefulness of our results
is illustrated by applying them to a large class of structured monotone
inclusion and convex optimization problems. Experiments on large-scale inverse problems in signal and image processing are shown.

Comment: This is an extended version of the work presented in http://arxiv.org/abs/1310.6636 and has been accepted by Mathematical Programming.
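To make the scheme concrete, here is a minimal Python sketch of the inexact Krasnosel'skii-Mann iteration x_{k+1} = x_k + lam * (T(x_k) + e_k - x_k), stopped on the fixed-point residual ||T(x_k) - x_k||. The operator T, the relaxation parameter lam, and the error sequence below are illustrative assumptions, not the paper's exact setup; with T = I - gamma*grad f the residual test reduces to the classical gradient-norm criterion that the abstract's termination criteria generalize.

    import numpy as np

    def inexact_km(T, x0, lam=0.5, err=lambda k: 0.0, max_iter=1000, tol=1e-8):
        # Inexact Krasnosel'skii-Mann iteration (a sketch):
        #   x_{k+1} = x_k + lam * (T(x_k) + e_k - x_k)
        # with T nonexpansive, lam in (0, 1), and e_k a summable error
        # sequence.  A non-stationary variant would let lam vary with k.
        x = np.asarray(x0, dtype=float)
        for k in range(max_iter):
            Tx = T(x) + err(k)               # inexact evaluation of T
            res = np.linalg.norm(Tx - x)     # fixed-point residual
            if res <= tol:                   # termination criterion
                break
            x = x + lam * (Tx - x)           # relaxed KM update
        return x, res

    # Illustration (hypothetical instance): T as the forward-backward
    # operator for the toy lasso problem
    #   min_x 0.5*||A x - b||^2 + mu*||x||_1.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 50))
    b = rng.standard_normal(20)
    mu = 0.1
    gamma = 1.0 / np.linalg.norm(A, 2) ** 2  # step size in (0, 2/L)

    def soft(v, t):                          # prox of t*||.||_1
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    T = lambda x: soft(x - gamma * A.T @ (A @ x - b), gamma * mu)
    # scalar errors broadcast over coordinates; here a summable sequence
    x_hat, res = inexact_km(T, np.zeros(50), err=lambda k: 1e-8 / (k + 1) ** 2)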
GMRES-Accelerated ADMM for Quadratic Objectives
We consider the sequence acceleration problem for the alternating direction
method-of-multipliers (ADMM) applied to a class of equality-constrained
problems with strongly convex quadratic objectives, which frequently arise as
the Newton subproblem of interior-point methods. Within this context, the ADMM
update equations are linear, the iterates are confined within a Krylov
subspace, and the Generalized Minimum RESidual (GMRES) algorithm is optimal in its
ability to accelerate convergence. The basic ADMM method solves a $\kappa$-conditioned problem in $O(\sqrt{\kappa})$ iterations. We give theoretical justification and numerical evidence that the GMRES-accelerated variant consistently solves the same problem in $O(\kappa^{1/4})$ iterations for an order-of-magnitude reduction in iterations, despite a worst-case bound of $O(\sqrt{\kappa})$ iterations. The method is shown to be competitive against
standard preconditioned Krylov subspace methods for saddle-point problems. The
method is embedded within SeDuMi, a popular open-source solver for conic
optimization written in MATLAB, and used to solve many large-scale semidefinite
programs with error that decreases like $O(1/k^2)$, instead of $O(1/k)$, where $k$ is the iteration index.

Comment: 31 pages, 7 figures. Accepted for publication in SIAM Journal on Optimization (SIOPT).
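Because the ADMM update equations are linear in this quadratic setting, one sweep has the affine form z -> M z + q, so the fixed point satisfies (I - M) z = q; the GMRES acceleration amounts to solving this system, whose Krylov subspace contains the plain ADMM (Picard) iterates. Below is a Python/SciPy sketch under that assumption; admm_step and the toy affine map are hypothetical stand-ins, not the paper's SeDuMi implementation.

    import numpy as np
    from scipy.sparse.linalg import LinearOperator, gmres

    def gmres_accelerated_admm(admm_step, n, z0=None):
        # admm_step(z) must implement one *affine* ADMM sweep
        # z -> M z + q, as holds for quadratic objectives.  The fixed
        # point z* = M z* + q solves (I - M) z = q, which GMRES attacks
        # over the same Krylov subspace the plain ADMM iterates span.
        q = admm_step(np.zeros(n))                  # M @ 0 + q = q
        mv = lambda z: z - (admm_step(z) - q)       # (I - M) @ z
        A = LinearOperator((n, n), matvec=mv)
        # Tolerance keywords changed across SciPy releases (tol -> rtol),
        # so defaults are used here for portability.
        z, info = gmres(A, q, x0=z0)
        return z, info                              # info == 0: converged

    # Toy affine map with ||M|| = 0.9 < 1 standing in for an ADMM sweep.
    rng = np.random.default_rng(0)
    M = rng.standard_normal((30, 30))
    M *= 0.9 / np.linalg.norm(M, 2)
    q = rng.standard_normal(30)
    z, info = gmres_accelerated_admm(lambda z: M @ z + q, 30)
    assert info == 0 and np.allclose(z, M @ z + q, atol=1e-4)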