We consider the sequence acceleration problem for the alternating direction
method of multipliers (ADMM) applied to a class of equality-constrained
problems with strongly convex quadratic objectives, which frequently arise as
the Newton subproblem of interior-point methods. Within this context, the ADMM
update equations are linear, the iterates are confined to a Krylov
subspace, and the Generalized Minimal RESidual (GMRES) algorithm is optimal in its
ability to accelerate convergence. The basic ADMM method solves a
κ-conditioned problem in O(√κ) iterations. We give
theoretical justification and numerical evidence that the GMRES-accelerated
variant consistently solves the same problem in O(κ^{1/4}) iterations
for an order-of-magnitude reduction in iterations, despite a worst-case bound
of O(√κ) iterations. The method is shown to be competitive against
standard preconditioned Krylov subspace methods for saddle-point problems. The
method is embedded within SeDuMi, a popular open-source solver for conic
optimization written in MATLAB, and used to solve many large-scale semidefinite
programs with error that decreases like O(1/k^2), instead of O(1/k),
where k is the iteration index.

Comment: 31 pages, 7 figures. Accepted for publication in SIAM Journal on Optimization (SIOPT).
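
To illustrate the mechanism the abstract describes, the following is a minimal Python sketch, not the paper's implementation: for a quadratic objective the ADMM update reduces to an affine fixed-point iteration x_{k+1} = M x_k + b, whose fixed point solves the linear system (I - M) x = b, so GMRES can be applied directly to that system. The matrix M, the vector b, and the problem size below are synthetic assumptions chosen only to make the sketch runnable.

    # Hypothetical sketch (not the paper's code): GMRES acceleration of an
    # affine fixed-point iteration x_{k+1} = M x_k + b, the form the ADMM
    # update takes for a quadratic objective. M and b are synthetic.
    import numpy as np
    from scipy.sparse.linalg import LinearOperator, gmres

    rng = np.random.default_rng(0)
    n = 200

    # Build a synthetic iteration matrix M with spectral radius 0.9,
    # so the plain fixed-point iteration converges, but slowly.
    A = rng.standard_normal((n, n)) / np.sqrt(n)
    M = 0.9 * A / np.abs(np.linalg.eigvals(A)).max()
    b = rng.standard_normal(n)

    # Plain (ADMM-like) fixed-point iteration.
    x_fp = np.zeros(n)
    for _ in range(500):
        x_fp = M @ x_fp + b

    # GMRES applied to the equivalent linear system (I - M) x = b.
    I_minus_M = LinearOperator((n, n), matvec=lambda v: v - M @ v)
    x_gmres, info = gmres(I_minus_M, b)

    print("GMRES exit flag:", info)  # 0 means converged
    print("difference:", np.linalg.norm(x_fp - x_gmres))

Because GMRES minimizes the residual over the same Krylov subspace that contains the fixed-point iterates, in exact arithmetic it can never require more matrix-vector products than the plain iteration to reach a given residual; this is the optimality property the abstract appeals to.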