48 research outputs found
GMRES-Accelerated ADMM for Quadratic Objectives
We consider the sequence acceleration problem for the alternating direction
method of multipliers (ADMM) applied to a class of equality-constrained
problems with strongly convex quadratic objectives, which frequently arise as
the Newton subproblem of interior-point methods. Within this context, the ADMM
update equations are linear, the iterates are confined to a Krylov subspace,
and the Generalized Minimal RESidual (GMRES) algorithm is optimal in its
ability to accelerate convergence. The basic ADMM method solves a
$\kappa$-conditioned problem in $O(\sqrt{\kappa})$ iterations. We give
theoretical justification and numerical evidence that the GMRES-accelerated
variant consistently solves the same problem in $O(\kappa^{1/4})$ iterations,
an order-of-magnitude reduction, despite a worst-case bound of
$O(\sqrt{\kappa})$ iterations. The method is shown to be competitive against
standard preconditioned Krylov subspace methods for saddle-point problems. The
method is embedded within SeDuMi, a popular open-source solver for conic
optimization written in MATLAB, and used to solve many large-scale semidefinite
programs with error that decreases like $O(1/k^2)$ instead of $O(1/k)$,
where $k$ is the iteration index.
Comment: 31 pages, 7 figures. Accepted for publication in SIAM Journal on
Optimization (SIOPT).
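The key structural fact, that a linear fixed-point iteration z <- Mz + c converges to the solution of (I - M)z = c and that GMRES applied to this system is optimal over the same Krylov subspace, can be illustrated with a generic sketch. This is a toy Richardson iteration on a synthetic SPD system, not the paper's ADMM splitting; all sizes and tolerances here are illustrative choices.

```python
import numpy as np
from scipy.sparse.linalg import gmres

rng = np.random.default_rng(0)
n = 200
# Synthetic ill-conditioned SPD system A x = b (stand-in for the
# strongly convex quadratic subproblem); condition number kappa = 1e3.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
eigs = np.logspace(0, 3, n)
A = Q @ np.diag(eigs) @ Q.T
b = rng.standard_normal(n)

# Linear fixed-point iteration z <- M z + c (optimal Richardson step).
t = 2.0 / (eigs.min() + eigs.max())
M = np.eye(n) - t * A          # iteration matrix, spectral radius < 1
c = t * b

tol = 1e-5 * np.linalg.norm(b)
z = np.zeros(n)
fp_iters = 0
while np.linalg.norm(A @ z - b) > tol:
    z = M @ z + c
    fp_iters += 1

# GMRES on the equivalent system (I - M) z = c: at step k it minimizes
# the residual over the whole k-dimensional Krylov subspace, so it can
# only improve on the plain iteration, which lives in the same subspace.
gm_iters = 0
def counter(pr_norm):          # called once per GMRES inner iteration
    global gm_iters
    gm_iters += 1

z_gm, info = gmres(np.eye(n) - M, c, restart=n,
                   callback=counter, callback_type="pr_norm")

print(f"fixed-point: {fp_iters} iterations, GMRES: {gm_iters} iterations")
```

On this example the plain iteration needs thousands of sweeps, while full (unrestarted) GMRES converges in far fewer, mirroring the order-of-magnitude gap described in the abstract.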
Ultrasparse Ultrasparsifiers and Faster Laplacian System Solvers
In this paper we provide a faster expected-time algorithm for solving
Laplacian systems on $n$-node, $m$-edge graphs, improving upon the previous
best expected runtime, achieved by Cohen, Kyng, Miller, Pachocki, Peng, Rao,
and Xu (2014). To obtain this result we provide efficient constructions of
low-stretch graph approximations with improved stretch and sparsity bounds.
Additionally, as motivation for this work, we show that for every set of
vectors in $\mathbb{R}^d$ (not just those induced by graphs) there exist
ultrasparsifiers of re-weighted vectors with bounded relative condition
number. For small condition numbers, this improves upon the previous best
known multiplicative factor, which was only known for the graph case.
Comment: 52 pages, comments welcome
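For context, the sketch below shows what "solving a Laplacian system" means on a small grid graph, using off-the-shelf conjugate gradient from SciPy. The paper's contribution is an asymptotically faster solver for very large graphs; nothing below reflects its construction, and the grid size and tolerances are illustrative.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import cg

def grid_laplacian(k):
    """Graph Laplacian L = D - A of a k-by-k grid graph."""
    P = sp.diags([1, 1], [-1, 1], shape=(k, k))        # path adjacency
    A = sp.kron(sp.eye(k), P) + sp.kron(P, sp.eye(k))  # grid adjacency
    deg = np.asarray(A.sum(axis=1)).ravel()
    return sp.diags(deg) - A

k = 30
n = k * k
L = grid_laplacian(k)

# L is singular (constant vectors are in its nullspace), so the demand
# vector b must sum to zero: inject one unit of current at node 0 and
# extract it at node n-1, as in an electrical-flow computation.
b = np.zeros(n)
b[0], b[-1] = 1.0, -1.0

x, info = cg(L, b, atol=1e-10, maxiter=5000)
x -= x.mean()   # pin down the free additive constant

print(info, np.linalg.norm(L @ x - b))
```

The resulting x gives the vertex potentials of the electrical flow; fast Laplacian solvers like the one in this paper replace the plain CG call with nearly-linear-time machinery built from sparsifiers.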
Conic Optimization Theory: Convexification Techniques and Numerical Algorithms
Optimization is at the core of control theory and appears in several areas of
this field, such as optimal control, distributed control, system
identification, robust control, state estimation, model predictive control and
dynamic programming. The recent advances in various topics of modern
optimization have also been revamping the area of machine learning. Motivated
by the crucial role of optimization theory in the design, analysis, control and
operation of real-world systems, this tutorial paper offers a detailed overview
of some major advances in this area, namely conic optimization and its emerging
applications. First, we discuss the importance of conic optimization in
different areas. Then, we explain seminal results on the design of hierarchies
of convex relaxations for a wide range of nonconvex problems. Finally, we study
different numerical algorithms for large-scale conic optimization problems.
Comment: 18 pages
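As a concrete instance of the convexification techniques surveyed, the standard Shor (first-level) semidefinite relaxation lifts a nonconvex quadratically constrained quadratic program to a convex problem. This is a textbook example, not a result specific to this paper:

```latex
\begin{aligned}
\text{(QCQP)}\quad &\min_{x\in\mathbb{R}^n}\ x^\top A_0 x
  \quad \text{s.t.}\quad x^\top A_i x \le b_i,\quad i=1,\dots,m,\\
\text{(lift)}\quad &X = x x^\top \iff X \succeq 0,\ \operatorname{rank}(X)=1,\\
\text{(SDP)}\quad &\min_{X \succeq 0}\ \operatorname{tr}(A_0 X)
  \quad \text{s.t.}\quad \operatorname{tr}(A_i X) \le b_i,\quad i=1,\dots,m.
\end{aligned}
```

Dropping the nonconvex rank constraint yields the convex SDP, whose optimal value lower-bounds the QCQP; the hierarchies discussed in the paper tighten this bound level by level.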
Inexact Interior-Point Methods for Large Scale Linear and Convex Quadratic Semidefinite Programming
Ph.D. thesis.