Design of First-Order Optimization Algorithms via Sum-of-Squares Programming
In this paper, we propose a framework based on sum-of-squares programming to
design iterative first-order optimization algorithms for smooth and strongly
convex problems. Our starting point is to develop a polynomial matrix
inequality as a sufficient condition for exponential convergence of the
algorithm. The entries of this matrix are polynomial functions of the unknown
parameters (exponential decay rate, stepsize, momentum coefficient, etc.). We
then formulate a polynomial optimization problem in which the objective is to optimize
the exponential decay rate over the parameters of the algorithm. Finally, we
use sum-of-squares programming as a tractable relaxation of the proposed
polynomial optimization problem. We illustrate the utility of the proposed
framework by designing a first-order algorithm that shares the same structure
as Nesterov's accelerated gradient method.
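
For context, Nesterov's accelerated gradient method, whose structure the design example targets, takes the standard two-step form

    y_k = x_k + \beta (x_k - x_{k-1}),
    x_{k+1} = y_k - \alpha \nabla f(y_k),

with stepsize \alpha and momentum coefficient \beta; in the framework above, the decay rate \rho and these parameters enter the matrix inequality polynomially. As a minimal sketch of the sum-of-squares step only, and not the authors' formulation, the snippet below uses cvxpy (a solver choice assumed here; the abstract names none) to certify that a univariate polynomial is a sum of squares by finding a positive semidefinite Gram matrix Q with p(t) = z(t)^T Q z(t) for z(t) = [1, t, t^2]. The paper applies the same machinery to polynomial matrix inequalities in (\rho, \alpha, \beta) rather than to a fixed scalar polynomial.

```python
# Sketch only: certify p(t) = t^4 - 2*t^2 + 1 = (t^2 - 1)^2 is a sum of squares.
# This illustrates the SOS/semidefinite relaxation idea, not the paper's exact program.
import cvxpy as cp

# Gram matrix for the monomial basis z(t) = [1, t, t^2].
Q = cp.Variable((3, 3), symmetric=True)

# Match coefficients of z(t)^T Q z(t) against p(t) = 1 - 2*t^2 + t^4.
constraints = [
    Q >> 0,                        # Q must be positive semidefinite
    Q[0, 0] == 1,                  # constant term
    2 * Q[0, 1] == 0,              # coefficient of t
    2 * Q[0, 2] + Q[1, 1] == -2,   # coefficient of t^2
    2 * Q[1, 2] == 0,              # coefficient of t^3
    Q[2, 2] == 1,                  # coefficient of t^4
]

# Feasibility of this semidefinite program certifies that p is a sum of squares.
problem = cp.Problem(cp.Minimize(0), constraints)
problem.solve()
print(problem.status, Q.value)
```

In the paper's setting, such a feasibility check would sit inside an outer optimization over the algorithm parameters and the decay rate \rho; a bisection on \rho is one common way to organize this, though the abstract does not specify the exact scheme.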