A Variational Perspective on Accelerated Methods in Optimization
Accelerated gradient methods play a central role in optimization, achieving
optimal rates in many settings. While many generalizations and extensions of
Nesterov's original acceleration method have been proposed, the natural scope
of the acceleration concept is not yet clear. In this paper, we study
accelerated methods from a continuous-time perspective. We show that there is a
Lagrangian functional, which we call the \emph{Bregman Lagrangian}, that
generates a large class of accelerated methods in continuous time, including
(but not limited to) accelerated gradient descent, its non-Euclidean extension,
and accelerated higher-order gradient methods. We show that the continuous-time
limit of all of these methods corresponds to traveling the same curve in
spacetime at different speeds. From this perspective, Nesterov's technique and
many of its generalizations can be viewed as a systematic way to go from the
continuous-time curves generated by the Bregman Lagrangian to a family of
discrete-time accelerated algorithms.

Comment: 38 pages. Subsumes an earlier working draft arXiv:1509.0361
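For concreteness, a sketch of the Bregman Lagrangian as it appears in this line of work (the notation here is an assumption and may differ from the paper's: $h$ is a convex distance-generating function, $f$ the objective, and $\alpha_t, \beta_t, \gamma_t$ time-dependent scaling parameters):
\[
\mathcal{L}(X, V, t) \;=\; e^{\alpha_t + \gamma_t}\Big( D_h\big(X + e^{-\alpha_t} V,\; X\big) \;-\; e^{\beta_t} f(X) \Big),
\qquad
D_h(y, x) = h(y) - h(x) - \langle \nabla h(x),\, y - x \rangle,
\]
where $D_h$ is the Bregman divergence of $h$. Under the ideal scaling conditions $\dot\beta_t \le e^{\alpha_t}$ and $\dot\gamma_t = e^{\alpha_t}$, the Euler-Lagrange equation of this functional yields a second-order ODE; in the Euclidean case $h(x) = \tfrac{1}{2}\|x\|^2$ with polynomial parameter choices, it recovers, up to constants, the ODE $\ddot{X}_t + \tfrac{3}{t}\dot{X}_t + \nabla f(X_t) = 0$ known as the continuous-time limit of Nesterov's accelerated gradient descent.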