
    The exact worst-case convergence rate of the gradient method with fixed step lengths for L-smooth functions

    In this paper, we study the convergence rate of the gradient (or steepest descent) method with fixed step lengths for finding a stationary point of an L-smooth function. We establish a new convergence rate and show that the bound may be exact in some cases. In addition, based on the bound, we derive an optimal step length.
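
A minimal sketch of the setting this abstract describes: the gradient method with a fixed step length h/L, run until the gradient norm is small (an approximate stationary point). The test function, starting point, and default step h = 1 (the classical 1/L choice) are illustrative assumptions; the optimal step length derived in the paper is not reproduced here.

```python
import numpy as np

def gradient_descent_fixed_step(grad, x0, L, h=1.0, max_iter=1000, tol=1e-8):
    """Gradient method with the fixed step length h / L for an L-smooth f.

    Stops once ||grad f(x)|| <= tol, i.e. at an approximate stationary
    point. h = 1.0 is the classical 1/L step, used here only as a
    placeholder default; the paper derives a better fixed choice.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:
            break
        x = x - (h / L) * g
    return x

# Toy usage: a quadratic f(x) = 0.5 * x^T A x, whose smoothness
# constant L is the largest eigenvalue of A.
A = np.diag([1.0, 4.0])
x_hat = gradient_descent_fixed_step(lambda x: A @ x, x0=[3.0, -2.0], L=4.0)
```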

    Worst-case convergence analysis of inexact gradient and Newton methods through semidefinite programming performance estimation

    We provide new tools for worst-case performance analysis of the gradient (or steepest descent) method of Cauchy for smooth strongly convex functions, and of Newton's method for self-concordant functions, including the case of inexact search directions. The analysis uses semidefinite programming performance estimation, as pioneered by Drori and Teboulle [Mathematical Programming, 145(1-2):451-482, 2014], and extends recent performance estimation results for the method of Cauchy by the authors [Optimization Letters, 11(7):1185-1199, 2017]. To illustrate the applicability of the tools, we demonstrate a novel complexity analysis of short-step interior point methods using inexact search directions. As an example in this framework, we sketch how to give a rigorous worst-case complexity analysis of a recent interior point method by Abernethy and Hazan [PMLR, 48:2520-2528, 2016].
    Comment: 22 pages, 1 figure. The title of an earlier version was "Worst-case convergence analysis of gradient and Newton methods through semidefinite programming performance estimation".
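
To make the semidefinite programming performance-estimation idea concrete, here is a minimal sketch (not the paper's strongly convex or Newton analyses) of the worst-case problem for a single gradient step on an L-smooth convex function, posed as an SDP over a Gram matrix with the standard interpolation inequalities of Taylor, Hendrickx, and Glineur. The constants L, h, R and all variable names are illustrative assumptions.

```python
import cvxpy as cp
import numpy as np

L, h, R = 1.0, 1.0, 1.0   # smoothness, normalized step, ||x0 - x*|| bound

# Gram matrix of the vectors (x0 - x*, g0, g1); PSD by construction.
G = cp.Variable((3, 3), PSD=True)
f = cp.Variable(2)        # function values f(x0), f(x1); f(x*) = 0

# Each point/gradient as a coordinate vector in the Gram basis.
e_x0 = np.array([1.0, 0.0, 0.0])
e_g0 = np.array([0.0, 1.0, 0.0])
e_g1 = np.array([0.0, 0.0, 1.0])
e_xs = np.zeros(3)             # x* sits at the origin of the basis
e_gs = np.zeros(3)             # the gradient vanishes at x*
e_x1 = e_x0 - (h / L) * e_g0   # one fixed-step gradient update

def inner(a, b):
    # <a, b> evaluated through the Gram matrix: a^T G b.
    return cp.sum(cp.multiply(np.outer(a, b), G))

# Interpolation conditions for L-smooth convex functions:
# f_i >= f_j + <g_j, x_i - x_j> + ||g_i - g_j||^2 / (2L) for all pairs.
pts = [(e_xs, e_gs, 0.0), (e_x0, e_g0, f[0]), (e_x1, e_g1, f[1])]
cons = [inner(e_x0, e_x0) <= R**2]
for i, (xi, gi, fi) in enumerate(pts):
    for j, (xj, gj, fj) in enumerate(pts):
        if i != j:
            cons.append(fi - fj - inner(gj, xi - xj)
                        - inner(gi - gj, gi - gj) / (2 * L) >= 0)

# Maximize the objective gap after one step; for h = 1 this should
# recover the known tight one-step bound L * R**2 / 6 of Drori-Teboulle.
prob = cp.Problem(cp.Maximize(f[1]), cons)
prob.solve()
print("worst-case f(x1) - f(x*):", prob.value)
```

The SDP is exact rather than merely an upper bound because the interpolation conditions fully characterize when the points, gradients, and function values can be realized by some L-smooth convex function; this is the key mechanism behind the performance-estimation approach the abstract describes.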