First-order methods of smooth convex optimization with inexact oracle

Abstract

In this paper, we analyze several first-order methods of smooth convex optimization that employ inexact first-order information. We introduce the notion of an approximate first-order oracle. Examples of such oracles include the smoothing technique, Moreau-Yosida regularization, Modified Lagrangians, and many others. For different methods, we derive complexity estimates and study the dependence between the desired accuracy in the objective function and the accuracy of the oracle. It appears that in the inexact case, the superiority of fast gradient methods over the classical ones is no longer absolute. Contrary to simple gradient schemes, fast gradient methods necessarily suffer from an accumulation of errors. Thus, the choice of method depends both on the desired accuracy and on the accuracy of the oracle. We present applications of our results to smooth convex-concave saddle point problems, to the analysis of Modified Lagrangians, to the prox-method, and to some others.

Keywords: smooth convex optimization, first-order methods, inexact oracle, gradient methods, fast gradient methods, complexity bounds
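
For orientation, here is a sketch of how such an approximate oracle is commonly formalized in this line of work; the symbols below are illustrative and the constants in the rates are indicative rather than exact quotations from the paper. A first-order (\delta, L)-oracle for a convex function f returns, at any query point y, a pair (f_\delta(y), g_\delta(y)) such that

    0 \le f(x) - f_\delta(y) - \langle g_\delta(y),\, x - y \rangle \le \frac{L}{2}\,\|x - y\|^2 + \delta \qquad \text{for all } x.

With such an oracle, the classical gradient method attains an accuracy of order

    f(x_k) - f^* \le O\!\left(\frac{L R^2}{k}\right) + O(\delta),

so the oracle error \delta does not accumulate, whereas a fast gradient method attains

    f(x_k) - f^* \le O\!\left(\frac{L R^2}{k^2}\right) + O(k\,\delta),

trading the faster rate in the first term for an error term that grows with the iteration count k. Here R denotes a bound on the distance from the starting point to an optimal solution x^*. This trade-off is what makes the choice of method depend on both the target accuracy and the oracle accuracy.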
