Primal-dual accelerated gradient methods with small-dimensional relaxation oracle
In this paper, a new variant of accelerated gradient descent is proposed. The
proposed method does not require any information about the objective function,
uses exact line search for practical acceleration of convergence, converges
according to the well-known lower bounds for both convex and non-convex
objective functions, possesses primal-dual properties, and can be applied in the
non-Euclidean set-up. As far as we know, this is the first such method possessing
all of the above properties at the same time. We also present a universal
version of the method which is applicable to non-smooth problems. We demonstrate
how in practice one can efficiently use the combination of line search and
primal-duality by considering a convex optimization problem with a simple
structure (for example, linearly constrained).
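
The abstract itself gives no pseudocode; purely as a rough illustration of the general idea it describes (accelerated gradient descent whose step size comes from an exact one-dimensional line search instead of a known Lipschitz constant), here is a minimal Python sketch. It uses a standard FISTA-style momentum rule, not the authors' small-dimensional relaxation oracle or primal-dual machinery, and the function names and the quadratic test problem are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def accelerated_gd_line_search(f, grad_f, x0, n_iters=200):
    """Illustrative accelerated gradient descent where each step size
    along the negative gradient is found by an exact 1D line search,
    so no Lipschitz constant of the gradient is needed."""
    x = x0.copy()
    y = x0.copy()
    t = 1.0  # FISTA-style momentum parameter
    for _ in range(n_iters):
        g = grad_f(y)
        # Exact line search: minimize the 1D restriction s -> f(y - s*g).
        step = minimize_scalar(lambda s: f(y - s * g)).x
        x_new = y - step * g
        # Standard Nesterov/FISTA momentum update.
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

# Hypothetical usage on a simple convex quadratic: minimize 0.5 x'Ax - b'x.
if __name__ == "__main__":
    A = np.diag([1.0, 10.0, 100.0])
    b = np.ones(3)
    f = lambda x: 0.5 * x @ A @ x - b @ x
    grad_f = lambda x: A @ x - b
    x_hat = accelerated_gd_line_search(f, grad_f, np.zeros(3))
    print(x_hat, np.linalg.solve(A, b))  # the two should nearly agree
```

The line search replaces the usual step rule 1/L, which is the practical point the abstract emphasizes: per-iteration cost rises slightly, but no smoothness parameter has to be known in advance.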