We consider the minimization of a cost function f on a manifold M using
Riemannian gradient descent and Riemannian trust regions (RTR). We focus on
satisfying necessary optimality conditions within a tolerance ε.
Specifically, we show that, under Lipschitz-type assumptions on the pullbacks
of f to the tangent spaces of M, both of these algorithms produce points
with Riemannian gradient smaller than ε in O(1/ε²)
iterations. Furthermore, RTR returns a point where, in addition, the least
eigenvalue of the Riemannian Hessian is larger than −ε in
O(1/ε³) iterations. There are no assumptions on initialization.
The rates match their (sharp) unconstrained counterparts as a function of the
accuracy ε (up to constants) and hence are sharp in that sense.
These are the first deterministic results for global rates of convergence to
approximate first- and second-order Karush-Kuhn-Tucker points on manifolds.
They apply in particular for optimization constrained to compact submanifolds
of R^n, under simpler assumptions.

Comment: 33 pages, IMA Journal of Numerical Analysis, 201
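For readers new to the setting, the following is a minimal sketch (not from the paper) of Riemannian gradient descent on the unit sphere, a compact submanifold of R^n. The cost function, the fixed step size, and the metric-projection retraction are illustrative assumptions; the paper's guarantees hold under Lipschitz-type assumptions on the pullbacks, which this sketch does not verify. The stopping test corresponds to the abstract's approximate first-order condition, ‖grad f(x)‖ ≤ ε.

```python
import numpy as np

def sphere_rgd(egrad, x0, step=0.1, eps=1e-6, max_iter=10000):
    """Riemannian gradient descent on the unit sphere S^{n-1} (illustrative).

    egrad: Euclidean gradient of the cost f : R^n -> R.
    step:  fixed step size; assumed small enough for the (unchecked)
           Lipschitz-type conditions on the pullbacks to hold.
    """
    x = x0 / np.linalg.norm(x0)
    for _ in range(max_iter):
        g = egrad(x)
        rgrad = g - (x @ g) * x           # project onto the tangent space T_x S^{n-1}
        if np.linalg.norm(rgrad) <= eps:  # approximate first-order condition met
            break
        y = x - step * rgrad              # step in the tangent direction
        x = y / np.linalg.norm(y)         # retract back onto the sphere
    return x

# Example: minimize the Rayleigh quotient f(x) = x^T A x over the sphere,
# whose minimizers are unit eigenvectors for the smallest eigenvalue of A.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))
A = (A + A.T) / 2
x = sphere_rgd(lambda x: 2 * A @ x, rng.standard_normal(6))
```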