122 research outputs found
Sufficient conditions for non-asymptotic convergence of Riemannian optimisation methods
Motivated by energy based analyses for descent methods in the Euclidean
setting, we investigate a generalisation of such analyses for descent methods
over Riemannian manifolds. In doing so, we find that it is possible to derive
curvature-free guarantees for such descent methods. This also enables us to
give the first known guarantees for a Riemannian cubic-regularised Newton
algorithm over geodesically convex functions, which extends the guarantees by Agarwal et
al. [2021] for an adaptive Riemannian cubic-regularised Newton algorithm over
general non-convex functions. This analysis leads us to study acceleration of
Riemannian gradient descent in the geodesically convex setting, and we improve on an
existing result by Alimisis et al. [2021], albeit with a curvature-dependent
rate. Finally, extending the analysis by Ahn and Sra [2020], we attempt to
provide some sufficient conditions for the acceleration of Riemannian descent
methods in the strongly geodesically convex setting.
Comment: Paper accepted at the OPT-ML Workshop, NeurIPS 202
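The descent methods analysed above can be pictured with a minimal sketch (this is not the paper's algorithm, and the objective, step size, and matrix below are illustrative assumptions): Riemannian gradient descent on the unit sphere, where the Euclidean gradient is projected onto the tangent space at the current point and a normalisation retraction maps the step back to the manifold.

```python
import numpy as np

# Minimal sketch of a Riemannian descent method: gradient descent on the
# unit sphere S^{n-1} for the Rayleigh quotient f(x) = x^T A x, whose
# minimiser over the sphere is an eigenvector of A's smallest eigenvalue.
# The matrix A and the step-size rule are illustrative choices.

def riemannian_gd_sphere(A, x0, iters=500):
    step = 1.0 / (2.0 * np.linalg.norm(A, 2))  # safe step for gradient 2*A*x
    x = x0 / np.linalg.norm(x0)
    for _ in range(iters):
        egrad = 2.0 * A @ x                 # Euclidean gradient of x^T A x
        rgrad = egrad - (x @ egrad) * x     # project onto tangent space at x
        x = x - step * rgrad                # descent step in the tangent space
        x = x / np.linalg.norm(x)           # retraction back onto the sphere
    return x

n = 20
A = np.diag(np.arange(1.0, n + 1.0))        # known spectrum 1, 2, ..., 20
rng = np.random.default_rng(0)
x = riemannian_gd_sphere(A, rng.standard_normal(n))
print(x @ A @ x)                            # Rayleigh quotient, near lambda_min = 1
```

The projection-plus-retraction structure is the generic template that the energy-based analyses in the paper apply to; only the retraction and the step-size rule change between methods.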
Riemannian Acceleration with Preconditioning for symmetric eigenvalue problems
In this paper, we propose a Riemannian Acceleration with Preconditioning
(RAP) method for symmetric eigenvalue problems, one of the most important
geodesically convex optimization problems on Riemannian manifolds, and establish its
acceleration. Firstly, the preconditioning for symmetric eigenvalue problems
from the Riemannian manifold viewpoint is discussed. In order to obtain the
local geodesic convexity, we develop the leading angle to measure the quality
of the preconditioner for symmetric eigenvalue problems. A new Riemannian
acceleration, called the Locally Optimal Riemannian Accelerated Gradient (LORAG)
method, is proposed to cope with the merely local geodesic convexity of symmetric
eigenvalue problems. Using techniques similar to those for RAGD and the analysis of
locally convex optimization in Euclidean space, we analyze the convergence of LORAG.
Combining the local geodesic convexity of symmetric eigenvalue problems
under preconditioning with LORAG, we propose the Riemannian Acceleration
with Preconditioning (RAP) method and prove its acceleration. Additionally, when the
Schwarz preconditioner, especially the overlapping or non-overlapping domain
decomposition method, is applied for elliptic eigenvalue problems, we also
obtain a rate of convergence with a constant independent of the mesh sizes
and the eigenvalue gap, depending on the parameter from the stable decomposition
and on the smallest two eigenvalues of the elliptic operator. Numerical results
show the power of Riemannian acceleration and preconditioning.
Comment: Due to the abstract length limit on arXiv, the abstract here is shorter than in the PDF.
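The role of a preconditioner in symmetric eigenvalue computations can be sketched with the classical preconditioned inverse iteration (PINVIT). This is not the paper's RAP or LORAG method; the test matrix (a 1D Laplacian plus a diagonal potential) and the Jacobi preconditioner M = diag(A) are illustrative assumptions.

```python
import numpy as np

# Sketch: preconditioned inverse iteration (PINVIT) for the smallest
# eigenpair of a symmetric matrix A. Each step corrects the iterate by a
# preconditioned eigenvalue residual; a good preconditioner M ~ A makes
# the convergence rate independent of problem-specific conditioning.

def pinvit(A, Minv, x0, iters=1000):
    x = x0 / np.linalg.norm(x0)
    for _ in range(iters):
        lam = x @ A @ x                 # Rayleigh quotient (current estimate)
        r = A @ x - lam * x             # eigenvalue residual
        x = x - Minv(r)                 # preconditioned correction step
        x = x / np.linalg.norm(x)
    return x @ A @ x, x

n = 50
main = 2.0 + np.arange(1, n + 1)        # diagonal potential keeps diag(A) dominant
A = np.diag(main) - np.diag(np.ones(n - 1), 1) - np.diag(np.ones(n - 1), -1)
Minv = lambda r: r / main               # Jacobi preconditioner M = diag(A)

lam, x = pinvit(A, Minv, np.ones(n))
print(lam)                              # estimate of the smallest eigenvalue of A
```

The abstract's RAP method combines such preconditioning with Riemannian momentum to obtain an accelerated, gap-robust rate; PINVIT above shows only the un-accelerated preconditioned baseline.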