In this work, we study optimization problems of the form $\min_x \max_y f(x, y)$, where $f(x, y)$ is defined on a product Riemannian manifold $\mathcal{M} \times \mathcal{N}$ and is $\mu_x$-strongly geodesically convex (g-convex) in
$x$ and $\mu_y$-strongly g-concave in $y$, for $\mu_x, \mu_y \geq 0$. We design
accelerated methods when $f$ is $(L_x, L_y, L_{xy})$-smooth and $\mathcal{M}$,
$\mathcal{N}$ are Hadamard. To that aim, we introduce new g-convex optimization
results, of independent interest: we show global linear convergence for
metric-projected Riemannian gradient descent and improve existing accelerated
methods by reducing geometric constants. Additionally, we complete the analysis
of two previous works applying to the Riemannian min-max case by removing an
assumption that the iterates stay in a pre-specified compact set.
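Since Euclidean space is itself a Hadamard manifold, the classical strongly-convex-strongly-concave saddle-point problem is a special case of the setting above. The following is a minimal illustrative sketch (not the paper's accelerated method): plain simultaneous gradient descent-ascent on a hypothetical quadratic coupling $f(x, y) = \frac{\mu_x}{2}\|x\|^2 - \frac{\mu_y}{2}\|y\|^2 + L_{xy}\, x^\top y$, whose unique saddle point is the origin. The constants `mu_x`, `mu_y`, `L_xy`, and the step size are illustrative choices, not values from the paper.

```python
import numpy as np

# Hypothetical strongly-convex-strongly-concave quadratic (illustration only):
#   f(x, y) = (mu_x/2)||x||^2 - (mu_y/2)||y||^2 + L_xy * <x, y>
mu_x, mu_y, L_xy = 1.0, 1.0, 0.5

def grad_x(x, y):
    # Gradient of f with respect to x.
    return mu_x * x + L_xy * y

def grad_y(x, y):
    # Gradient of f with respect to y.
    return -mu_y * y + L_xy * x

def gda(x, y, step=0.2, iters=200):
    # Simultaneous gradient descent in x and ascent in y.
    for _ in range(iters):
        x, y = x - step * grad_x(x, y), y + step * grad_y(x, y)
    return x, y

x, y = gda(np.array([3.0]), np.array([-2.0]))
print(float(np.abs(x)), float(np.abs(y)))  # both iterates shrink toward the saddle at 0
```

With these constants the per-iteration update is a linear contraction (spectral radius about 0.806), so the iterates converge linearly to the saddle point, mirroring the kind of global linear convergence guarantee the abstract describes for metric-projected Riemannian gradient descent in the general g-convex setting.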