Lower-bound analyses for nonconvex strongly-concave minimax optimization
problems have shown that stochastic first-order algorithms require at least
O(ε^{-4}) oracle calls to find an ε-stationary point. Some works indicate that
this complexity can be improved to O(ε^{-3}) when the loss gradient is
Lipschitz continuous. Whether improved convergence rates can be achieved under
other conditions remains an open question. In this work, we address this
question for
optimization problems that are nonconvex in the minimization variable and
strongly concave or Polyak-Łojasiewicz (PL) in the maximization variable. We
introduce novel bias-corrected momentum algorithms utilizing efficient
Hessian-vector products. We establish convergence conditions and demonstrate
an improved iteration complexity of O(ε^{-3}) for the proposed algorithms. The
effectiveness of the method is validated through applications to robust
logistic regression on real-world datasets.
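
As a purely illustrative sketch, the snippet below shows the two ingredients
mentioned above, a bias-corrected (STORM-style) momentum estimator and an
efficient Hessian-vector product, on a toy nonconvex-strongly-concave
objective in JAX. It is an assumption-laden illustration, not the paper's
algorithm; the objective `loss`, the step sizes `eta_x`, `eta_y`, and the
momentum weight `beta` are all hypothetical.

    # Illustrative sketch (not the paper's algorithm): bias-corrected momentum
    # (STORM-style) updates for min_x max_y f(x, y), plus an efficient
    # Hessian-vector product, written in JAX. All names and step sizes are
    # assumptions made for this example.
    import jax
    import jax.numpy as jnp

    def loss(x, y, batch):
        # Toy objective: nonconvex in x (tanh), strongly concave in y.
        a, b = batch
        return jnp.sum(jnp.tanh(a @ x)) + y @ (a @ x - b) - 0.5 * jnp.sum(y ** 2)

    grad_x = jax.grad(loss, argnums=0)
    grad_y = jax.grad(loss, argnums=1)

    def hvp_yy(x, y, batch, v):
        # Hessian-vector product (d^2 f / dy^2) v via forward-over-reverse mode;
        # costs roughly one extra gradient evaluation instead of a full Hessian.
        return jax.jvp(lambda yy: grad_y(x, yy, batch), (y,), (v,))[1]

    def momentum_step(x, y, mx, my, x_prev, y_prev, batch,
                      eta_x=1e-2, eta_y=1e-1, beta=0.1):
        # STORM-style bias correction: the new estimate reuses the same batch
        # at the current and previous iterates, which cancels most of the bias
        # of plain momentum while keeping the variance low.
        mx = grad_x(x, y, batch) + (1 - beta) * (mx - grad_x(x_prev, y_prev, batch))
        my = grad_y(x, y, batch) + (1 - beta) * (my - grad_y(x_prev, y_prev, batch))
        return x - eta_x * mx, y + eta_y * my, mx, my

Under this sketch, one would initialize mx and my with full gradients on the
first batch and then alternate momentum_step calls, carrying the previous
iterates along. The Hessian-vector product hvp_yy is included only to show how
such products can be formed cheaply; the precise way they enter the update is
part of the paper's construction.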