An accelerated first-order regularized momentum descent ascent algorithm for stochastic nonconvex-concave minimax problems

Abstract

Stochastic nonconvex minimax problems have attracted wide attention in machine learning, signal processing, and many other fields in recent years. In this paper, we propose an accelerated first-order regularized momentum descent ascent algorithm (FORMDA) for solving stochastic nonconvex-concave minimax problems. The iteration complexity of the algorithm is proved to be $\tilde{\mathcal{O}}(\varepsilon^{-6.5})$ for obtaining an $\varepsilon$-stationary point, which achieves the best-known complexity bound for single-loop algorithms solving stochastic nonconvex-concave minimax problems under the stationarity of the objective function.
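
To fix ideas, the sketch below shows a generic single-loop stochastic descent-ascent step with momentum and a simple quadratic regularization on the concave variable for a problem of the form $\min_x \max_{y \in Y} \mathbb{E}_\xi[f(x, y; \xi)]$. This is not the paper's FORMDA update; the step sizes, momentum schedule, regularization, and projection set are illustrative assumptions only.

```python
# Hypothetical sketch of one single-loop momentum descent-ascent scheme for
# min_x max_{y in Y} E_xi[ f(x, y; xi) ].  NOT the paper's FORMDA algorithm;
# all hyperparameters and the choice of Y are placeholder assumptions.
import numpy as np

def project_ball(y, radius=1.0):
    """Project y onto a Euclidean ball, a stand-in for the concave player's set Y."""
    norm = np.linalg.norm(y)
    return y if norm <= radius else y * (radius / norm)

def momentum_descent_ascent(grad_x, grad_y, x0, y0, steps=1000,
                            eta_x=1e-3, eta_y=1e-2, beta=0.9, reg=1e-3):
    """Generic momentum descent-ascent loop (illustrative only).

    grad_x, grad_y: callables returning stochastic gradients w.r.t. x and y.
    reg: quadratic regularization coefficient on y (an assumption here).
    """
    x, y = x0.copy(), y0.copy()
    mx, my = np.zeros_like(x), np.zeros_like(y)
    for _ in range(steps):
        gx = grad_x(x, y)                 # stochastic gradient in x (minimization side)
        gy = grad_y(x, y) - reg * y       # regularized stochastic gradient in y (maximization side)
        mx = beta * mx + (1 - beta) * gx  # momentum (moving-average) gradient estimates
        my = beta * my + (1 - beta) * gy
        x = x - eta_x * mx                # descent step on x
        y = project_ball(y + eta_y * my)  # projected ascent step on y
    return x, y
```

In this style of single-loop method, the momentum averaging of the stochastic gradients is what reduces the variance of the updates; the specific accelerated and regularized construction analyzed in the paper is what yields the stated $\tilde{\mathcal{O}}(\varepsilon^{-6.5})$ complexity.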
