Decentralized projected Riemannian gradient method for smooth optimization on compact submanifolds

Abstract

We consider decentralized nonconvex optimization over a compact submanifold, where each local agent's objective function, defined by its local dataset, is smooth. Leveraging the powerful tool of proximal smoothness, we establish local linear convergence of the projected gradient descent method with unit step size for solving the consensus problem over the compact manifold; this serves as the basis for analyzing decentralized algorithms on manifolds. We then propose two decentralized methods, the decentralized projected Riemannian gradient descent (DPRGD) method and the decentralized projected Riemannian gradient tracking (DPRGT) method, and establish their convergence rates of $\mathcal{O}(1/\sqrt{K})$ and $\mathcal{O}(1/K)$, respectively, for reaching a stationary point. To the best of our knowledge, DPRGT is the first decentralized algorithm to achieve exact convergence for decentralized optimization over a compact manifold. The key ingredients of the proof are Lipschitz-type inequalities for the projection operator onto the compact manifold and for smooth functions on the manifold, which may be of independent interest. Finally, we demonstrate the effectiveness of the proposed methods compared with state-of-the-art ones through numerical experiments on eigenvalue problems and low-rank matrix completion.

Comment: 32 pages
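To make the general scheme concrete, the following is a minimal Python sketch of a decentralized projected gradient update on the Stiefel manifold (the setting of the eigenvalue experiments), not the paper's exact recursion or step-size choice. It assumes a doubly stochastic mixing matrix `W`, uses the SVD-based nearest-point projection onto the Stiefel manifold, and the names `proj_stiefel`, `dprgd`, and the toy eigenvalue data are purely illustrative.

```python
import numpy as np

def proj_stiefel(Y):
    """Nearest-point projection onto the Stiefel manifold St(n, r):
    the polar factor of Y, computed via a thin SVD."""
    U, _, Vt = np.linalg.svd(Y, full_matrices=False)
    return U @ Vt

def dprgd(local_grads, X0, W, alpha=0.05, iters=500):
    """Sketch of a decentralized projected gradient iteration (assumed form):
    each agent averages its neighbors' iterates with the mixing matrix W,
    takes a local gradient step, and projects back onto the manifold."""
    X = [x.copy() for x in X0]          # one Stiefel point per agent
    n_agents = len(X)
    for _ in range(iters):
        X_new = []
        for i in range(n_agents):
            # gossip/consensus averaging with the doubly stochastic W
            mix = sum(W[i, j] * X[j] for j in range(n_agents))
            # Euclidean gradient of the local objective; a Riemannian
            # gradient (tangent-space projection) could be used instead
            g = local_grads[i](X[i])
            X_new.append(proj_stiefel(mix - alpha * g))
        X = X_new
    return X

if __name__ == "__main__":
    # toy decentralized eigenvalue problem: minimize -tr(X^T A_i X) per agent
    rng = np.random.default_rng(0)
    n, r, m = 20, 3, 4
    A = [(lambda M: (M + M.T) / 2)(rng.standard_normal((n, n))) for _ in range(m)]
    grads = [(lambda Ai: (lambda X: -2.0 * Ai @ X))(Ai) for Ai in A]
    W = np.full((m, m), 1.0 / m)        # complete-graph mixing for illustration
    X0 = [proj_stiefel(rng.standard_normal((n, r))) for _ in range(m)]
    X_final = dprgd(grads, X0, W)
```

A gradient-tracking variant (in the spirit of DPRGT) would additionally maintain, per agent, a tracked estimate of the average gradient and mix it alongside the iterates; the sketch above only illustrates the projection-plus-consensus structure.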
