17 research outputs found

    Stochastic gradient descent on Riemannian manifolds

    Full text link
    Stochastic gradient descent is a simple approach for finding local minima of a cost function whose evaluations are corrupted by noise. In this paper, we develop a procedure extending stochastic gradient descent algorithms to the case where the function is defined on a Riemannian manifold. We prove that, as in the Euclidean case, the gradient descent algorithm converges to a critical point of the cost function. The algorithm has numerous potential applications, and is illustrated here by four examples. In particular, a novel gossip algorithm on the set of covariance matrices is derived and tested numerically. Comment: A slightly shorter version has been published in IEEE Transactions on Automatic Control
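    The basic iteration described in this abstract is easy to sketch: take a noisy Euclidean gradient, project it onto the tangent space at the current point, and retract the step back onto the manifold. Below is a minimal, hedged illustration on the unit sphere; the cost function, the step-size schedule gamma_t = a/(b+t), and the normalization retraction are assumptions made for the example, not the paper's own code.

```python
# Hedged sketch of Riemannian SGD on the unit sphere (illustrative, not the paper's code).
import numpy as np

def tangent_project(x, g):
    # Project a Euclidean gradient g onto the tangent space of the unit sphere at x.
    return g - np.dot(g, x) * x

def retract(x, v):
    # Retraction on the sphere: move along the tangent direction, then renormalize.
    y = x + v
    return y / np.linalg.norm(y)

def riemannian_sgd(grad_sample, x0, steps=2000, a=1.0, b=10.0):
    # Stochastic Riemannian gradient descent with step sizes gamma_t = a / (b + t).
    x = x0 / np.linalg.norm(x0)
    for t in range(steps):
        g = tangent_project(x, grad_sample(x))   # noisy Riemannian gradient
        x = retract(x, -(a / (b + t)) * g)       # descent step, staying on the manifold
    return x

# Toy run (assumed example): track the leading eigenvector of a covariance matrix
# from gradients corrupted by noise.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
C = A @ A.T
grad_sample = lambda x: -2.0 * (C + 0.1 * rng.standard_normal((5, 5))) @ x
x_hat = riemannian_sgd(grad_sample, rng.standard_normal(5))
```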

    Decentralized projected Riemannian gradient method for smooth optimization on compact submanifolds

    Full text link
    We consider the problem of decentralized nonconvex optimization over a compact submanifold, where each local agent's objective function, defined by its local dataset, is smooth. Leveraging the powerful tool of proximal smoothness, we establish local linear convergence of the projected gradient descent method with unit step size for solving the consensus problem over the compact manifold. This serves as the basis for analyzing decentralized algorithms on manifolds. Then, we propose two decentralized methods, namely the decentralized projected Riemannian gradient descent (DPRGD) and the decentralized projected Riemannian gradient tracking (DPRGT) methods. We establish their convergence rates of O(1/√K) and O(1/K), respectively, to reach a stationary point. To the best of our knowledge, DPRGT is the first decentralized algorithm to achieve exact convergence for solving decentralized optimization over a compact manifold. The key ingredients in the proof are Lipschitz-type inequalities for the projection operator on the compact manifold and for smooth functions on the manifold, which could be of independent interest. Finally, we demonstrate the effectiveness of our proposed methods compared to state-of-the-art ones through numerical experiments on eigenvalue problems and low-rank matrix completion. Comment: 32 pages
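    As a rough illustration of a DPRGD-style update (consensus averaging over the network, a local gradient step, then projection back onto the manifold), here is a hedged sketch on the Stiefel manifold. The mixing matrix W, the step size, the polar-decomposition projection, and the decentralized-PCA toy problem are assumptions for this example, not the authors' implementation.

```python
# Hedged sketch of a DPRGD-style iteration on the Stiefel manifold (illustrative only).
import numpy as np

def proj_stiefel(Y):
    # Projection onto the Stiefel manifold (nearest orthonormal frame) via the polar factor.
    U, _, Vt = np.linalg.svd(Y, full_matrices=False)
    return U @ Vt

def dprgd(local_grads, X0, W, steps=200, alpha=0.01):
    # local_grads: one Euclidean gradient function per agent (assumed interface).
    # X0: initial n x p matrices, one per agent; W: doubly stochastic mixing matrix.
    X = [proj_stiefel(Xi) for Xi in X0]
    m = len(X)
    for _ in range(steps):
        # Average with neighbors (consensus), take a local gradient step, project back.
        mixed = [sum(W[i, j] * X[j] for j in range(m)) for i in range(m)]
        X = [proj_stiefel(mixed[i] - alpha * local_grads[i](X[i])) for i in range(m)]
    return X

# Toy decentralized PCA: each agent holds part of the data, all estimate a shared basis.
rng = np.random.default_rng(1)
data = [rng.standard_normal((50, 8)) for _ in range(4)]
local_grads = [(lambda D: (lambda X: -D.T @ D @ X))(D) for D in data]
W = np.full((4, 4), 0.25)                      # complete graph with uniform weights
X0 = [rng.standard_normal((8, 2)) for _ in range(4)]
X_final = dprgd(local_grads, X0, W)
```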

    International Conference on Continuous Optimization (ICCOPT) 2019 Conference Book

    Get PDF
    The Sixth International Conference on Continuous Optimization took place on the campus of the Technical University of Berlin, August 3-8, 2019. ICCOPT is a flagship conference of the Mathematical Optimization Society (MOS), organized every three years. ICCOPT 2019 was hosted by the Weierstrass Institute for Applied Analysis and Stochastics (WIAS) Berlin. It included a Summer School and a Conference with a series of plenary and semi-plenary talks, organized and contributed sessions, and poster sessions. This book comprises the full conference program: the scientific program, both in overview and in full detail, as well as information on the social program, the venue, special meetings, and more.