3 research outputs found

    Sampling and Optimization on Convex Sets in Riemannian Manifolds of Non-Negative Curvature

    The Euclidean notion of convex sets (and functions) generalizes naturally to Riemannian manifolds, where it is called geodesic convexity. Extensively studied computational problems such as convex optimization and sampling over convex sets also have meaningful counterparts in the manifold setting. Geodesically convex optimization is a well-studied problem with ongoing research and considerable recent interest in machine learning and theoretical computer science. In this paper, we study sampling and convex optimization problems over manifolds of non-negative curvature, proving running times polynomial in the dimension and other relevant parameters. Our algorithms assume a warm start. We first present a random-walk-based sampling algorithm and then combine it with simulated annealing to solve convex optimization problems. To our knowledge, these are the first algorithms in the general setting of positively curved manifolds with provable polynomial guarantees under reasonable assumptions, and the first study of the connection between sampling and optimization in this setting. Comment: Appeared at COLT 2019
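    As a rough illustration of the pipeline this abstract describes (a geodesic random walk with a Metropolis filter, wrapped in simulated annealing), here is a minimal Python sketch on the unit sphere, a manifold of positive curvature. The objective, step size, and cooling schedule are hypothetical choices for illustration only; this is not the paper's algorithm or its guarantees.

        import numpy as np

        rng = np.random.default_rng(0)

        def exp_map(x, v):
            """Exponential map on the unit sphere: walk from x along the geodesic with velocity v."""
            nv = np.linalg.norm(v)
            return x if nv < 1e-16 else np.cos(nv) * x + np.sin(nv) * (v / nv)

        def geodesic_walk_step(x, f, temp, step=0.1):
            """One Metropolis-filtered geodesic random-walk step targeting exp(-f/temp)."""
            v = rng.standard_normal(x.shape)
            v -= np.dot(v, x) * x              # project the proposal onto the tangent space at x
            y = exp_map(x, step * v)
            # The geodesic-walk proposal is symmetric on the sphere, so a plain Metropolis filter suffices.
            return y if np.log(rng.random()) < (f(x) - f(y)) / temp else x

        def anneal(f, x0, temps, steps_per_temp=200):
            """Simulated annealing: run the sampler at decreasing temperatures, keep the best point."""
            x = best = x0
            for temp in temps:
                for _ in range(steps_per_temp):
                    x = geodesic_walk_step(x, f, temp)
                    if f(x) < f(best):
                        best = x
            return best

        # Toy geodesically convex objective on S^2; x0 plays the role of the warm start.
        c = np.array([0.0, 0.0, -1.0])
        f = lambda x: float(np.dot(c, x))
        x_hat = anneal(f, np.array([1.0, 0.0, 0.0]), temps=np.geomspace(1.0, 1e-3, 10))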

    Newton retraction as approximate geodesics on submanifolds

    Efficient approximation of geodesics is crucial for practical algorithms on manifolds. Here we introduce a class of retractions on submanifolds, induced by a foliation of the ambient manifold. They match the projective retraction to the third order and thus match the exponential map to the second order. In particular, we show that Newton retraction (NR) is always more stable than the popular approach known as oblique projection or orthographic retraction: per Kantorovich-type convergence theorems, the superlinear convergence regions of NR include those of the latter. We also show that NR always has a lower computational cost. These favorable properties make NR useful for optimization, sampling, and many other statistical problems on manifolds. Comment: 9 pages, 2 figures, 1 table
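    To make the comparison concrete, here is a hedged Python sketch of a Newton retraction for a submanifold written as a level set {y : c(y) = 0}: starting from the off-manifold point x + v, it applies repeated Newton corrections along the normal directions at x, the same directions the oblique (orthographic) projection uses, but iterated to convergence. Function names and tolerances are illustrative, not the paper's notation.

        import numpy as np

        def newton_retraction(c, jac, x, v, tol=1e-12, max_iter=50):
            """Retract x + v onto {y : c(y) = 0} by solving c(x + v + jac(x).T @ lam) = 0 for lam."""
            J_x = jac(x)                       # rows of J_x span the normal space at x
            lam = np.zeros(J_x.shape[0])
            y = x + v
            for _ in range(max_iter):
                r = c(y)
                if np.linalg.norm(r) < tol:
                    break
                # Newton step on lam: solve (jac(y) @ J_x.T) dlam = -r
                lam += np.linalg.solve(jac(y) @ J_x.T, -r)
                y = x + v + J_x.T @ lam
            return y

        # Example: the unit sphere as the level set c(y) = (|y|^2 - 1)/2, with Jacobian y^T.
        c = lambda y: np.array([0.5 * (np.dot(y, y) - 1.0)])
        jac = lambda y: y.reshape(1, -1)
        x = np.array([1.0, 0.0, 0.0])
        v = np.array([0.0, 0.3, 0.0])          # a tangent vector at x
        y = newton_retraction(c, jac, x, v)    # lands back on the sphere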

    From Nesterov's Estimate Sequence to Riemannian Acceleration

    We propose the first global accelerated gradient method for Riemannian manifolds. Toward establishing our result, we revisit Nesterov's estimate sequence technique and develop an alternative analysis for it that may also be of independent interest. We then extend this analysis to the Riemannian setting, localizing the key difficulty due to non-Euclidean structure into a certain "metric distortion." We control this distortion by developing a novel geometric inequality, which permits us to propose and analyze a Riemannian counterpart to Nesterov's accelerated gradient method. Comment: 30 pages
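    For intuition about what a Riemannian analogue of Nesterov's scheme can look like, below is a minimal Python sketch of an accelerated-style iteration on the unit sphere, where the exponential and logarithm maps replace the Euclidean vector operations of the classical method. The momentum coupling here is a common textbook-style simplification, not the estimate-sequence construction or the metric-distortion control developed in the paper.

        import numpy as np

        def exp_map(x, v):
            nv = np.linalg.norm(v)
            return x if nv < 1e-16 else np.cos(nv) * x + np.sin(nv) * (v / nv)

        def log_map(x, y):
            """Inverse of exp_map: the tangent vector at x pointing to y along the geodesic."""
            cos_t = np.clip(np.dot(x, y), -1.0, 1.0)
            p = y - cos_t * x                  # tangent component of y at x
            n = np.linalg.norm(p)
            return np.zeros_like(x) if n < 1e-16 else np.arccos(cos_t) * p / n

        def rgrad(egrad, x):
            """Riemannian gradient on the sphere: tangent projection of the Euclidean gradient."""
            g = egrad(x)
            return g - np.dot(g, x) * x

        def accelerated_descent(egrad, x0, lr=0.1, beta=0.9, iters=100):
            x = z = x0
            for _ in range(iters):
                y = exp_map(x, beta * log_map(x, z))      # Nesterov-style lookahead toward z
                g = rgrad(egrad, y)
                x = exp_map(y, -lr * g)                   # gradient step from the lookahead point
                z = exp_map(y, -(lr / (1.0 - beta)) * g)  # larger step that carries the momentum
            return x

        # Toy problem: minimize f(x) = <c, x> on S^2; the minimizer is -c.
        c = np.array([1.0, 2.0, 2.0]) / 3.0
        x_min = accelerated_descent(lambda x: c, np.array([1.0, 0.0, 0.0]))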