
    Finding geodesics joining given points

    Finding a geodesic joining two given points in a complete path-connected Riemannian manifold requires much more effort than determining a geodesic from initial data. This is because it is much harder to solve boundary value problems than initial value problems. Shooting methods attempt to solve boundary value problems by solving a sequence of initial value problems, and usually need a good initial guess to succeed. The present paper finds a geodesic γ : [0, 1] → M on the Riemannian manifold M with γ(0) = x0 and γ(1) = x1 by dividing the interval [0, 1] into several sub-intervals, preferably just enough to enable a good initial guess for the boundary value problem on each sub-interval. Then a geodesic joining consecutive endpoints (local junctions) is found by single shooting. Our algorithm then adjusts the junctions, either (1) by minimizing the total squared norm of the differences between associated geodesic velocities using Riemannian gradient descent, or (2) by solving a nonlinear system of equations using Newton's method. Our algorithm is compared with the known leapfrog algorithm in numerical experiments on the 2-dimensional ellipsoid Ell(2) and on the 3-dimensional special orthogonal group SO(3) with a left-invariant metric. We find that Newton's method (2) converges much faster than leapfrog when more junctions are needed, and that a good initial guess for (2) can be found by starting with the Riemannian gradient descent method (1).
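The single-shooting step described above can be sketched on a concrete manifold. Below is a minimal, hypothetical Python illustration (not the paper's code): it integrates the unit-sphere geodesic equation p'' = -|p'|^2 p with RK4 and adjusts the initial velocity by a finite-difference Gauss-Newton iteration until the endpoint γ(1) lands on x1. All function names, the chord-based initial guess, and the parameter choices are illustrative assumptions.

```python
import numpy as np

def geodesic_rhs(y):
    # State y = (p, v) in R^6; geodesic equation on the unit sphere: p'' = -|p'|^2 p.
    p, v = y[:3], y[3:]
    return np.concatenate([v, -v.dot(v) * p])

def endpoint(p0, v0, steps=200):
    # Classical RK4 over t in [0, 1]; returns gamma(1).
    y, h = np.concatenate([p0, v0]), 1.0 / steps
    for _ in range(steps):
        k1 = geodesic_rhs(y)
        k2 = geodesic_rhs(y + 0.5 * h * k1)
        k3 = geodesic_rhs(y + 0.5 * h * k2)
        k4 = geodesic_rhs(y + h * k3)
        y = y + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    return y[:3]

def shoot(p0, p1, iters=20, tol=1e-12):
    # Orthonormal basis (a, b) of the tangent plane at p0.
    ref = np.array([0., 0., 1.]) if abs(p0[2]) < 0.9 else np.array([0., 1., 0.])
    a = np.cross(p0, ref)
    a = a / np.linalg.norm(a)
    b = np.cross(p0, a)

    def residual(c):
        e = endpoint(p0, c[0] * a + c[1] * b) - p1
        return e - p1 * p1.dot(e)  # tangential part of the endpoint error at p1

    c = np.array([a.dot(p1 - p0), b.dot(p1 - p0)])  # chord-based initial guess
    eps = 1e-6
    for _ in range(iters):
        r = residual(c)
        if np.linalg.norm(r) < tol:
            break
        # Finite-difference 3x2 Jacobian, Gauss-Newton step via least squares.
        J = np.column_stack([(residual(c + eps * np.eye(2)[j]) - r) / eps
                             for j in range(2)])
        c = c - np.linalg.lstsq(J, r, rcond=None)[0]
    return c[0] * a + c[1] * b

x0 = np.array([1.0, 0.0, 0.0])
x1 = np.array([0.0, 1.0, 0.0])
v0 = shoot(x0, x1)  # initial velocity whose norm is the arc length (pi/2 here)
```

The same structure carries over to any manifold for which the geodesic ODE can be integrated; only `geodesic_rhs` and the tangent-space parametrization change.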

    Convergence analysis of leapfrog for geodesics

    Geodesics are of fundamental interest in mathematics, physics, computer science, and many other subjects. The so-called leapfrog algorithm was proposed in [L. Noakes, J. Aust. Math. Soc., 65 (1998), pp. 37-50] (but not named there as such) to find geodesics joining two given points x0 and x1 on a path-connected complete Riemannian manifold. The basic idea is to choose some junctions between x0 and x1 that can be joined by geodesics locally, and then to adjust these junctions. It was proved that the sequence of piecewise geodesics {γk}k≥1 generated by this algorithm converges to a geodesic joining x0 and x1. The present paper investigates how leapfrog's convergence rate at the i-th junction depends on the manifold M. A relationship is found with the maximal root of a polynomial of degree n−3, where n (n > 3) is the number of geodesic segments: the minimal convergence rate over the junctions is bounded above by (1 + c+) times this maximal root, where c+ is a sufficiently small positive constant depending on the curvature of the manifold M. Moreover, we show that the maximal root increases as n increases. These results are illustrated by implementing leapfrog on two Riemannian manifolds: the unit 2-sphere and the manifold of all 2 × 2 symmetric positive definite matrices.
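The junction-adjustment idea is easy to prototype on the unit 2-sphere, where the geodesic midpoint of two non-antipodal unit vectors u and w is normalize(u + w). The sketch below (illustrative only, not the paper's implementation; junction count and sweep count are arbitrary) seeds junctions off the great circle and repeatedly replaces each one by the geodesic midpoint of its neighbours, so the piecewise-geodesic path flattens onto the great circle through x0 and x1.

```python
import numpy as np

def normalize(p):
    return p / np.linalg.norm(p)

def midpoint(u, w):
    # Great-circle midpoint of two non-antipodal unit vectors.
    return normalize(u + w)

def leapfrog(x0, x1, n_junctions=4, sweeps=300):
    # Seed junctions along the chord, pushed off the great circle on purpose.
    pts = [x0]
    for i in range(1, n_junctions + 1):
        t = i / (n_junctions + 1)
        seed = (1 - t) * x0 + t * x1 + np.array([0.0, 0.0, 0.3 * np.sin(np.pi * t)])
        pts.append(normalize(seed))
    pts.append(x1)
    # One sweep: replace each junction by the midpoint of the geodesic
    # joining its two neighbours (a Gauss-Seidel-style update).
    for _ in range(sweeps):
        for i in range(1, n_junctions + 1):
            pts[i] = midpoint(pts[i - 1], pts[i + 1])
    return pts

x0 = np.array([1.0, 0.0, 0.0])
x1 = np.array([0.0, 1.0, 0.0])
path = leapfrog(x0, x1)  # junctions converge onto the great circle z = 0
```

In the limit the junctions satisfy pts[i] = midpoint(pts[i-1], pts[i+1]), i.e. they are equally spaced along a single geodesic, which is the fixed point the convergence analysis studies.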

    Multi-objective variational curves

    Riemannian cubics in tension are critical points of a linear combination of two objective functionals, namely the squared norms of the velocity and acceleration of a curve on a Riemannian manifold. We view this variational problem of finding a curve as a multi-objective optimization problem and construct the Pareto fronts for some given instances where the manifold is a sphere and where the manifold is a torus. The Pareto front for the curves on the torus turns out to be particularly interesting: the front is disconnected, and it reveals two distinct Riemannian cubics with the same boundary data, which is the first known nontrivial instance of this kind. We also discuss some convexity conditions involving the Pareto fronts for curves on general Riemannian manifolds.
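Minimizing a weighted combination of two objectives, as above, is the classical scalarization route to a Pareto front: sweep the weight, minimize the scalarized objective, and collect the resulting objective pairs. A toy sketch with two quadratic scalar costs (purely illustrative; the paper's objectives are curve energies on a manifold, not functions of a scalar):

```python
import numpy as np

# Two competing scalar objectives (stand-ins for the squared norms of
# velocity and acceleration in the variational problem).
f1 = lambda x: x ** 2
f2 = lambda x: (x - 1.0) ** 2

xs = np.linspace(-0.5, 1.5, 2001)  # shared search grid

def scalarized_minimizer(w):
    # Minimize the weighted sum w*f1 + (1-w)*f2 by brute-force grid search.
    return xs[np.argmin(w * f1(xs) + (1.0 - w) * f2(xs))]

weights = np.linspace(0.0, 1.0, 21)
front = [(float(f1(scalarized_minimizer(w))), float(f2(scalarized_minimizer(w))))
         for w in weights]
# Sweeping w from 0 to 1 trades f2 against f1; the collected pairs are
# mutually non-dominated, i.e. they sample the Pareto front.
```

For non-convex problems (such as curves on a torus) weighted-sum scalarization can miss parts of the front, which is one reason disconnected fronts like the one reported above are interesting.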

    Unsupervised Learning for Robust Fitting:A Reinforcement Learning Approach

    Robust model fitting is a core algorithm in a large number of computer vision applications. Solving this problem efficiently for datasets highly contaminated with outliers, however, remains challenging due to the underlying computational complexity. Recent literature has focused on learning-based algorithms, but most approaches are supervised and require a large amount of labelled training data. In this paper, we introduce a novel unsupervised learning framework that learns to directly solve robust model fitting. Unlike other methods, our work is agnostic to the underlying input features, and can be easily generalized to a wide variety of LP-type problems with quasi-convex residuals. We empirically show that our method outperforms existing unsupervised learning approaches, and achieves competitive results compared to traditional methods on several important computer vision problems. Comment: preprint of a paper accepted to CVPR 202
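For context, a standard non-learning baseline in this area is RANSAC-style consensus maximization. The sketch below is a generic illustration, not the paper's method; the synthetic data, vertical residuals, and thresholds are all assumptions made for the example. It fits a 2D line to data heavily contaminated with outliers by repeatedly scoring models drawn from minimal samples.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic instance: 80 inliers near y = 2x + 1 plus 40 gross outliers.
x_in = rng.uniform(-5, 5, 80)
inliers = np.column_stack([x_in, 2.0 * x_in + 1.0 + rng.normal(0.0, 0.05, 80)])
outliers = rng.uniform(-20, 20, (40, 2))
data = np.vstack([inliers, outliers])

def ransac_line(data, iters=500, thresh=0.2):
    # Consensus maximization: repeatedly fit a line to a random 2-point
    # sample and keep the model with the largest inlier count.
    best_count, best_model = 0, None
    for _ in range(iters):
        i, j = rng.choice(len(data), size=2, replace=False)
        p, q = data[i], data[j]
        if abs(q[0] - p[0]) < 1e-9:
            continue  # this sketch skips (near-)vertical sample pairs
        m = (q[1] - p[1]) / (q[0] - p[0])
        c = p[1] - m * p[0]
        # Vertical residuals; a fuller version would use point-line distance.
        resid = np.abs(data[:, 1] - (m * data[:, 0] + c))
        count = int((resid < thresh).sum())
        if count > best_count:
            best_count, best_model = count, (m, c)
    return best_model, best_count

(m, c), n_inliers = ransac_line(data)
```

The randomized search over minimal samples is exactly the computational burden that learning-based approaches aim to reduce.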

    Finding extremals of Lagrangian actions

    Given a smooth m-manifold M, a smooth Lagrangian L : TM → R and endpoints x0, xT ∈ M, we look for an extremal x : [0, T] → M of the action ∫_0^T L(x(t), ẋ(t)) dt satisfying x(0) = x0 and x(T) = xT. When interpolating between endpoints, this amounts to a 2-point boundary value problem for the Euler–Lagrange equation. Single or multiple shooting is one of the most popular methods for solving boundary value problems, but the efficiency of shooting and the quality of solutions depend heavily on initial guesses. In the present paper, by dividing the interval [0, T] into several sub-intervals, on which extremals can be found efficiently by shooting when good initial guesses are available from the geometry of a variational problem, we then adjust all junctions by finding zeros of vector fields associated with the velocities at junctions, using Newton's method. We discuss the cases where L is the difference between kinetic energy and potential, M is a hypersurface in Euclidean space, or M is a Lie group. We make some comparisons in numerical experiments for a double pendulum, for obstacle avoidance by a moving particle on the 2-sphere, and for obstacle avoidance by a planar rigid body.
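The simplest instance of this setup is a single pendulum: with L = (1/2)ẋ² + cos x (kinetic energy minus potential, unit constants), the Euler–Lagrange equation is ẍ = −sin x, and the 2-point boundary value problem can be solved by single shooting with a finite-difference Newton update of the initial velocity. A minimal illustrative sketch, not the paper's code; the chord-based initial guess and step counts are assumptions:

```python
import numpy as np

def pendulum_endpoint(x0, v0, T=1.0, steps=400):
    # RK4 for the Euler-Lagrange equation x'' = -sin(x), the extremal
    # equation of L = (1/2) x'^2 + cos(x) (kinetic minus potential).
    x, v = float(x0), float(v0)
    h = T / steps
    f = lambda x, v: (v, -np.sin(x))
    for _ in range(steps):
        k1 = f(x, v)
        k2 = f(x + 0.5 * h * k1[0], v + 0.5 * h * k1[1])
        k3 = f(x + 0.5 * h * k2[0], v + 0.5 * h * k2[1])
        k4 = f(x + h * k3[0], v + h * k3[1])
        x += (h / 6.0) * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0])
        v += (h / 6.0) * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1])
    return x

def shoot(x0, xT, T=1.0, iters=25, tol=1e-12):
    # Single shooting: Newton iteration on the initial velocity, with a
    # finite-difference derivative and a chord-based initial guess.
    v = (xT - x0) / T
    for _ in range(iters):
        r = pendulum_endpoint(x0, v, T) - xT
        if abs(r) < tol:
            break
        eps = 1e-7
        drdv = (pendulum_endpoint(x0, v + eps, T) - (r + xT)) / eps
        v -= r / drdv
    return v

v0 = shoot(0.0, 1.0)  # initial angular velocity hitting x(1) = 1
```

Multiple shooting replaces the single unknown v with one unknown state per sub-interval, which is what makes junction adjustment by Newton's method the central step on longer time horizons.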

    A Statistical Cohomogeneity One Metric on the Upper Plane with Constant Negative Curvature

    We analyze the geometrical structures of the statistical manifold S consisting of all wrapped Cauchy distributions. We prove that S is a simply connected manifold with constant negative curvature −2. However, it is not isometric to the hyperbolic space, because S is noncomplete. In fact, S is proved to be a cohomogeneity one manifold. Finally, we use several tricks to obtain the geodesics and explore their divergence behavior by investigating the Jacobi vector field.