
    Parametric Regression on the Grassmannian

    We address the problem of fitting parametric curves on the Grassmann manifold for the purpose of intrinsic parametric regression. As customary in the literature, we start from the energy minimization formulation of linear least-squares in Euclidean spaces and generalize this concept to general nonflat Riemannian manifolds, following an optimal-control point of view. We then specialize this idea to the Grassmann manifold and demonstrate that it yields a simple, extensible and easy-to-implement solution to the parametric regression problem. In fact, it allows us to extend the basic geodesic model to (1) a time-warped variant and (2) cubic splines. We demonstrate the utility of the proposed solution on different vision problems, such as shape regression as a function of age, traffic-speed estimation and crowd-counting from surveillance video clips. Most notably, these problems can be conveniently solved within the same framework without any specifically-tailored steps along the processing pipeline. Comment: 14 pages, 11 figures.
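    The geodesic model at the core of this abstract can be sketched numerically. Below is a minimal NumPy illustration, not the authors' code: the standard exponential and logarithm maps on the Grassmannian (subspaces represented as n x p matrices with orthonormal columns), plus the least-squares energy of the basic geodesic regression model; the function names and the energy helper are assumptions for illustration.

    ```python
    import numpy as np

    def grassmann_exp(Y, H):
        """Exponential map on the Grassmannian: starting subspace spanned by
        the columns of Y (n x p, orthonormal), tangent direction H (Y.T@H=0)."""
        U, s, Vt = np.linalg.svd(H, full_matrices=False)
        return Y @ Vt.T @ np.diag(np.cos(s)) @ Vt + U @ np.diag(np.sin(s)) @ Vt

    def grassmann_log(Y, Z):
        """Inverse of the exponential map: the tangent vector at Y whose
        geodesic reaches the subspace spanned by Z at time 1."""
        YtZ = Y.T @ Z
        A = (Z - Y @ YtZ) @ np.linalg.inv(YtZ)
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        return U @ np.diag(np.arctan(s)) @ Vt

    def geodesic_energy(Y0, H, times, data):
        """Least-squares energy of the geodesic model t -> Exp_{Y0}(t*H):
        sum of squared geodesic distances to the observed subspaces."""
        return sum(np.linalg.norm(grassmann_log(grassmann_exp(Y0, t * H), Z)) ** 2
                   for t, Z in zip(times, data))
    ```

    Minimizing this energy over the base point Y0 and the tangent direction H gives the intrinsic analogue of linear least-squares; the time-warped and cubic-spline extensions mentioned in the abstract modify the curve model but reuse the same building blocks.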

    Barycentres and Hurricane Trajectories

    The use of barycentres in data analysis is illustrated, using as an example a dataset of hurricane trajectories. Comment: 19 pages, 7 figures. Contribution to the Mardia festschrift "Geometry Driven Statistics". Version 2: added further reference to the HURDAT2 data format. Version 3: various minor corrections, and added dedication to Mardia.
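    The simplest barycentre of curves can be sketched directly: reparametrize each trajectory to a common number of points by cumulative chord length, then average pointwise. This is a minimal Euclidean sketch, an assumption for illustration rather than the paper's (more general) construction, and it ignores the spherical geometry of lon/lat coordinates.

    ```python
    import numpy as np

    def resample(track, k):
        """Resample a polyline (m x 2 array of lon/lat points) at k points
        equally spaced in cumulative chord length."""
        seg = np.linalg.norm(np.diff(track, axis=0), axis=1)
        s = np.concatenate([[0.0], np.cumsum(seg)])
        t = np.linspace(0.0, s[-1], k)
        return np.column_stack([np.interp(t, s, track[:, d]) for d in range(2)])

    def barycentre(tracks, k=50):
        """Pointwise Euclidean barycentre of the reparametrized trajectories."""
        return np.mean([resample(tr, k) for tr in tracks], axis=0)
    ```

    For example, the barycentre of two parallel straight tracks is the straight track midway between them; for real hurricane data (e.g. the HURDAT2 records mentioned above) one would additionally account for differing track lengths and the curvature of the Earth.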

    Dissipative numerical schemes on Riemannian manifolds with applications to gradient flows

    This paper concerns an extension of discrete gradient methods to finite-dimensional Riemannian manifolds, termed discrete Riemannian gradients, and their application to dissipative ordinary differential equations. This includes Riemannian gradient flow systems, which occur naturally in optimization problems. The Itoh--Abe discrete gradient is formulated and applied to gradient systems, yielding a derivative-free optimization algorithm. The algorithm is tested on two eigenvalue problems and two problems from manifold-valued imaging: InSAR denoising and DTI denoising. Comment: Post-revision version. To appear in SIAM Journal on Scientific Computing.
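    The Itoh--Abe discrete gradient replaces the true gradient by coordinate-wise difference quotients, which is why the resulting scheme needs no derivatives. Below is a minimal Euclidean sketch (the paper works on Riemannian manifolds); the fixed-point inner solver, the step size, and the function names are illustrative assumptions, not the paper's algorithm verbatim.

    ```python
    import numpy as np

    def itoh_abe_step(f, x, tau=0.5, inner=30):
        """One step of the Euclidean Itoh--Abe discrete-gradient descent:
        each coordinate i is changed by a value d solving the scalar equation
            d = -tau * (f(x + d*e_i) - f(x)) / d,
        approximated here by a simple fixed-point iteration.  Only function
        values of f are used, so the scheme is derivative-free, and by
        construction each coordinate update cannot increase f."""
        x = x.copy()
        for i in range(len(x)):
            base = f(x)
            d = -1e-3                      # small nonzero initial guess
            for _ in range(inner):
                trial = x.copy()
                trial[i] += d
                d = -tau * (f(trial) - base) / d
            x[i] += d
        return x
    ```

    On a smooth convex test function, iterating this step decreases the objective monotonically, which is the dissipativity property the discrete gradient is designed to preserve.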

    A Second Order Non-Smooth Variational Model for Restoring Manifold-Valued Images

    We introduce a new non-smooth variational model for the restoration of manifold-valued data which includes second order differences in the regularization term. While such models were successfully applied for real-valued images, we introduce the second order difference and the corresponding variational models for manifold data, which up to now only existed for cyclic data. The approach requires a combination of techniques from numerical analysis, convex optimization and differential geometry. First, we establish a suitable definition of absolute second order differences for signals and images with values in a manifold. Employing this definition, we introduce a variational denoising model based on first and second order differences in the manifold setup. In order to minimize the corresponding functional, we develop an algorithm using an inexact cyclic proximal point algorithm. We propose an efficient strategy for the computation of the corresponding proximal mappings in symmetric spaces utilizing the machinery of Jacobi fields. For the n-sphere and the manifold of symmetric positive definite matrices, we demonstrate the performance of our algorithm in practice. We prove the convergence of the proposed exact and inexact variants of the cyclic proximal point algorithm in Hadamard spaces. These results, which are of interest in their own right, cover, e.g., the manifold of symmetric positive definite matrices.
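    The cyclic proximal point idea is easiest to see for real-valued signals, where the proximal maps of the data term and of each first-order absolute difference have closed forms; the paper's manifold version replaces straight-line averaging by geodesics and handles second-order terms via Jacobi fields. A scalar first-order sketch (the step-size schedule and parameters are illustrative assumptions):

    ```python
    import numpy as np

    def cppa_tv_denoise(f, alpha=0.1, iters=400):
        """Cyclic proximal point algorithm for the scalar model
            (1/2) * sum_i (x_i - f_i)^2  +  alpha * sum_i |x_{i+1} - x_i|.
        Each cycle applies the closed-form proximal map of the data term and
        of every first-order difference, with step sizes tau_k ~ 1/k."""
        x = f.astype(float).copy()
        for k in range(1, iters + 1):
            tau = 4.0 / k
            x = (x + tau * f) / (1.0 + tau)      # prox of the data term
            for i in range(len(x) - 1):          # prox of alpha*|x_{i+1} - x_i|
                move = min(alpha * tau, abs(x[i + 1] - x[i]) / 2.0)
                s = np.sign(x[i + 1] - x[i])
                x[i] += s * move                 # move the pair toward each other
                x[i + 1] -= s * move
        return x
    ```

    Each proximal map is trivial to evaluate, which is what makes cycling through them attractive; the manifold model in the abstract keeps this structure but computes the pairwise averaging step along geodesics.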

    Riemannian cubics on the group of diffeomorphisms and the Fisher-Rao metric

    We study a second-order variational problem on the group of diffeomorphisms of the interval [0, 1] endowed with a right-invariant Sobolev metric of order 2, which consists in minimizing the acceleration. We compute the relaxation of the problem, which involves the so-called Fisher-Rao functional, a convex functional on the space of measures. This relaxation enables the derivation of several optimality conditions and, in particular, a sufficient condition which guarantees that a given path of the initial problem is also a minimizer of the relaxed one. This sufficient condition is related to the existence of a solution to a Riccati equation involving the path acceleration. Comment: 34 pages, comments welcome.
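    "Minimizing the acceleration" refers to the standard Riemannian cubic energy; schematically (a generic form, not the paper's precise functional, with D/dt the covariant derivative of the chosen right-invariant metric g):

    ```latex
    % Second-order (cubic) energy of a curve \gamma in a Riemannian manifold (M, g);
    % here M is the diffeomorphism group Diff([0,1]) with a right-invariant
    % Sobolev metric of order 2:
    E(\gamma) \;=\; \frac{1}{2} \int_0^1 \Big\| \tfrac{D}{dt}\,\dot\gamma(t) \Big\|_{g}^{2} \, dt .
    ```

    Curves with E = 0 are exactly the geodesics, so minimizers of E interpolate data as smoothly as the geometry allows, in analogy with cubic splines in Euclidean space.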