    A variational model for data fitting on manifolds by minimizing the acceleration of a Bézier curve

    We derive a variational model to fit a composite Bézier curve to a set of data points on a Riemannian manifold. The resulting curve is obtained in such a way that its mean squared acceleration is minimal in addition to remaining close to the data points. We approximate the acceleration by discretizing the squared second-order derivative along the curve. We derive a closed-form, numerically stable and efficient algorithm to compute the gradient of a Bézier curve on manifolds with respect to its control points, expressed as a concatenation of so-called adjoint Jacobi fields. Several examples illustrate the capabilities and validity of this approach, both for interpolation and approximation. The examples also show that the approach outperforms previous work tackling this problem.
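
    To make the discretization concrete, here is a minimal sketch (ours, not the authors' implementation) on the unit sphere, where geodesic interpolation is available in closed form. It evaluates a Bézier curve by the geodesic de Casteljau scheme and approximates the mean squared acceleration by second differences of curve samples; the ambient second difference is only a crude stand-in for the manifold-valued discretization and adjoint-Jacobi-field gradient the paper actually uses.

```python
import numpy as np

def slerp(p, q, t):
    """Geodesic (great-circle) interpolation between unit vectors p and q."""
    theta = np.arccos(np.clip(np.dot(p, q), -1.0, 1.0))
    if theta < 1e-12:
        return p.copy()
    return (np.sin((1 - t) * theta) * p + np.sin(t * theta) * q) / np.sin(theta)

def bezier_sphere(control_points, t):
    """Evaluate a Bezier curve on the sphere: the de Casteljau scheme with
    geodesic segments replacing straight lines."""
    pts = [np.asarray(p, dtype=float) for p in control_points]
    while len(pts) > 1:
        pts = [slerp(pts[i], pts[i + 1], t) for i in range(len(pts) - 1)]
    return pts[0]

def mean_squared_acceleration(control_points, n=100):
    """Crude objective: squared second differences of the sampled curve."""
    ts = np.linspace(0.0, 1.0, n)
    samples = np.array([bezier_sphere(control_points, t) for t in ts])
    h = ts[1] - ts[0]
    acc = (samples[2:] - 2 * samples[1:-1] + samples[:-2]) / h**2
    return float(np.mean(np.sum(acc**2, axis=1)))
```

    Minimizing such an objective over the control points, with a penalty keeping the curve near the data, gives the interpolation/approximation trade-off described above.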

    Warped Riemannian metrics for location-scale models

    The present paper shows that warped Riemannian metrics, a class of Riemannian metrics which play a prominent role in Riemannian geometry, are also of fundamental importance in information geometry. Precisely, the paper features a new theorem, which states that the Rao-Fisher information metric of any location-scale model, defined on a Riemannian manifold, is a warped Riemannian metric, whenever this model is invariant under the action of some Lie group. This theorem is a valuable tool in finding the expression of the Rao-Fisher information metric of location-scale models defined on high-dimensional Riemannian manifolds. Indeed, a warped Riemannian metric is fully determined by only two functions of a single variable, irrespective of the dimension of the underlying Riemannian manifold. Starting from this theorem, several original contributions are made. The expression of the Rao-Fisher information metric of the Riemannian Gaussian model is provided, for the first time in the literature. A generalised definition of the Mahalanobis distance is introduced, which is applicable to any location-scale model defined on a Riemannian manifold. The solution of the geodesic equation is obtained, for any Rao-Fisher information metric defined in terms of warped Riemannian metrics. Finally, using a mixture of analytical and numerical computations, it is shown that the parameter space of the von Mises-Fisher model of n-dimensional directional data, when equipped with its Rao-Fisher information metric, becomes a Hadamard manifold, a simply connected complete Riemannian manifold of negative sectional curvature, for n = 2, ..., 8. Hopefully, in upcoming work, this will be proved for any value of n. Comment: first version, before submission.
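
    The claim that a warped metric is determined by two scalar functions of one variable can be made concrete. In the sketch below (function names and the hyperbolic-plane example are our illustration, not the paper's notation), the metric ds^2 = alpha(r)^2 dr^2 + beta(r)^2 g_fiber is evaluated on tangent vectors split into radial and fiber components.

```python
import numpy as np

def warped_inner(r, u, v, alpha, beta, fiber_inner):
    """Inner product of tangent vectors u = (u_r, u_fiber), v = (v_r, v_fiber)
    under the warped metric ds^2 = alpha(r)^2 dr^2 + beta(r)^2 g_fiber."""
    (ur, uf), (vr, vf) = u, v
    return alpha(r)**2 * ur * vr + beta(r)**2 * fiber_inner(uf, vf)

# Example: the hyperbolic plane H^2 as a warped product,
# ds^2 = dr^2 + sinh(r)^2 dtheta^2 (alpha = 1, beta = sinh).
euclidean = lambda a, b: float(np.dot(a, b))
g_uv = warped_inner(1.0,
                    (0.5, np.array([1.0])),
                    (0.5, np.array([1.0])),
                    alpha=lambda r: 1.0,
                    beta=np.sinh,
                    fiber_inner=euclidean)
```

    Note how the dimension of the fiber enters only through fiber_inner: the two functions alpha and beta are all that remains to specify, whatever the dimension.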

    The geometry of nonlinear least squares with applications to sloppy models and optimization

    Parameter estimation by nonlinear least-squares minimization is a common problem with an elegant geometric interpretation: the possible parameter values of a model induce a manifold in the space of data predictions, and the minimization problem is to find the point on the manifold closest to the data. We show that the model manifolds of a large class of models, known as sloppy models, share many universal features; they are characterized by a geometric series of widths, extrinsic curvatures, and parameter-effects curvatures. A number of common difficulties in optimizing least-squares problems are due to this shared structure. First, algorithms tend to run into the boundaries of the model manifold, causing parameters to diverge or become unphysical. We introduce the model graph as an extension of the model manifold to remedy this problem, and argue that appropriate priors can remove the boundaries and improve convergence rates. We show that typical fits will have many evaporated parameters. Second, bare model parameters are usually ill-suited to describing model behavior; cost contours in parameter space tend to form hierarchies of plateaus and canyons. Geometrically, we understand this inconvenient parametrization as an extremely skewed coordinate basis and show that it induces a large parameter-effects curvature on the manifold. Using coordinates based on geodesic motion, these narrow canyons are transformed, in many cases, into a single quadratic, isotropic basin. We interpret the modified Gauss-Newton and Levenberg-Marquardt fitting algorithms as Euler approximations to geodesic motion in these natural coordinates on the model manifold and the model graph, respectively. By adding a geodesic acceleration adjustment to these algorithms, we alleviate the difficulties caused by parameter-effects curvature, improving both efficiency and success rates at finding good fits. Comment: 40 pages, 29 figures.
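
    The geodesic acceleration adjustment admits a compact numerical sketch. The version below is our simplification (function names are ours, and the damping update and step-acceptance test are omitted): it augments the usual damped Gauss-Newton/Levenberg-Marquardt step with a second-order correction estimated by finite differences.

```python
import numpy as np

def fd_second_dir_deriv(res_fn, x, v, h=1e-4):
    """Finite-difference second directional derivative of the residuals along v."""
    return (res_fn(x + h * v) - 2.0 * res_fn(x) + res_fn(x - h * v)) / h**2

def lm_geodesic_step(res_fn, jac_fn, x, mu):
    """One damped least-squares step plus a geodesic acceleration correction."""
    r, J = res_fn(x), jac_fn(x)
    A = J.T @ J + mu * np.eye(J.shape[1])
    v = np.linalg.solve(A, -J.T @ r)          # "velocity": the standard LM step
    rpp = fd_second_dir_deriv(res_fn, x, v)   # curvature of the residuals along v
    a = np.linalg.solve(A, -J.T @ rpp)        # "acceleration" correction
    return x + v + 0.5 * a
```

    The step x + v + a/2 is the second-order Taylor expansion of a geodesic on the model manifold, which is why the correction helps precisely when parameter-effects curvature is large.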

    Efficient Rank Reduction of Correlation Matrices

    Geometric optimisation algorithms are developed that efficiently find the nearest low-rank correlation matrix. We show, in numerical tests, that our methods compare favourably to the existing methods in the literature. The connection with the Lagrange multiplier method is established, along with an identification of whether a local minimum is a global minimum. An additional benefit of the geometric approach is that any weighted norm can be applied. The problem of finding the nearest low-rank correlation matrix occurs as part of the calibration of multi-factor interest rate market models to correlation. Comment: First version: 20 pages, 4 figures. Second version [changed content]: 21 pages, 6 figures.
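
    For orientation, here is a simple baseline for the same problem (not the authors' geometric algorithm; names and step sizes are our illustration). It parametrizes a rank-k correlation matrix as C = B B^T with unit-norm rows of B, so the unit-diagonal constraint holds by construction, and runs projected gradient descent on the squared Frobenius distance.

```python
import numpy as np

def nearest_lowrank_corr(C_target, k, steps=500, lr=1e-2, seed=0):
    """Rank-k correlation matrix near C_target via projected gradient descent.
    C = B @ B.T with unit-norm rows of B, so diag(C) = 1 automatically."""
    rng = np.random.default_rng(seed)
    n = C_target.shape[0]
    B = rng.standard_normal((n, k))
    B /= np.linalg.norm(B, axis=1, keepdims=True)
    for _ in range(steps):
        grad = 4.0 * (B @ B.T - C_target) @ B          # d/dB of ||B B^T - C||_F^2
        B -= lr * grad
        B /= np.linalg.norm(B, axis=1, keepdims=True)  # project rows back to the sphere
    return B @ B.T
```

    The row-normalization step is exactly the constraint set (each row of B on a unit sphere) on which the geometric methods of the paper operate; replacing the squared Frobenius norm with a weighted norm only changes the grad line.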

    On the Shape of Things: From holography to elastica

    We explore the question of which shape a manifold is compelled to take when immersed in another one, provided it must be the extremum of some functional. We consider a family of functionals which depend quadratically on the extrinsic curvatures and on projections of the ambient curvatures. These functionals capture a number of physical setups ranging from holography to the study of membranes and elastica. We present a detailed derivation of the equations of motion, known as the shape equations, placing particular emphasis on the issue of gauge freedom in the choice of normal frame. We apply these equations to the particular case of holographic entanglement entropy for higher-curvature three-dimensional gravity and find new classes of entangling curves. In particular, we discuss the case of New Massive Gravity, where we show that non-geodesic entangling curves always have a smaller on-shell value of the entropy functional. We then apply this formalism to the computation of the entanglement entropy for dual logarithmic CFTs; nevertheless, the correct value for the entanglement entropy is provided by geodesics. Finally, we discuss the importance of these equations in the context of classical elastica and comment on terms that break gauge invariance. Comment: 54 pages, 8 figures. Significantly improved version, accepted for publication in Annals of Physics. New section on logarithmic CFTs. Detailed derivation of the shape equations added in appendix B. Typos corrected, clarifications added.
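
    The simplest member of this family of functionals is the classical planar elastica, which is enough to fix conventions for the shape equations; the standard statement below (one common sign convention, our choice) also makes plain why geodesics always solve them.

```latex
% Bending energy of a planar curve \gamma with curvature \kappa and a
% Lagrange multiplier \lambda enforcing fixed length:
E[\gamma] = \int_\gamma \left( \kappa^2 + \lambda \right) ds
% Euler--Lagrange ("shape") equation under normal variations:
2\,\ddot{\kappa} + \kappa^3 - \lambda\,\kappa = 0
% Geodesics (\kappa \equiv 0) solve it identically, consistent with the
% geodesic entangling curves discussed above.
```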

    Representation Learning via Manifold Flattening and Reconstruction

    This work proposes an algorithm for explicitly constructing a pair of neural networks that linearize and reconstruct an embedded submanifold, from finite samples of this manifold. The resulting neural networks, called Flattening Networks (FlatNet), are theoretically interpretable, computationally feasible at scale, and generalize well to test data, a balance not typically found in manifold-based learning methods. We present empirical results and comparisons to other models on synthetic high-dimensional manifold data and 2D image data. Our code is publicly available. Comment: 44 pages, 19 figures.
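
    FlatNet itself constructs nonlinear networks, but the flatten/reconstruct interface can be illustrated with a deliberately linear, single-chart stand-in (PCA); everything below is our toy example, not the paper's construction.

```python
import numpy as np

def flatten_and_reconstruct(X, d):
    """Linear flatten/reconstruct pair via PCA: a single-chart toy analogue
    of the (nonlinear) FlatNet encoder/decoder interface."""
    mean = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    flatten = lambda x: (x - mean) @ Vt[:d].T      # coordinates in the flat chart
    reconstruct = lambda z: z @ Vt[:d] + mean      # map back to the ambient space
    return flatten, reconstruct

# Samples from a 1-D manifold (an arc) embedded in R^3.
t = np.linspace(0.0, np.pi / 4, 200)
X = np.c_[np.cos(t), np.sin(t), 0.1 * t]
f, g = flatten_and_reconstruct(X, d=1)
reconstruction_error = np.linalg.norm(X - g(f(X)), axis=1).mean()
```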