Riemannian Metric and Geometric Mean for Positive Semidefinite Matrices of Fixed Rank
This paper introduces a new metric and mean on the set of positive
semidefinite matrices of fixed rank. The proposed metric is derived from a
well-chosen Riemannian quotient geometry that generalizes the reductive
geometry of the positive cone and the associated natural metric. The resulting
Riemannian space has strong geometrical properties: it is geodesically
complete, and the metric is invariant with respect to all transformations that
preserve angles (orthogonal transformations, scalings, and pseudoinversion). A
meaningful approximation of the associated Riemannian distance is proposed
that can be computed efficiently via a simple SVD-based algorithm. The induced
mean preserves the rank, possesses the most desirable characteristics of a
geometric mean, and is easy to compute.
Comment: the present version is very close to the published one. It contains
some corrections with respect to the previous arXiv submission.
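The SVD-based distance approximation admits a compact numerical sketch. The
Python snippet below is an illustration only: the factorization, the principal
angles, and the pairing of the subspace term with an affine-invariant term
between the aligned SPD factors are assumptions made here for exposition, not
the paper's exact formula.

    import numpy as np
    from scipy.linalg import eigh

    def factor_psd(A, k):
        # Factor A ~ U P U^T, with U having k orthonormal columns and P SPD.
        w, V = np.linalg.eigh(A)
        idx = np.argsort(w)[::-1][:k]          # keep the k largest eigenpairs
        return V[:, idx], np.diag(w[idx])

    def approx_distance(A, B, k):
        Ua, Pa = factor_psd(A, k)
        Ub, Pb = factor_psd(B, k)
        # Principal angles between the two range spaces, via one small SVD.
        Oa, s, ObT = np.linalg.svd(Ua.T @ Ub)
        theta = np.arccos(np.clip(s, 0.0, 1.0))
        # Align the SPD factors with the principal-angle bases.
        Pa_al = Oa.T @ Pa @ Oa
        Pb_al = ObT @ Pb @ ObT.T
        # Affine-invariant distance between the aligned SPD blocks: sum of
        # squared logs of the generalized eigenvalues of (Pb_al, Pa_al).
        lam = eigh(Pb_al, Pa_al, eigvals_only=True)
        return np.sqrt(np.sum(theta**2) + np.sum(np.log(lam)**2))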
Rank-preserving geometric means of positive semi-definite matrices
The generalization of the geometric mean of positive scalars to positive
definite matrices has attracted considerable attention since the seminal work
of Ando. The paper generalizes this framework of matrix means by proposing the
definition of a rank-preserving mean for two or an arbitrary number of positive
semi-definite matrices of fixed rank. The proposed mean is shown to be
geometric in that it satisfies all the expected properties of a rank-preserving
geometric mean. The work is motivated by operations on low-rank approximations
of positive definite matrices in high-dimensional spaces.
Comment: To appear in Linear Algebra and its Applications.
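One plausible construction of a rank-preserving two-matrix mean, consistent
with the abstract but not necessarily the paper's definition, combines the
Grassmann geodesic midpoint of the two range spaces with the classical
geometric mean of the aligned SPD factors. Here factor_psd is the helper from
the previous sketch, and the midpoint formula assumes well-separated principal
angles.

    import numpy as np
    from scipy.linalg import sqrtm, inv

    def spd_geometric_mean(P, Q):
        # Classical geometric mean P # Q = P^1/2 (P^-1/2 Q P^-1/2)^1/2 P^1/2.
        Ph = sqrtm(P)
        Phi = inv(Ph)
        return np.real(Ph @ sqrtm(Phi @ Q @ Phi) @ Ph)

    def rank_preserving_mean(A, B, k):
        Ua, Pa = factor_psd(A, k)
        Ub, Pb = factor_psd(B, k)
        Oa, s, ObT = np.linalg.svd(Ua.T @ Ub)
        theta = np.arccos(np.clip(s, 0.0, 1.0))
        Xa, Xb = Ua @ Oa, Ub @ ObT.T       # principal vectors of the ranges
        # Grassmann geodesic midpoint between the two range spaces.
        sin_t = np.where(theta > 1e-12, np.sin(theta), 1.0)
        W = (Xb - Xa * np.cos(theta)) / sin_t
        Um = Xa * np.cos(theta / 2) + W * np.sin(theta / 2)
        # Geometric mean of the aligned SPD factors, carried by the midpoint.
        Pm = spd_geometric_mean(Oa.T @ Pa @ Oa, ObT @ Pb @ ObT.T)
        return Um @ Pm @ Um.T              # rank-k PSD by construction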
Regression on fixed-rank positive semidefinite matrices: a Riemannian approach
The paper addresses the problem of learning a regression model parameterized
by a fixed-rank positive semidefinite matrix. The focus is on the nonlinear
nature of the search space and on scalability to high-dimensional problems. The
mathematical developments rely on the theory of gradient descent algorithms
adapted to the Riemannian geometry that underlies the set of fixed-rank
positive semidefinite matrices. In contrast with previous contributions in the
literature, no restrictions are imposed on the range space of the learned
matrix. The resulting algorithms maintain a linear complexity in the problem
size and enjoy important invariance properties. We apply the proposed
algorithms to the problem of learning a distance function parameterized by a
positive semidefinite matrix. Good performance is observed on classical
benchmarks.
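The abstract does not spell out the algorithm, but its central idea, a
gradient step on a factor G with A = G G^T so that each update costs O(nk),
can be sketched for the distance-learning application. The loss, targets, and
step size below are illustrative assumptions, not the paper's choices.

    import numpy as np

    def low_rank_metric_learning(pairs, labels, n, k, lr=0.01, epochs=50):
        # Fit d(x, z) = (x - z)^T G G^T (x - z) so that similar pairs
        # (label +1) get small distances and dissimilar pairs (label -1)
        # larger ones. Each update touches only G, at O(nk) cost per pair.
        rng = np.random.default_rng(0)
        G = rng.standard_normal((n, k)) / np.sqrt(n)
        for _ in range(epochs):
            for (x, z), label in zip(pairs, labels):
                d = x - z                  # difference vector, length n
                v = G.T @ d                # O(nk)
                target = 0.0 if label == 1 else 1.0
                err = v @ v - target       # squared-distance residual
                # Euclidean gradient of err^2 wrt G is 4 * err * outer(d, v).
                G -= lr * 4.0 * err * np.outer(d, v)
        return G                           # the learned matrix is G @ G.T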
The geometry of low-rank Kalman filters
An important property of the Kalman filter is that the underlying Riccati
flow is a contraction for the natural metric of the cone of symmetric positive
definite matrices. The present paper studies the geometry of a low-rank version
of the Kalman filter. The underlying Riccati flow evolves on the manifold of
fixed rank symmetric positive semidefinite matrices. Contraction properties of
the low-rank flow are studied by means of a suitable metric recently introduced
by the authors.
Comment: Final version published in Matrix Information Geometry, pp. 53-68,
Springer Verlag, 201
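To make the setting concrete, here is a naive low-rank Riccati step:
propagate the factored covariance through one predict/update cycle, then
truncate back to rank k by eigendecomposition. This sketches the manifold the
abstract refers to, not the authors' metric-based contraction analysis.

    import numpy as np

    def low_rank_riccati_step(U, D, F, Q, H, R, k):
        # Covariance is kept as P ~ U D U^T with U (n x k) and D (k x k).
        P = U @ D @ U.T
        P_pred = F @ P @ F.T + Q                       # prediction step
        S = H @ P_pred @ H.T + R                       # innovation covariance
        K = P_pred @ H.T @ np.linalg.inv(S)            # Kalman gain
        P_upd = (np.eye(P.shape[0]) - K @ H) @ P_pred  # measurement update
        # Symmetrize and truncate back to the rank-k manifold.
        w, V = np.linalg.eigh((P_upd + P_upd.T) / 2)
        idx = np.argsort(w)[::-1][:k]
        return V[:, idx], np.diag(np.maximum(w[idx], 0.0))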
Building Deep Networks on Grassmann Manifolds
Learning representations on Grassmann manifolds is popular in quite a few
visual recognition tasks. In order to enable deep learning on Grassmann
manifolds, this paper proposes a deep network architecture by generalizing the
Euclidean network paradigm to Grassmann manifolds. In particular, we design
full rank mapping layers to transform input Grassmannian data into more
desirable forms, exploit re-orthonormalization layers to normalize the
resulting matrices,
study projection pooling layers to reduce the model complexity in the
Grassmannian context, and devise projection mapping layers to respect
Grassmannian geometry and meanwhile achieve Euclidean forms for regular output
layers. To train the Grassmann networks, we exploit a stochastic gradient
descent setting on manifolds of the connection weights, and study a matrix
generalization of backpropagation to update the structured data. The
evaluations on three visual recognition tasks show that our Grassmann networks
have clear advantages over existing Grassmann learning methods, and achieve
results comparable with state-of-the-art approaches.
Comment: AAAI'18 paper.
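The three core layer types can be written down as simple forward passes; the
names follow the abstract, but the implementations below are simplified
assumptions, and the paper's learning of the connection weights and matrix
backpropagation are omitted.

    import numpy as np

    def frmap(X, W):
        # Full rank mapping layer: transform the Grassmannian input X
        # (n x q, orthonormal columns) by a learnable full-rank W (m x n).
        return W @ X

    def reorth(X):
        # Re-orthonormalization layer: QR retraction back to orthonormal
        # columns, with signs fixed so the map is well defined.
        Q, R = np.linalg.qr(X)
        signs = np.where(np.diag(R) < 0, -1.0, 1.0)
        return Q * signs

    def projmap(X):
        # Projection mapping layer: the projector X X^T gives a Euclidean
        # form that regular output layers can consume after flattening.
        return X @ X.T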
Manifold interpolation and model reduction
One approach to parametric and adaptive model reduction is via the
interpolation of orthogonal bases, subspaces or positive definite system
matrices. In all these cases, the sampled inputs stem from matrix sets that
feature a geometric structure and thus form so-called matrix manifolds. This
work will be featured as a chapter in the upcoming Handbook on Model Order
Reduction (P. Benner, S. Grivet-Talocia, A. Quarteroni, G. Rozza, W.H.A.
Schilders, L.M. Silveira, eds., to appear with De Gruyter) and reviews the
numerical treatment of the most important matrix manifolds that arise in the
context of model reduction. Moreover, the principal approaches to data
interpolation and Taylor-like extrapolation on matrix manifolds are outlined
and complemented by algorithms in pseudo-code.
Comment: 37 pages, 4 figures, featured chapter of the upcoming "Handbook on
Model Order Reduction"
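A staple of this numerical treatment is interpolation of subspaces on the
Grassmann manifold through its logarithm and exponential maps. The sketch
below follows the standard thin-SVD formulas; the function names are
illustrative, and robustness issues (e.g. a singular U0^T U1) are ignored.

    import numpy as np

    def grass_log(U0, U1):
        # Log map at span(U0) toward span(U1); both n x k, orthonormal columns.
        M = U0.T @ U1
        Delta = (U1 - U0 @ M) @ np.linalg.inv(M)   # horizontal displacement
        Q, S, VT = np.linalg.svd(Delta, full_matrices=False)
        return Q @ np.diag(np.arctan(S)) @ VT

    def grass_exp(U0, T, t=1.0):
        # Exp map: follow the geodesic from span(U0) in tangent direction T.
        Q, S, VT = np.linalg.svd(T, full_matrices=False)
        return (U0 @ VT.T @ np.diag(np.cos(t * S))
                + Q @ np.diag(np.sin(t * S))) @ VT

    # Interpolating two sampled bases U0, U1 at weight t in [0, 1]:
    #   Ut = grass_exp(U0, grass_log(U0, U1), t)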