The geometry of low-rank Kalman filters
An important property of the Kalman filter is that the underlying Riccati
flow is a contraction for the natural metric of the cone of symmetric positive
definite matrices. The present paper studies the geometry of a low-rank version
of the Kalman filter. The underlying Riccati flow evolves on the manifold of
fixed rank symmetric positive semidefinite matrices. Contraction properties of
the low-rank flow are studied by means of a suitable metric recently introduced
by the authors.
Comment: Final version published in Matrix Information Geometry, pp. 53-68, Springer Verlag, 201
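The contraction claim can be checked numerically in the full-rank case. Below is a minimal sketch, not taken from the paper: it iterates the discrete-time Kalman Riccati update from two different initial covariances and measures their distance in the natural (affine-invariant) metric of the positive definite cone, d(X, Y) = ||log(X^{-1/2} Y X^{-1/2})||_F. The system matrices are arbitrary illustrative choices, and the distance should shrink at every step.

```python
# Minimal sketch (not from the paper): the full-rank discrete-time Kalman
# Riccati iteration pulls two different initial covariances toward each other
# when distance is measured in the natural (affine-invariant) metric of the
# positive definite cone.  A, C, Q, R are arbitrary illustrative choices.
import numpy as np
from scipy.linalg import inv, logm, sqrtm

rng = np.random.default_rng(0)
n, m = 4, 2
A = 0.95 * np.eye(n) + 0.1 * rng.standard_normal((n, n))   # state transition
C = rng.standard_normal((m, n))                             # observation map
Q = np.eye(n)                                               # process noise covariance
R = np.eye(m)                                               # measurement noise covariance

def riccati_step(P):
    """One update of the predicted error covariance of the Kalman filter."""
    S = C @ P @ C.T + R
    return A @ P @ A.T - A @ P @ C.T @ inv(S) @ C @ P @ A.T + Q

def natural_distance(X, Y):
    """Affine-invariant distance d(X, Y) = ||log(X^{-1/2} Y X^{-1/2})||_F."""
    Xih = inv(np.real(sqrtm(X)))
    return np.linalg.norm(np.real(logm(Xih @ Y @ Xih)), "fro")

P1, P2 = 10.0 * np.eye(n), 0.1 * np.eye(n)
for k in range(6):
    print(f"step {k}: d(P1, P2) = {natural_distance(P1, P2):.4f}")
    P1, P2 = riccati_step(P1), riccati_step(P2)
```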
Stochastic gradient descent on Riemannian manifolds
Stochastic gradient descent is a simple approach to find the local minima of
a cost function whose evaluations are corrupted by noise. In this paper, we
develop a procedure extending stochastic gradient descent algorithms to the
case where the function is defined on a Riemannian manifold. We prove that, as
in the Euclidean case, the gradient descent algorithm converges to a critical
point of the cost function. The algorithm has numerous potential applications,
and is illustrated here by four examples. In particular a novel gossip
algorithm on the set of covariance matrices is derived and tested numerically.
Comment: A slightly shorter version has been published in IEEE Transactions on Automatic Control
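The scheme itself is simple: take a stochastic Euclidean gradient, project it onto the tangent space of the manifold at the current iterate, step, and map back to the manifold with a retraction. Below is a hedged sketch on the unit sphere, estimating the leading eigenvector of a covariance matrix from streaming samples (an Oja-type iteration); the data model, step-size schedule, and iteration count are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch of Riemannian SGD on the unit sphere: streaming estimation of
# the leading eigenvector of a covariance matrix (an Oja-type iteration).
# Each step projects the stochastic Euclidean gradient onto the tangent space
# at w and retracts back to the sphere by normalization.  The data model,
# step-size schedule, and iteration count are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
d = 20
u = rng.standard_normal(d)
u /= np.linalg.norm(u)                       # ground-truth dominant direction
L = np.linalg.cholesky(np.eye(d) + 9.0 * np.outer(u, u))

w = rng.standard_normal(d)
w /= np.linalg.norm(w)                       # random starting point on the sphere
for t in range(1, 20001):
    x = L @ rng.standard_normal(d)           # one streaming sample
    g = -2.0 * (x @ w) * x                   # stochastic gradient of f(w) = -E[(x.w)^2]
    g_tan = g - (g @ w) * w                  # projection onto the tangent space at w
    w = w - (1.0 / t) * g_tan                # Robbins-Monro step size
    w /= np.linalg.norm(w)                   # retraction back to the sphere

print("alignment |<w, u>| (should approach 1):", abs(w @ u))
```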
Rank-preserving geometric means of positive semi-definite matrices
The generalization of the geometric mean of positive scalars to positive
definite matrices has attracted considerable attention since the seminal work
of Ando. The paper generalizes this framework of matrix means by proposing the
definition of a rank-preserving mean for two or an arbitrary number of positive
semi-definite matrices of fixed rank. The proposed mean is shown to be
geometric in that it satisfies all the expected properties of a rank-preserving
geometric mean. The work is motivated by operations on low-rank approximations
of positive definite matrices in high-dimensional spaces.
Comment: To appear in Linear Algebra and its Applications
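For reference, the full-rank two-matrix case that this work generalizes has the closed form A # B = A^{1/2} (A^{-1/2} B A^{-1/2})^{1/2} A^{1/2}. The sketch below implements only this classical positive definite mean and checks two of its defining properties; the rank-preserving construction for fixed-rank positive semidefinite matrices is not reproduced here.

```python
# Classical geometric mean of two symmetric positive definite matrices,
#   A # B = A^{1/2} (A^{-1/2} B A^{-1/2})^{1/2} A^{1/2},
# used here only as the full-rank baseline that the paper generalizes.
import numpy as np
from scipy.linalg import inv, sqrtm

def geometric_mean(A, B):
    Ah = np.real(sqrtm(A))
    Aih = inv(Ah)
    return Ah @ np.real(sqrtm(Aih @ B @ Aih)) @ Ah

rng = np.random.default_rng(2)
X = rng.standard_normal((5, 5))
A = X @ X.T + np.eye(5)
Y = rng.standard_normal((5, 5))
B = Y @ Y.T + np.eye(5)

G = geometric_mean(A, B)
# Two defining properties: symmetry in the arguments and the determinant
# identity det(A # B) = sqrt(det(A) det(B)).
print(np.allclose(G, geometric_mean(B, A)))
print(np.isclose(np.linalg.det(G), np.sqrt(np.linalg.det(A) * np.linalg.det(B))))
```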
Regression on fixed-rank positive semidefinite matrices: a Riemannian approach
The paper addresses the problem of learning a regression model parameterized
by a fixed-rank positive semidefinite matrix. The focus is on the nonlinear
nature of the search space and on scalability to high-dimensional problems. The
mathematical developments rely on the theory of gradient descent algorithms
adapted to the Riemannian geometry that underlies the set of fixed-rank
positive semidefinite matrices. In contrast with previous contributions in the
literature, no restrictions are imposed on the range space of the learned
matrix. The resulting algorithms maintain a linear complexity in the problem
size and enjoy important invariance properties. We apply the proposed
algorithms to the problem of learning a distance function parameterized by a
positive semidefinite matrix. Good performance is observed on classical
benchmarks.
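One way to see where the linear complexity comes from is to store the fixed-rank matrix through a factorization W = G G^T with G of size n x r and to update the factor directly, so each gradient evaluation costs O(nr) per observation rather than O(n^2). The following sketch applies this factored update to a toy distance-learning regression with d(x, y) = (x - y)^T W (x - y); the loss, synthetic data, and step size are assumptions for illustration, and the paper's quotient-geometry machinery and invariance analysis are omitted.

```python
# Hedged sketch: regression on a fixed-rank PSD matrix through the low-rank
# factorization W = G G^T with G of size n x r, so each gradient evaluation
# costs O(N n r) instead of O(N n^2).  The task is to fit target values
# t_i = z_i^T W z_i for pair differences z_i = x_i - y_i; loss, data, and
# step size are illustrative assumptions, and the paper's quotient-geometry
# treatment is not reproduced.
import numpy as np

rng = np.random.default_rng(3)
n, r, N = 50, 3, 2000

G_true = rng.standard_normal((n, r)) / np.sqrt(n)           # rank-r ground truth
Z = rng.standard_normal((N, n))                              # pair differences z = x - y
t = np.einsum("ij,jk,ik->i", Z, G_true @ G_true.T, Z)        # target distances

G = rng.standard_normal((n, r)) / np.sqrt(n)                 # rank-r initialization
step = 2e-3
for it in range(1001):
    ZG = Z @ G                                               # N x r, costs O(N n r)
    err = np.einsum("ij,ij->i", ZG, ZG) - t                  # prediction minus target
    grad = (4.0 / N) * Z.T @ (err[:, None] * ZG)             # n x r factored gradient
    G -= step * grad
    if it % 200 == 0:
        print(f"iteration {it}: mean squared error = {np.mean(err ** 2):.4f}")
```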
A Separation Principle on Lie Groups
For linear time-invariant systems, a separation principle holds: a stable
observer and a stable state feedback can be designed independently, and the
combined observer-feedback loop remains stable. For non-linear
systems, a local separation principle holds around steady-states, as the
linearized system is time-invariant. This paper addresses the issue of a
non-linear separation principle on Lie groups. For invariant systems on Lie
groups, we prove there exists a large set of (time-varying) trajectories around
which the linearized observer-controller system is time-invariant, as soon as a
symmetry-preserving observer is used. Thus a separation principle holds around
those trajectories. The theory is illustrated by a mobile robot example, and
the developed ideas are then extended to a class of Lagrangian mechanical
systems on Lie groups described by Euler-Poincaré equations.
Comment: Submitted to IFAC 201
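The opening claim about linear time-invariant systems can be verified directly: with state feedback u = -K x_hat and a Luenberger observer of gain L, the closed-loop spectrum is exactly the union of the eigenvalues of A - BK and of A - LC, so the two gains can be designed separately. A small numerical check of this classical fact follows; the matrices and pole locations are arbitrary illustrative choices, and the Lie-group construction of the paper is not reproduced.

```python
# Numerical check of the classical LTI separation principle: with state
# feedback u = -K x_hat and a Luenberger observer of gain L, the closed-loop
# eigenvalues are exactly those of A - B K together with those of A - L C.
# Matrices and pole locations are arbitrary illustrative choices.
import numpy as np
from scipy.signal import place_poles

A = np.array([[0.0, 1.0], [-2.0, -0.5]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])

K = place_poles(A, B, [-2.0, -3.0]).gain_matrix            # state-feedback gain
L = place_poles(A.T, C.T, [-6.0, -7.0]).gain_matrix.T      # observer gain (dual problem)

# Closed loop in (x, e) coordinates, with e = x - x_hat the estimation error:
#   d/dt [x; e] = [[A - B K, B K], [0, A - L C]] [x; e]
closed_loop = np.block([[A - B @ K, B @ K],
                        [np.zeros((2, 2)), A - L @ C]])

print(np.sort_complex(np.linalg.eigvals(closed_loop)))
print(np.sort_complex(np.concatenate([np.linalg.eigvals(A - B @ K),
                                      np.linalg.eigvals(A - L @ C)])))
```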