37 research outputs found

    Minimum mean square distance estimation of a subspace

    We consider the problem of subspace estimation in a Bayesian setting. Since we are operating in the Grassmann manifold, the usual approach, which consists of minimizing the mean square error (MSE) between the true subspace $U$ and its estimate $\hat{U}$, may not be adequate, as the MSE is not the natural metric in the Grassmann manifold. As an alternative, we propose to carry out subspace estimation by minimizing the mean square distance (MSD) between $U$ and its estimate, where the considered distance is a natural metric in the Grassmann manifold, viz. the distance between the projection matrices. We show that the resulting estimator is no longer the posterior mean of $U$ but entails computing the principal eigenvectors of the posterior mean of $U U^{T}$. Derivation of the MMSD estimator is carried out in a few illustrative examples, including a linear Gaussian model for the data and a Bingham or von Mises-Fisher prior distribution for $U$. In all scenarios, posterior distributions are derived and the MMSD estimator is obtained either analytically or implemented via a Markov chain Monte Carlo simulation method. The method is shown to provide accurate estimates even when the number of samples is lower than the dimension of $U$. An application to hyperspectral imagery is finally investigated.
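
    In practice, the MMSD estimate can be formed from posterior draws of $U$ (e.g. MCMC output) by averaging the projection matrices and extracting principal eigenvectors. Below is a minimal sketch of that final step, assuming hypothetical posterior draws `U_samples` rather than any code from the paper:

```python
import numpy as np

def mmsd_estimate(U_samples):
    """Sketch of the MMSD subspace estimate from posterior draws.

    U_samples: array of shape (S, n, p); each slice is an n x p matrix with
    orthonormal columns drawn (e.g. by MCMC) from the posterior of U.
    Returns an n x p orthonormal basis of the estimated subspace.
    """
    S, n, p = U_samples.shape
    # Monte Carlo approximation of the posterior mean of U U^T
    mean_proj = np.zeros((n, n))
    for U in U_samples:
        mean_proj += U @ U.T
    mean_proj /= S
    # The MMSD estimate is spanned by the p principal eigenvectors of E[U U^T | data]
    eigvals, eigvecs = np.linalg.eigh(mean_proj)   # eigenvalues in ascending order
    return eigvecs[:, -p:]                         # eigenvectors of the p largest eigenvalues
```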

    A matrix-algebraic algorithm for the Riemannian logarithm on the Stiefel manifold under the canonical metric

    We derive a numerical algorithm for evaluating the Riemannian logarithm on the Stiefel manifold with respect to the canonical metric. In contrast to the existing optimization-based approach, we work from a purely matrix-algebraic perspective. Moreover, we prove that the algorithm converges locally and exhibits a linear rate of convergence. Comment: 30 pages, 5 figures, Matlab code
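
    For context, the Riemannian logarithm is the local inverse of the Stiefel exponential, which under the canonical metric has a well-known closed form due to Edelman, Arias and Smith. Below is a minimal sketch of that exponential (background material, not the paper's logarithm algorithm), assuming NumPy/SciPy:

```python
import numpy as np
from scipy.linalg import expm, qr

def stiefel_exp_canonical(X, Delta):
    """Closed-form Stiefel exponential under the canonical metric;
    the Riemannian logarithm is its local inverse.

    X: n x p matrix with orthonormal columns; Delta: tangent vector at X
    (i.e. X.T @ Delta is skew-symmetric).
    """
    n, p = X.shape
    A = X.T @ Delta                              # skew-symmetric p x p block
    # Thin QR factorisation of the normal component (I - X X^T) Delta
    Q, R = qr(Delta - X @ A, mode='economic')
    # Assemble and exponentiate the 2p x 2p block matrix
    block = np.block([[A, -R.T],
                      [R, np.zeros((p, p))]])
    M = expm(block)
    return X @ M[:p, :p] + Q @ M[p:, :p]
```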

    Parametric Regression on the Grassmannian

    We address the problem of fitting parametric curves on the Grassmann manifold for the purpose of intrinsic parametric regression. As customary in the literature, we start from the energy minimization formulation of linear least-squares in Euclidean spaces and generalize this concept to general nonflat Riemannian manifolds, following an optimal-control point of view. We then specialize this idea to the Grassmann manifold and demonstrate that it yields a simple, extensible and easy-to-implement solution to the parametric regression problem. In fact, it allows us to extend the basic geodesic model to (1) a time-warped variant and (2) cubic splines. We demonstrate the utility of the proposed solution on different vision problems, such as shape regression as a function of age, traffic-speed estimation and crowd-counting from surveillance video clips. Most notably, these problems can be conveniently solved within the same framework without any specifically-tailored steps along the processing pipeline. Comment: 14 pages, 11 figures
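
    The primitive underlying such geodesic regression models is the Grassmann geodesic between two subspaces. Below is a minimal sketch of that primitive using the standard principal-angle formulas (background material, assuming NumPy; not the paper's regression code):

```python
import numpy as np

def grassmann_geodesic(X, Y, t):
    """Point gamma(t) on the Grassmann geodesic from span(X) toward span(Y).

    X, Y: n x p matrices with orthonormal columns representing two subspaces
    (assumes X^T Y is invertible, i.e. no principal angle equals pi/2).
    gamma(0) spans X and gamma(1) spans Y; returns an orthonormal basis of gamma(t).
    """
    # Riemannian log of span(Y) at span(X): tangent H = U * arctan(S) * V^T
    M = X.T @ Y
    U, S, Vt = np.linalg.svd((Y - X @ M) @ np.linalg.inv(M), full_matrices=False)
    Theta = np.arctan(S)                         # principal angles between the subspaces
    # Grassmann exponential of t * H
    return (X @ Vt.T @ np.diag(np.cos(t * Theta)) @ Vt
            + U @ np.diag(np.sin(t * Theta)) @ Vt)
```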

    On the Whitney distortion extension problem for $C^m(\mathbb{R}^n)$ and $C^{\infty}(\mathbb{R}^n)$ and its applications to interpolation and alignment of data in $\mathbb{R}^n$

    Let $n,m\geq 1$ and let $U\subset\mathbb{R}^n$ be open. In this paper we provide a sharp solution to the following Whitney distortion extension problems: (a) Let $\phi:U\to\mathbb{R}^n$ be a $C^m$ map. If $E\subset U$ is compact (with some geometry) and the restriction of $\phi$ to $E$ is an almost isometry with small distortion, how does one decide whether there exists a $C^m(\mathbb{R}^n)$ one-to-one and onto almost isometry $\Phi:\mathbb{R}^n\to\mathbb{R}^n$ with small distortion which agrees with $\phi$ in a neighborhood of $E$ and with a Euclidean motion $A:\mathbb{R}^n\to\mathbb{R}^n$ away from $E$? (b) Let $\phi:U\to\mathbb{R}^n$ be a $C^{\infty}$ map. If $E\subset U$ is compact (with some geometry) and the restriction of $\phi$ to $E$ is an almost isometry with small distortion, how does one decide whether there exists a $C^{\infty}(\mathbb{R}^n)$ one-to-one and onto almost isometry $\Phi:\mathbb{R}^n\to\mathbb{R}^n$ with small distortion which agrees with $\phi$ in a neighborhood of $E$ and with a Euclidean motion $A:\mathbb{R}^n\to\mathbb{R}^n$ away from $E$? Our results complement those of [14,15,20], where $E$ is a finite set. In that case, the problem above is also a problem of interpolation and alignment of data in $\mathbb{R}^n$. Comment: This is part three of four papers with C. Fefferman (arXiv:1411.2451, arXiv:1411.2468, involve-v5-n2-p03-s.pdf) dealing with the problem of Whitney-type extensions of $\delta>0$ distortions from certain compact sets $E\subset\mathbb{R}^n$ to $\varepsilon>0$ distorted diffeomorphisms on $\mathbb{R}^n$.
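
    When $E$ is a finite point set, the small-distortion hypothesis and the Euclidean motion $A$ have simple numerical counterparts: the pairwise-distance distortion of $\phi$ on $E$ and the best rigid alignment obtained by orthogonal Procrustes (Kabsch). Below is a minimal illustrative sketch under those assumptions, unrelated to the paper's proof techniques:

```python
import numpy as np

def distortion_and_best_motion(E, phiE):
    """For a finite set E (k x n) and its image phiE = phi(E) (k x n), return:
    - the maximal relative pairwise-distance distortion of phi on E,
    - the Euclidean motion x -> R x + t (R a rotation) best aligning E to phiE
      in the least-squares sense (orthogonal Procrustes / Kabsch).
    """
    # Pairwise-distance distortion of phi restricted to E
    dE = np.linalg.norm(E[:, None, :] - E[None, :, :], axis=-1)
    dF = np.linalg.norm(phiE[:, None, :] - phiE[None, :, :], axis=-1)
    mask = dE > 0
    distortion = np.max(np.abs(dF[mask] / dE[mask] - 1.0))

    # Best rigid alignment (Kabsch): center the sets, SVD of the cross-covariance
    Ec, Fc = E - E.mean(0), phiE - phiE.mean(0)
    U, _, Vt = np.linalg.svd(Fc.T @ Ec)
    D = np.eye(E.shape[1])
    D[-1, -1] = np.sign(np.linalg.det(U @ Vt))   # keep a proper rotation
    R = U @ D @ Vt
    t = phiE.mean(0) - R @ E.mean(0)
    return distortion, R, t
```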