Tangent space estimation for smooth embeddings of Riemannian manifolds
Numerous dimensionality reduction problems in data analysis involve the
recovery of low-dimensional models or the learning of manifolds underlying sets
of data. Many manifold learning methods require the estimation of the tangent
space of the manifold at a point from locally available data samples. Local
sampling conditions such as (i) the size of the neighborhood (sampling width)
and (ii) the number of samples in the neighborhood (sampling density) affect
the performance of learning algorithms. In this work, we propose a theoretical
analysis of local sampling conditions for the estimation of the tangent space
at a point P lying on an m-dimensional Riemannian manifold S in R^n. Assuming a
smooth embedding of S in R^n, we estimate the tangent space T_P S by performing
a Principal Component Analysis (PCA) on points sampled from the neighborhood of
P on S. Our analysis explicitly takes into account the second order properties
of the manifold at P, namely the principal curvatures as well as the higher
order terms. We consider a random sampling framework and leverage recent
results from random matrix theory to derive conditions on the sampling width
and the local sampling density for an accurate estimation of tangent subspaces.
We measure the estimation accuracy by the angle between the estimated tangent
space and the true tangent space T_P S and we give conditions for this angle to
be bounded with high probability. In particular, we observe that the local
sampling conditions are highly dependent on the correlation between the
components in the second-order local approximation of the manifold. We finally
provide numerical simulations to validate our theoretical findings.
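As a rough illustration of the estimator analysed above (a minimal numpy sketch, not the paper's exact procedure; the paraboloid test surface, sampling width, and function names are invented for this example), the tangent space at P can be estimated as the span of the top-m principal directions of the local samples, and the accuracy measured by the principal angle to the true tangent space:

```python
import numpy as np

def estimate_tangent_space(samples, p, m):
    """Estimate the m-dimensional tangent space at p via PCA on a local
    neighborhood: top-m principal directions of the samples centered at p."""
    X = samples - p  # center the neighborhood at the base point
    # Right singular vectors of the centered data are the principal directions
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return Vt[:m]    # rows form an orthonormal basis of the estimated subspace

def principal_angle(U, V):
    """Largest principal angle (radians) between subspaces spanned by the
    orthonormal rows of U and V -- the accuracy measure used in the abstract."""
    s = np.linalg.svd(U @ V.T, compute_uv=False)  # cosines of principal angles
    return np.arccos(np.clip(s.min(), -1.0, 1.0))

# Toy example: a paraboloid, a 2-manifold in R^3 whose true tangent space
# at the origin is the xy-plane; small sampling width keeps curvature
# effects (the second-order terms) from tilting the PCA estimate.
rng = np.random.default_rng(0)
uv = rng.uniform(-0.05, 0.05, size=(500, 2))             # sampling width 0.05
pts = np.column_stack([uv, 0.5 * (uv**2).sum(axis=1)])   # z = (x^2 + y^2)/2
T_est = estimate_tangent_space(pts, np.zeros(3), m=2)
T_true = np.eye(3)[:2]                                   # xy-plane basis
angle = principal_angle(T_est, T_true)
```

Shrinking the sampling width reduces the curvature-induced bias, while a larger number of samples controls the variance, mirroring the two local sampling conditions the analysis trades off.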
A Framework for Generalising the Newton Method and Other Iterative Methods from Euclidean Space to Manifolds
The Newton iteration is a popular method for minimising a cost function on
Euclidean space. Various generalisations to cost functions defined on manifolds
appear in the literature. In each case, the convergence rate of the generalised
Newton iteration had to be established from first principles. The present paper
presents a framework for generalising iterative methods from Euclidean space to
manifolds that ensures local convergence rates are preserved. It applies to any
(memoryless) iterative method computing a coordinate independent property of a
function (such as a zero or a local minimum). All possible Newton methods on
manifolds are believed to come under this framework. Changes of coordinates,
and not any Riemannian structure, are shown to play a natural role in lifting
the Newton method to a manifold. The framework also gives new insight into the
design of Newton methods in general.
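The changes-of-coordinates viewpoint can be sketched concretely (an illustrative toy, not the paper's general framework; the circle, the angle chart, and finite-difference derivatives are choices made for this example): to minimise a cost on the unit circle, lift it through a local coordinate chart at the current iterate, take one Euclidean Newton step in the chart, and map back.

```python
import numpy as np

def newton_on_circle(f, theta0, iters=20):
    """Minimise f: S^1 -> R by lifting the Euclidean Newton step through a
    local chart: parametrise the circle near the iterate by t -> theta + t,
    take one Newton step for g(t) = f(theta + t) at t = 0, then re-center.
    No Riemannian metric is used -- only changes of coordinates."""
    h = 1e-5  # finite-difference step for the lifted cost's derivatives
    theta = theta0
    for _ in range(iters):
        g = lambda t: f(theta + t)
        g1 = (g(h) - g(-h)) / (2 * h)              # g'(0)
        g2 = (g(h) - 2 * g(0.0) + g(-h)) / h**2    # g''(0)
        theta = theta - g1 / g2                    # Euclidean Newton step in the chart
    return np.mod(theta, 2 * np.pi)

# Example: minimise f(x) = <x, (1, 0)> over the circle x = (cos t, sin t);
# the minimiser is the point (-1, 0), i.e. t = pi.
f = lambda th: np.cos(th)
theta_star = newton_on_circle(f, theta0=3.0)
```

Because the iteration is expressed entirely in charts, the quadratic local convergence of the Euclidean Newton step carries over unchanged, which is the point of the framework.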
A study of the classification of low-dimensional data with supervised manifold learning
Supervised manifold learning methods learn data representations by preserving
the geometric structure of data while enhancing the separation between data
samples from different classes. In this work, we propose a theoretical study of
supervised manifold learning for classification. We consider nonlinear
dimensionality reduction algorithms that yield linearly separable embeddings of
training data and present generalization bounds for algorithms of this type. A
necessary condition for satisfactory generalization performance is that the
embedding allow the construction of a sufficiently regular interpolation
function relative to the separation margin of the embedding. We show that
for supervised embeddings satisfying this condition, the classification error
decays at an exponential rate with the number of training samples. Finally, we
examine the separability of supervised nonlinear embeddings that aim to
preserve the low-dimensional geometric structure of data based on graph
representations. The proposed analysis is supported by experiments on several
real data sets.
Convergence analysis of Riemannian Gauss-Newton methods and its connection with the geometric condition number
We obtain estimates of the multiplicative constants appearing in local
convergence results of the Riemannian Gauss-Newton method for least squares
problems on manifolds and relate them to the geometric condition number of [P.
Bürgisser and F. Cucker, Condition: The Geometry of Numerical Algorithms,
2013].
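The Euclidean Gauss-Newton step that the Riemannian method generalises can be sketched as follows (a plain numpy illustration, not the paper's Riemannian algorithm; the exponential model and data are invented for this example): each iteration solves the linearised least-squares problem J dx = -r.

```python
import numpy as np

def gauss_newton(residual, jacobian, x0, iters=20):
    """Plain Euclidean Gauss-Newton for min ||r(x)||^2 -- the local model
    that the Riemannian version lifts to a manifold. Each step solves the
    linearised least-squares subproblem J(x) dx = -r(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        r = residual(x)
        J = jacobian(x)
        dx, *_ = np.linalg.lstsq(J, -r, rcond=None)  # linearised LS step
        x = x + dx
    return x

# Toy zero-residual problem: fit y = exp(a * t) to data generated with a = 0.5
t = np.linspace(0.0, 1.0, 20)
y = np.exp(0.5 * t)
residual = lambda a: np.exp(a[0] * t) - y
jacobian = lambda a: (t * np.exp(a[0] * t)).reshape(-1, 1)
a_hat = gauss_newton(residual, jacobian, x0=[0.0])
```

For zero-residual problems like this one the iteration converges quadratically near the solution; the constants governing that local rate are what the paper estimates and ties to the geometric condition number.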
Learning gradients on manifolds
A common belief in high-dimensional data analysis is that data are
concentrated on a low-dimensional manifold. This motivates simultaneous
dimension reduction and regression on manifolds. We provide an algorithm for
learning gradients on manifolds for dimension reduction for high-dimensional
data with few observations. We obtain generalization error bounds for the
gradient estimates and show that the convergence rate depends on the intrinsic
dimension of the manifold and not on the dimension of the ambient space. We
illustrate the efficacy of this approach empirically on simulated and real data
and compare the method to other dimension reduction procedures.
Published in Bernoulli (http://isi.cbs.nl/bernoulli/,
http://dx.doi.org/10.3150/09-BEJ206) by the International Statistical
Institute/Bernoulli Society (http://isi.cbs.nl/BS/bshome.htm).
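A minimal sketch of the gradient-learning idea (assuming a simple weighted local-linear gradient estimator rather than the paper's regularised kernel estimator; the toy regression function, bandwidth, and function names are invented for illustration): estimate the gradient of the regression function at each sample, then aggregate the gradients into an outer-product matrix whose top eigenvectors give the dimension-reduction directions.

```python
import numpy as np

def gradient_outer_product(X, y, bandwidth=0.35):
    """Estimate gradients of the regression function by Gaussian-weighted
    local linear fits, then aggregate them into the gradient outer-product
    matrix; its leading eigenvectors span the estimated dimension-reduction
    subspace. A sketch only -- the paper uses a regularised estimator."""
    n, d = X.shape
    G = np.zeros((d, d))
    for i in range(n):
        diff = X - X[i]                             # offsets to the base point
        w = np.exp(-np.sum(diff**2, axis=1) / (2 * bandwidth**2))
        A = np.column_stack([np.ones(n), diff])     # local linear design
        sw = np.sqrt(w)[:, None]
        coef, *_ = np.linalg.lstsq(sw * A, np.sqrt(w) * y, rcond=None)
        g = coef[1:]                                # estimated gradient at X[i]
        G += np.outer(g, g) / n
    evals, evecs = np.linalg.eigh(G)
    return evecs[:, ::-1]                           # columns, largest eigenvalue first

# Toy example: y depends on x only through its first coordinate, so the
# intrinsic dimension relevant for regression is 1 even though the ambient
# dimension is 3; the top direction should align with the first axis.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(400, 3))
y = X[:, 0] + 0.25 * X[:, 0] ** 2
directions = gradient_outer_product(X, y)
```

Because the gradient estimates live in the ambient space but concentrate along the directions in which the regression function actually varies, the rank of G reflects the intrinsic rather than the ambient dimension, matching the convergence behaviour described in the abstract.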