
    Least Square Approach to Out-of-Sample Extensions of Diffusion Maps

    Let X̄ = X βˆͺ Z be a data set in ℝD, where X is the training set and Z the test set. Assume that a kernel method produces a dimensionality reduction (DR) mapping Φ : X β†’ ℝd (d β‰ͺ D) that maps the high-dimensional data X to its low-dimensional representation Y = Φ(X). The out-of-sample extension problem for dimensionality reduction is to find the DR of Z using an extension of Φ, instead of re-training on the whole data set X̄. In this paper, utilizing the framework of reproducing kernel Hilbert space theory, we introduce a least-square approach to extensions of the popular DR mappings called diffusion maps (Dmaps). We establish a theoretical analysis for the out-of-sample DR by Dmaps. This analysis also provides a uniform treatment of many popular out-of-sample algorithms based on kernel methods. We illustrate the validity of the developed out-of-sample DR algorithms in several examples.
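    The abstract's setup can be illustrated with a minimal sketch of out-of-sample extension for diffusion maps. Note this sketch uses the classical Nyström-style extension of the eigenvectors, not the paper's least-square RKHS method; the function names, the Gaussian kernel choice, and the bandwidth parameter `eps` are assumptions for illustration only.

    ```python
    import numpy as np

    def gaussian_kernel(A, B, eps):
        # Pairwise Gaussian kernel between rows of A and rows of B.
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-d2 / eps)

    def diffusion_maps(X, eps, d):
        # Train on X: build the row-stochastic Markov matrix and take its
        # top d non-trivial eigenpairs as the low-dimensional coordinates.
        K = gaussian_kernel(X, X, eps)
        P = K / K.sum(axis=1, keepdims=True)
        vals, vecs = np.linalg.eig(P)
        idx = np.argsort(-vals.real)[1:d + 1]  # skip the trivial eigenvector
        return vals.real[idx], vecs.real[:, idx]

    def nystrom_extend(X, Z, eps, vals, vecs):
        # Extend the trained eigenvectors to new points Z without re-training:
        # normalize the cross-kernel rows and divide by the eigenvalues.
        Kz = gaussian_kernel(Z, X, eps)
        Pz = Kz / Kz.sum(axis=1, keepdims=True)
        return Pz @ vecs / vals
    ```

    Applying `nystrom_extend` to the training points themselves reproduces their trained embedding, since P v = λ v implies (P v) / λ = v; for genuinely new points Z it gives the out-of-sample coordinates at the cost of one kernel evaluation against X.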