    Generalized kernel framework for unsupervised spectral methods of dimensionality reduction

    This work introduces a generalized kernel perspective for spectral dimensionality reduction approaches. Firstly, an elegant matrix view of kernel principal component analysis (PCA) is described. We show the relationship between kernel PCA and conventional PCA using a parametric distance. Secondly, we introduce a weighted kernel PCA framework derived from least squares support vector machines (LS-SVM). This approach starts with a latent variable that allows us to write a relaxed LS-SVM problem, which is then addressed by a primal-dual formulation. As a result, we provide kernel alternatives to spectral methods for dimensionality reduction such as multidimensional scaling, locally linear embedding, and Laplacian eigenmaps, as well as a versatile framework to explain weighted PCA approaches. Experimentally, we show that incorporating an SVM model improves the performance of kernel PCA.
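
    As a rough illustration of the framework's starting point (not the authors' implementation), the sketch below assumes a precomputed kernel matrix K: it double-centers K, eigendecomposes it, and scales the leading eigenvectors to obtain the low-dimensional coordinates. With a linear kernel K = X X^T this reduces to classical PCA/metric MDS, which is the kind of kernel equivalence the abstract alludes to.

    import numpy as np

    def center_kernel(K):
        # Double-center the kernel (Gram) matrix so the implicitly mapped data has zero mean.
        n = K.shape[0]
        one = np.full((n, n), 1.0 / n)
        return K - one @ K - K @ one + one @ K @ one

    def kernel_pca(K, n_components=2):
        # Embed points given a kernel matrix K by eigendecomposition of the centered kernel;
        # returns the coordinates along the n_components leading eigenvectors.
        Kc = center_kernel(K)
        eigvals, eigvecs = np.linalg.eigh(Kc)            # eigh returns eigenvalues in ascending order
        idx = np.argsort(eigvals)[::-1][:n_components]   # keep the largest eigenvalues
        eigvals, eigvecs = eigvals[idx], eigvecs[:, idx]
        return eigvecs * np.sqrt(np.clip(eigvals, 0.0, None))

    # Hypothetical usage: with a linear kernel, kernel PCA matches classical PCA / MDS (up to sign).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 5))
    embedding = kernel_pca(X @ X.T, n_components=2)
    print(embedding.shape)  # (100, 2)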