213 research outputs found

    Matrix Learning in Learning Vector Quantization


    A survey of kernel and spectral methods for clustering

    Clustering algorithms are a useful tool for exploring data structure and have been employed in many disciplines. The focus of this paper is the partitional clustering problem, with special interest in two recent approaches: kernel and spectral methods. The aim of this paper is to present a survey of kernel and spectral clustering methods, two approaches able to produce nonlinear separating hypersurfaces between clusters. The presented kernel clustering methods are the kernel versions of many classical clustering algorithms, e.g., K-means, SOM and neural gas. Spectral clustering arises from concepts in spectral graph theory, and the clustering problem is configured as a graph-cut problem where an appropriate objective function has to be optimized. An explicit proof that these two paradigms have the same objective is reported, since it has been shown that these two seemingly different approaches share the same mathematical foundation. In addition, fuzzy kernel clustering methods are presented as extensions of the kernel K-means clustering algorithm. (C) 2007 Pattern Recognition Society. Published by Elsevier Ltd. All rights reserved.
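    The kernel trick behind kernel K-means, one of the methods surveyed above, can be sketched in a few lines: distances to cluster centroids in feature space are computed from the kernel matrix alone, without ever forming the feature map. A minimal illustration (the function names and the RBF kernel choice are ours, not the paper's):

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix K_ij = exp(-gamma * ||x_i - x_j||^2).
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_kmeans(K, n_clusters, n_iter=50, seed=0):
    """Kernel K-means: reassign points using feature-space distances
    ||phi(x_i) - mu_c||^2 = K_ii - 2/|C| * sum_{j in C} K_ij
                            + 1/|C|^2 * sum_{j,l in C} K_jl,
    computed entirely from the kernel matrix K."""
    n = K.shape[0]
    rng = np.random.default_rng(seed)
    labels = rng.integers(n_clusters, size=n)
    for _ in range(n_iter):
        dist = np.full((n, n_clusters), np.inf)
        for c in range(n_clusters):
            mask = labels == c
            nc = mask.sum()
            if nc == 0:
                continue  # empty cluster: leave its column at +inf
            dist[:, c] = (np.diag(K)
                          - 2.0 * K[:, mask].sum(axis=1) / nc
                          + K[np.ix_(mask, mask)].sum() / nc ** 2)
        new_labels = dist.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break
        labels = new_labels
    return labels
```

    With an RBF kernel the resulting cluster boundaries in input space are nonlinear, which is exactly the point of the kernel approach over plain K-means.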

    Non-Euclidean principal component analysis by Hebbian learning

    Principal component analysis based on Hebbian learning was originally designed for data processing in Euclidean spaces. We present in this contribution an extension of Oja's Hebbian learning approach to non-Euclidean spaces. We show that for Banach spaces the Hebbian learning can be carried out using the underlying semi-inner product. Prominent examples of such Banach spaces are the lp-spaces for p ≠ 2. For kernel spaces, as applied in support vector machines or kernelized vector quantization, this approach can be formulated as an online learning scheme based on the differentiable kernel. Hence, principal component analysis can be explicitly carried out in the respective data spaces, but now equipped with a non-Euclidean metric. In the article we provide the theoretical framework and give illustrative examples.
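    Oja's Hebbian rule, the Euclidean starting point that this paper generalizes, is a one-line online update w ← w + η·y·(x − y·w) with y = wᵀx, which converges to the leading principal component. A minimal sketch of the classical Euclidean rule (the function name and parameter values are illustrative, not from the paper):

```python
import numpy as np

def oja_pca(X, lr=0.001, n_epochs=50, seed=0):
    """Oja's online Hebbian rule for the first principal component.
    The term -y^2 * w implicitly keeps ||w|| near 1, so no explicit
    renormalization is needed during training."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(n_epochs):
        for x in rng.permutation(X):   # visit samples in random order
            y = w @ x                  # Hebbian output y = w^T x
            w += lr * y * (x - y * w)  # Oja's update
    return w / np.linalg.norm(w)
```

    The non-Euclidean extension described in the abstract replaces the inner product w @ x with a semi-inner product of the Banach space (or a differentiable kernel evaluation); the online structure of the update stays the same.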

    Differentiable Kernels in Generalized Matrix Learning Vector Quantization

    In the present paper we investigate the application of differentiable kernels to generalized matrix learning vector quantization as an alternative kernel-based classifier, which additionally provides classification-dependent data visualization. We show that the concept of differentiable kernels allows a prototype description in the data space, but equipped with the kernel metric. Moreover, using the visualization properties of the original matrix learning vector quantization, we are able to optimize the class visualization by inherent visualization mapping learning in this new kernel-metric data space as well.
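    The adaptive distance at the heart of generalized matrix learning vector quantization is d(x, w) = (x − w)ᵀ Ωᵀ Ω (x − w), and a rectangular Ω (e.g. 2 × d) doubles as the discriminative low-dimensional projection that gives the visualization property mentioned above. A minimal sketch of the distance computation (the function name is ours):

```python
import numpy as np

def gmlvq_distance(x, w, omega):
    """GMLVQ adaptive squared distance d(x, w) = (x-w)^T Omega^T Omega (x-w).
    With a rectangular omega of shape (2, d), `omega @ x` is also a 2-D
    projection of the data usable for class visualization."""
    diff = omega @ (x - w)
    return float(diff @ diff)
```

    With omega equal to the identity this reduces to the ordinary squared Euclidean distance of plain LVQ; training adapts omega together with the prototypes w so that the metric stretches discriminative directions.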
    • …