2 research outputs found
Non-Euclidean principal component analysis by Hebbian learning
Principal component analysis based on Hebbian learning is originally designed for data processing in Euclidean spaces. We present in this contribution an extension of Oja's Hebbian learning approach for non-Euclidean spaces. We show that for Banach spaces the Hebbian learning can be carried out using the underlying semi-inner product. Prominent examples of such Banach spaces are the lp-spaces for p≠2. For kernel spaces, as applied in support vector machines or kernelized vector quantization, this approach can be formulated as an online learning scheme based on the differentiable kernel. Hence, principal component analysis can be explicitly carried out in the respective data spaces, now equipped with a non-Euclidean metric. In the article we provide the theoretical framework and give illustrative examples.
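As background for the extension described above, the classic Euclidean baseline can be sketched as follows. This is a minimal illustration of Oja's Hebbian rule for extracting the first principal component, not the paper's non-Euclidean method; in the Banach-space variant the standard inner product `w @ x` would be replaced by the space's semi-inner product. The function name and parameters here are illustrative choices.

```python
import numpy as np

def oja_first_pc(X, eta=0.005, epochs=100, seed=0):
    """Estimate the first principal component of X with Oja's Hebbian rule.

    Euclidean version: the output y = <w, x> uses the standard inner
    product. Oja's update w += eta * y * (x - y * w) combines the Hebbian
    term (eta * y * x) with a decay term that keeps ||w|| near 1.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = rng.standard_normal(d)
    w /= np.linalg.norm(w)
    for _ in range(epochs):
        for x in X:
            y = w @ x                    # Hebbian output (inner product)
            w += eta * y * (x - y * w)   # Oja's self-normalizing update
    return w / np.linalg.norm(w)
```

On data whose variance is dominated by one axis, the learned weight vector aligns (up to sign) with the leading eigenvector of the covariance matrix.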
Functional principal component learning using Oja's method and Sobolev norms
Villmann T, Hammer B. Functional principal component learning using Oja's method and Sobolev norms. In: Principe JC, Miikkulainen R, eds. Advances in Self-Organizing Maps. 2009: 325-333.