
Pattern Recognition with Slow Feature Analysis

By Pietro Berkes

Abstract

Slow feature analysis (SFA) is a new unsupervised algorithm that learns nonlinear functions extracting slowly varying signals from input data. In this paper we describe its application to pattern recognition. In this context, in order to be slowly varying, the functions learned by SFA need to respond similarly to patterns belonging to the same class. We prove that, given input patterns belonging to C non-overlapping classes and a large enough function space, the optimal solution consists of C-1 output signals that are constant for each individual class. As a consequence, their output provides a feature space suitable for classification with simple methods, such as Gaussian classifiers. We then illustrate the approach by applying SFA to the MNIST handwritten-digit database; its performance is comparable to that of other established algorithms. Finally, we suggest some possible extensions of the proposed method. Our approach is particularly attractive because, for a given input signal and a fixed function space, it has no parameters, it is easy to implement and apply, and it has low memory requirements and high speed during recognition. SFA finds the global solution (within the considered function space) in a single iteration without convergence issues. Moreover, the proposed method is completely problem-independent.
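The core SFA step described above can be sketched in a few lines: whiten the input, then solve an eigenvalue problem on the covariance of the temporal differences, keeping the directions of slowest variation. The following NumPy sketch is an illustrative linear-SFA implementation under our own assumptions, not the authors' code (their method additionally applies a nonlinear expansion before this step):

```python
import numpy as np

def sfa(X):
    """Linear SFA sketch. X: (T, n) signal, rows ordered in time.
    Returns a projection matrix W (n, n); columns give output signals
    ordered from slowest to fastest. Apply to centered data."""
    Xc = X - X.mean(axis=0)
    # Whiten the input so outputs have unit variance and are decorrelated.
    cov = np.cov(Xc, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)
    S = eigvec / np.sqrt(eigval)          # whitening matrix (assumes full rank)
    Z = Xc @ S
    # Slowness objective: minimize the variance of the temporal differences.
    dZ = np.diff(Z, axis=0)
    dcov = np.cov(dZ, rowvar=False)
    d, V = np.linalg.eigh(dcov)           # ascending eigenvalues = slowest first
    return S @ V

# Toy example: a slow sine mixed with a fast one; SFA should recover the slow one.
t = np.linspace(0, 4 * np.pi, 500)
slow, fast = np.sin(t), np.sin(20 * t)
X = np.column_stack([slow + fast, slow - fast])
W = sfa(X)
y0 = (X - X.mean(axis=0)) @ W[:, 0]       # slowest extracted signal
```

For pattern recognition, the "time series" is formed by presenting patterns of the same class in succession, so that slowness forces class-constant outputs, which are then fed to a simple classifier.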

Topics: Computational Neuroscience, Machine Learning, Neural Nets
Year: 2005
OAI identifier: oai:cogprints.org:4104

