
The Chebyshev Approximation for the Histogram χ² Kernel and Principal Components of Fourier Features

By Fuxin Li, Guy Lebanon and Cristian Sminchisescu

Abstract

The random Fourier embedding methodology can be used to approximate the performance of non-linear kernel classifiers in time linear in the number of training examples. However, a non-trivial performance gap remains between the approximation and the nonlinear models, especially for the exponential χ² kernel, one of the most powerful models for histograms. Based on analogies with Chebyshev polynomials, we propose an asymptotically convergent analytic series that can be used in the random Fourier approximation of the exponential χ² kernel. The new series removes the need for the periodic approximations to the χ² function that are typical of previous methods, and improves classification accuracy. In addition, out-of-core principal component analysis (PCA) methods are introduced to reduce the dimensionality of the approximation and achieve better performance at the cost of only an additional constant factor in time complexity. Moreover, when PCA is performed jointly on the training and unlabeled testing data, a further performance improvement can be obtained. The proposed approaches are tested on the PASCAL VOC 2010 segmentation and ImageNet ILSVRC 2010 datasets, and give statistically significant improvements over alternative approximation methods.
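
As a rough illustration of the pipeline the abstract describes (random Fourier embedding followed by PCA on the embedded features), the Python sketch below uses the classical Rahimi-Recht features for the Gaussian kernel as a stand-in; it does not reproduce the paper's Chebyshev-based series for the exponential χ² kernel, and all function names, parameters, and data shapes are illustrative assumptions.

    import numpy as np

    def random_fourier_features(X, n_features=512, gamma=1.0, seed=0):
        """Map X (n x d) to an n x n_features embedding Z with Z @ Z.T
        approximating the Gaussian kernel exp(-gamma * ||x - y||^2).
        A stand-in for the kernel embedding step, not the paper's series."""
        rng = np.random.default_rng(seed)
        d = X.shape[1]
        # Frequencies drawn from the kernel's spectral density (Gaussian here).
        W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
        b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
        return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

    def pca_reduce(Z, n_components=128):
        """Project embedded features onto their top principal components.
        For data too large for memory, the covariance could instead be
        accumulated in chunks (an out-of-core variant, as the abstract notes)."""
        Zc = Z - Z.mean(axis=0, keepdims=True)
        _, _, Vt = np.linalg.svd(Zc, full_matrices=False)
        return Zc @ Vt[:n_components].T

    # Example: pool training and unlabeled test histograms before PCA,
    # mirroring the joint-PCA idea mentioned in the abstract.
    X_train = np.random.rand(1000, 300)   # e.g. 300-bin histograms
    X_test = np.random.rand(200, 300)
    Z = random_fourier_features(np.vstack([X_train, X_test]))
    Z_pca = pca_reduce(Z, n_components=128)
    Z_train, Z_test = Z_pca[:1000], Z_pca[1000:]

The reduced features Z_train and Z_test could then be fed to any linear classifier, which is what makes the overall method linear in the number of training examples.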

Year: 2012
OAI identifier: oai:CiteSeerX.psu:10.1.1.360.1730
Provided by: CiteSeerX
Sorry, we are unable to provide the full text but you may find it at the following location(s):
  • http://citeseerx.ist.psu.edu/v...
  • http://www.cc.gatech.edu/~fli/...
