
    Multiple Data-Dependent Kernel Fisher Discriminant Analysis for Face Recognition

    Kernel Fisher discriminant analysis (KFDA) has demonstrated its success in extracting facial features for face recognition. Compared to linear techniques, it can better describe the complex and nonlinear variations of face images. However, a single kernel is not always suitable for face recognition applications that involve data from multiple, heterogeneous sources, such as face images under large variations of pose, illumination, and facial expression. To improve the performance of KFDA in face recognition, a novel algorithm named multiple data-dependent kernel Fisher discriminant analysis (MDKFDA) is proposed in this paper. The constructed multiple data-dependent kernel (MDK) is a combination of several base kernels with a data-dependent kernel constraint on their weights. By solving the optimization equation based on the Fisher criterion and maximizing the margin criterion, the parameters of the data-dependent kernel and the multiple base kernels are jointly optimized. Experimental results on three face databases validate the effectiveness of the proposed algorithm.
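The core construction in the abstract above can be sketched in a few lines: a weighted combination of base kernels feeds a binary kernel Fisher discriminant. The paper's data-dependent weighting and multi-class optimization are omitted; the fixed weights, RBF base kernels, and the `kfda_binary` helper below are illustrative assumptions, not the paper's method.

```python
import numpy as np

def rbf_kernel(X, Y, gamma):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def combined_kernel(X, Y, gammas, weights):
    # Weighted sum of base RBF kernels; the paper learns data-dependent
    # weights, while fixed weights are used here for illustration.
    return sum(w * rbf_kernel(X, Y, g) for w, g in zip(weights, gammas))

def kfda_binary(X, y, gammas, weights, reg=1e-3):
    """Binary kernel Fisher discriminant on a combined kernel."""
    K = combined_kernel(X, X, gammas, weights)
    n = len(y)
    m0 = K[:, y == 0].mean(axis=1)  # class mean in kernel space
    m1 = K[:, y == 1].mean(axis=1)
    # Within-class scatter in kernel space, regularized for stability.
    N = np.zeros((n, n))
    for c in (0, 1):
        Kc = K[:, y == c]
        H = np.eye(Kc.shape[1]) - 1.0 / Kc.shape[1]  # centering matrix
        N += Kc @ H @ Kc.T
    return np.linalg.solve(N + reg * np.eye(n), m1 - m0)

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(3, 1, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
alpha = kfda_binary(X, y, gammas=[0.1, 1.0], weights=[0.5, 0.5])
scores = combined_kernel(X, X, [0.1, 1.0], [0.5, 0.5]) @ alpha
```

Projections of class-1 samples onto the discriminant direction come out larger than those of class 0, which is what a downstream face-recognition classifier would threshold.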

    Neural Class-Specific Regression for face verification

    Face verification is a problem approached in the literature mainly using nonlinear class-specific subspace learning techniques. While it has been shown that kernel-based class-specific discriminant analysis is able to provide excellent performance in small- and medium-scale face verification problems, its application in today's large-scale problems is difficult due to its training-space and computational requirements. In this paper, generalizing our previous work on kernel-based class-specific discriminant analysis, we show that class-specific subspace learning can be cast as a regression problem. This allows us to derive linear, (reduced) kernel, and neural network-based class-specific discriminant analysis methods using efficient batch and/or iterative training schemes suited for large-scale learning problems. We test the performance of these methods on two datasets describing medium- and large-scale face verification problems.
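The regression reformulation described above can be illustrated in its simplest, linear form: client samples (the class of interest) and impostor samples are regressed onto distinct targets, and the closed-form ridge solution yields a class-specific discriminant vector. The target values and the `class_specific_regression` helper are illustrative assumptions; the paper's kernel and neural variants build on the same idea.

```python
import numpy as np

def class_specific_regression(X, y, target_pos=1.0, target_neg=-1.0, reg=1e-2):
    """Linear class-specific discriminant cast as ridge regression.

    A sketch of the regression reformulation: y == 1 marks the client
    class, everything else is an impostor. Targets are illustrative.
    """
    t = np.where(y == 1, target_pos, target_neg)
    d = X.shape[1]
    # Closed-form ridge solution: w = (X^T X + reg I)^{-1} X^T t
    return np.linalg.solve(X.T @ X + reg * np.eye(d), X.T @ t)

rng = np.random.default_rng(1)
client = rng.normal(2, 0.5, (30, 4))    # samples of the claimed identity
impostor = rng.normal(0, 1.0, (60, 4))  # samples of all other identities
X = np.vstack([client, impostor])
y = np.array([1] * 30 + [0] * 60)
w = class_specific_regression(X, y)
scores = X @ w  # verification score: threshold to accept/reject a claim
```

Because training reduces to a least-squares solve, the same scheme scales to large problems with batch or iterative solvers, which is the point the abstract makes.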

    Multi-Face Recognition Based on Kohonen SOM Classification Optimized with PCA-Based Discriminant Analysis

    Face recognition must identify face images that exhibit large variations, so an optimization method is needed to reduce computation time without degrading classification results. This research proposes a face recognition system based directly on Kohonen SOM classification, optimized with discriminant analysis applied to Principal Component Analysis (PCA) features. The performance of PCA feature extraction is evaluated with two approaches: first, the LDA method, which addresses PCA's selection of irrelevant features from the dataset; and second, the application of a kernel function to LDA (KDA). The features from both approaches are fed directly into the Kohonen classifier for face image classification. Testing proceeds in two phases: first with single-face images and then with multi-face images. On single-face images, both proposed feature extraction approaches can be applied very accurately to Kohonen SOM classification; the second approach, PCA-KDA, is more accurate at 94.22% versus 93.91% for the first approach, while the first approach is faster, taking 0.4 seconds per face image for PCA-LDA against 0.5 seconds for PCA-KDA. When testing multi-face images containing more than two faces, the difference between the approaches is not significant. Keywords: Face recognition, Feature extraction, Kohonen SOM
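The PCA-then-LDA feature extraction stage described above is a standard pipeline and can be sketched with scikit-learn. The synthetic "face" data, the component counts, and the omission of the Kohonen SOM classifier (which would consume the reduced features) are all illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline

# PCA compresses the raw face vectors; LDA then keeps the most
# discriminative directions (at most n_classes - 1 of them). In the
# paper's pipeline the reduced features feed a Kohonen SOM classifier,
# which is omitted here.
rng = np.random.default_rng(2)
n_subjects, per_subject, dim = 5, 10, 64
X = np.vstack([rng.normal(i, 1.0, (per_subject, dim))
               for i in range(n_subjects)])
y = np.repeat(np.arange(n_subjects), per_subject)

extractor = make_pipeline(
    PCA(n_components=20),
    LinearDiscriminantAnalysis(n_components=n_subjects - 1),
)
features = extractor.fit_transform(X, y)  # shape (50, 4)
```

Swapping the `LinearDiscriminantAnalysis` step for a kernel discriminant would give the PCA-KDA variant the abstract compares against.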

    Nonlinear Supervised Dimensionality Reduction via Smooth Regular Embeddings

    The recovery of the intrinsic geometric structure of data collections is an important problem in data analysis. Supervised extensions of several manifold learning approaches have been proposed in recent years. However, existing methods primarily focus on the embedding of the training data, while the generalization of the embedding to initially unseen test data is largely ignored. In this work, we build on recent theoretical results on the generalization performance of supervised manifold learning algorithms. Motivated by these performance bounds, we propose a supervised manifold learning method that computes a nonlinear embedding while constructing a smooth and regular interpolation function that extends the embedding to the whole data space in order to achieve satisfactory generalization. The embedding and the interpolator are jointly learnt such that the Lipschitz regularity of the interpolator is imposed while ensuring the separation between different classes. Experimental results on several image data sets show that the proposed method outperforms traditional classifiers and competing supervised dimensionality reduction algorithms in terms of classification accuracy in most settings.
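The out-of-sample problem the abstract raises can be illustrated with a much simpler stand-in: learn a supervised embedding on the training set, then fit a smooth kernel interpolator that maps the original space to the embedding, so unseen points can be embedded too. The paper learns both jointly with Lipschitz constraints; the sequential LDA-plus-kernel-ridge scheme below is only an illustrative assumption.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.kernel_ridge import KernelRidge

# Step 1: a supervised embedding of the training data (LDA stands in
# for the paper's nonlinear manifold embedding).
rng = np.random.default_rng(3)
Xtr = np.vstack([rng.normal(i * 3, 1.0, (20, 5)) for i in range(3)])
ytr = np.repeat(np.arange(3), 20)
emb = LinearDiscriminantAnalysis(n_components=2).fit_transform(Xtr, ytr)

# Step 2: a smooth RBF interpolator extending the embedding to the
# whole data space; its kernel bandwidth controls the regularity that
# the paper bounds explicitly via Lipschitz constants.
interp = KernelRidge(kernel="rbf", gamma=0.1, alpha=1e-2).fit(Xtr, emb)

Xte = rng.normal(0.0, 1.0, (5, 5))  # previously unseen test points
emb_te = interp.predict(Xte)        # embedding extended out of sample
```

The key difference from this sketch is that the paper optimizes the embedding and the interpolator together, trading class separation against the interpolator's regularity.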

    A Simple Iterative Algorithm for Parsimonious Binary Kernel Fisher Discrimination

    By applying recent results in optimization theory, variously known as optimization transfer or majorize/minimize algorithms, an algorithm for binary kernel Fisher discriminant analysis is introduced that uses a non-smooth penalty on the coefficients to provide a parsimonious solution. The problem is converted into a smooth optimization that can be solved iteratively with no greater overhead than iteratively reweighted least squares. The resulting algorithm is simple, easily programmed, and shown to perform, in terms of both accuracy and parsimony, as well as or better than a number of leading machine learning algorithms on two well-studied and substantial benchmarks.
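The majorize/minimize idea in the abstract above can be sketched as follows: the non-smooth penalty |a_i| is majorized by the quadratic a_i^2 / (2|a_i_old|), turning each iteration into a weighted ridge solve, which is exactly the iteratively-reweighted-least-squares overhead the abstract mentions. The least-squares formulation of binary Fisher discrimination, the penalty weight, and all constants below are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def rbf_kernel(X, gamma=0.5):
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def sparse_kfd_mm(K, y, lam=0.5, iters=30, eps=1e-6):
    """MM sketch for an L1-penalized kernel Fisher discriminant.

    Uses the least-squares formulation of binary FDA: regress the
    kernel expansion K @ a onto +/-1 class targets, with the L1
    penalty majorized by a reweighted quadratic at each step.
    """
    n = len(y)
    t = np.where(y == 1, 1.0, -1.0)
    # Ridge warm start, then each MM step is a weighted ridge solve.
    a = np.linalg.solve(K.T @ K + lam * np.eye(n), K.T @ t)
    for _ in range(iters):
        W = np.diag(lam / (np.abs(a) + eps))  # majorizer of lam * |a|
        a = np.linalg.solve(K.T @ K + W, K.T @ t)
    return a

rng = np.random.default_rng(4)
X = np.vstack([rng.normal(-2, 0.7, (25, 2)), rng.normal(2, 0.7, (25, 2))])
y = np.array([0] * 25 + [1] * 25)
K = rbf_kernel(X)
a = sparse_kfd_mm(K, y)
preds = np.sign(K @ a)  # discriminant sign gives the class label
```

As the reweighting drives small coefficients toward zero, only a few kernel expansion terms survive, which is the parsimony the paper targets.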