
    Neural Class-Specific Regression for face verification

    Face verification is a problem approached in the literature mainly using nonlinear class-specific subspace learning techniques. While kernel-based Class-Specific Discriminant Analysis has been shown to provide excellent performance in small- and medium-scale face verification problems, its application to today's large-scale problems is difficult due to its training space and computational requirements. In this paper, generalizing our previous work on kernel-based class-specific discriminant analysis, we show that class-specific subspace learning can be cast as a regression problem. This allows us to derive linear, (reduced) kernel, and neural network-based class-specific discriminant analysis methods using efficient batch and/or iterative training schemes suited for large-scale learning problems. We test the performance of these methods on two datasets describing medium- and large-scale face verification problems. (9 pages, 4 figures)
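The core idea of the abstract above, casting class-specific subspace learning as regression, can be sketched as a closed-form ridge regression from features to low-dimensional class-specific targets. This is only an illustrative sketch: the target design (one random code for the client class, one for impostors) and the regularizer `lam` are assumptions, not the paper's exact formulation.

```python
import numpy as np

def class_specific_regression(X, y, dim=2, lam=1e-2):
    """Sketch: learn a linear projection by regressing samples onto
    class-specific targets.

    X : (n_samples, n_features) data matrix
    y : +1 for client (target class) samples, -1 for impostors
    """
    rng = np.random.default_rng(0)
    # One target code per class membership -- an illustrative stand-in
    # for the paper's target vectors (assumption).
    targets = {1: rng.standard_normal(dim), -1: rng.standard_normal(dim)}
    T = np.stack([targets[int(c)] for c in y])            # (n, dim)
    # Closed-form ridge solution: W = (X^T X + lam I)^{-1} X^T T
    W = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ T)
    return W                                              # (n_features, dim)

# Toy usage: 20 client samples offset from 30 impostor samples.
X = np.vstack([np.random.randn(20, 5) + 2, np.random.randn(30, 5)])
y = np.array([1] * 20 + [-1] * 30)
W = class_specific_regression(X, y)
```

Because the solution is closed-form (or solvable iteratively), it scales to large training sets where kernel-based discriminant analysis becomes impractical, which is the efficiency argument the abstract makes.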

    Grassmann Learning for Recognition and Classification

    Computational performance associated with high-dimensional data is a common challenge for real-world classification and recognition systems. Subspace learning has received considerable attention as a means of finding an efficient low-dimensional representation that leads to better classification and efficient processing. A Grassmann manifold is a smooth manifold whose points represent linear subspaces, with relationships between points defined through mappings of orthogonal matrices. Grassmann learning involves embedding high-dimensional subspaces and kernelizing the embedding onto a projection space where distance computations can be performed efficiently. In this dissertation, Grassmann learning and its benefits for action classification and face recognition, in terms of accuracy and performance, are investigated and evaluated. Grassmannian Sparse Representation (GSR) and Grassmannian Spectral Regression (GRASP) are proposed as Grassmann-inspired subspace learning algorithms. GSR is a novel subspace learning algorithm that combines the benefits of Grassmann manifolds with sparse representations, using a least-squares loss with ℓ1-norm minimization for improved classification. GRASP is a novel subspace learning algorithm that leverages the benefits of Grassmann manifolds and Spectral Regression in a framework that supports high discrimination between classes and achieves computational benefits by using manifold modeling and avoiding eigen-decomposition. The effectiveness of GSR and GRASP is demonstrated for computationally intensive classification problems: (a) multi-view action classification using the IXMAS Multi-View dataset, the i3DPost Multi-View dataset, and the WVU Multi-View dataset, (b) 3D action classification using the MSRAction3D dataset and MSRGesture3D dataset, and (c) face recognition using the ATT Face Database, Labeled Faces in the Wild (LFW), and the Extended Yale Face Database B (YALE).
Additional contributions include the definition of Motion History Surfaces (MHS) and Motion Depth Surfaces (MDS) as descriptors suitable for activity representations in video sequences and 3D depth sequences. An in-depth analysis of Grassmann metrics is applied to high-dimensional data with different levels of noise and data distributions, which reveals that standardized Grassmann kernels are favorable over geodesic metrics on a Grassmann manifold. Finally, an extensive performance analysis is made that supports Grassmann subspace learning as an effective approach for classification and recognition.
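The Grassmann constructions discussed above rest on comparing subspaces rather than individual vectors. A minimal sketch of the standard machinery, principal angles via the SVD of the product of orthonormal bases, and the projection kernel often used to "kernelize the embedding," is below; the function names are my own, not the dissertation's.

```python
import numpy as np

def orthonormal_basis(A):
    """Orthonormal basis for the column span of A via reduced QR."""
    Q, _ = np.linalg.qr(A)
    return Q

def principal_angles(Y1, Y2):
    """Principal angles between two subspaces with orthonormal bases
    Y1, Y2 (each d x p): singular values of Y1^T Y2 are the cosines."""
    s = np.linalg.svd(Y1.T @ Y2, compute_uv=False)
    return np.arccos(np.clip(s, -1.0, 1.0))

def projection_kernel(Y1, Y2):
    """Projection kernel k(Y1, Y2) = ||Y1^T Y2||_F^2, a positive-definite
    kernel on the Grassmann manifold (equals p when the subspaces match)."""
    return np.linalg.norm(Y1.T @ Y2, 'fro') ** 2

# Toy usage: two 3-dimensional subspaces of R^10.
rng = np.random.default_rng(1)
Y1 = orthonormal_basis(rng.standard_normal((10, 3)))
Y2 = orthonormal_basis(rng.standard_normal((10, 3)))
k12 = projection_kernel(Y1, Y2)
```

A kernel like this is what lets standard learners (sparse coding, spectral regression) operate on sets of images, each set summarized as a subspace, without ever flattening the manifold structure.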

    A comparative analysis of neural and statistical classifiers for dimensionality reduction-based face recognition systems.

    Human face recognition has received wide attention since the 1990s. Recent approaches focus on combining dimensionality reduction-based feature extraction algorithms with various types of classifiers. This thesis provides an in-depth comparative analysis of neural and statistical classifiers by combining them with existing dimensionality reduction-based algorithms. A set of unified face recognition systems was established for evaluating alternate combinations in terms of recognition performance, processing time, and the conditions needed to achieve certain performance levels. A preprocessing system and four dimensionality reduction-based methods based on Principal Component Analysis (PCA), Two-Dimensional PCA, Fisher's Linear Discriminant, and Laplacianfaces were implemented. Classification was performed with several classifiers, including the Euclidean distance, MLP neural network, K-Nearest-Neighbor, and Fuzzy K-Nearest Neighbor classifiers. The statistical model is relatively simple and requires less computational complexity and storage. Experimental results are reported on two databases of known individuals, the Yale and AR databases. After comparing these algorithms in every respect, the simulations showed that, considering recognition rates, generalization ability, classification performance, noise immunity, and processing time, the best results were obtained with Laplacianfaces combined with the Fuzzy K-NN classifier.
    Dept. of Electrical and Computer Engineering. Paper copy at Leddy Library: Theses & Major Papers - Basement, West Bldg. / Call Number: Thesis2006 .X86. Source: Masters Abstracts International, Volume: 45-01, page: 0428. Thesis (M.A.Sc.)--University of Windsor (Canada), 2006
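The pipeline this thesis evaluates, a dimensionality reduction stage followed by a classifier, can be sketched with PCA ("eigenfaces") plus a plain Euclidean k-NN vote. This is a simplified stand-in for the thesis's systems: it uses ordinary k-NN rather than the fuzzy variant, and the PCA dimension `k` is arbitrary.

```python
import numpy as np
from collections import Counter

def pca_fit(X, k):
    """Eigenfaces-style PCA. X is (n_samples, n_pixels); returns the mean
    face and the top-k principal axes (rows of Vt from the SVD)."""
    mean = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt[:k]                      # basis: (k, n_pixels)

def pca_transform(X, mean, basis):
    """Project images into the k-dimensional face subspace."""
    return (X - mean) @ basis.T

def knn_predict(train_Z, train_y, z, k=3):
    """Euclidean k-NN majority vote (the thesis also studies a fuzzy
    variant; this is the plain version)."""
    d = np.linalg.norm(train_Z - z, axis=1)
    nearest = np.argsort(d)[:k]
    return Counter(train_y[nearest]).most_common(1)[0][0]

# Toy usage: two "identities" as well-separated Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.standard_normal((10, 16)),
               rng.standard_normal((10, 16)) + 8.0])
y = np.array([0] * 10 + [1] * 10)
mean, basis = pca_fit(X, 2)
Z = pca_transform(X, mean, basis)
probe = pca_transform(rng.standard_normal((1, 16)) + 8.0, mean, basis)[0]
pred = knn_predict(Z, y, probe)
```

Swapping the classifier (MLP, fuzzy k-NN) or the reduction step (2D-PCA, LDA, Laplacianfaces) into this skeleton is exactly the combinatorial comparison the thesis carries out.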

    Symmetric Subspace Learning for Image Analysis


    Intelligent Local Face Recognition


    A framework of face recognition with set of testing images

    We propose a novel framework to solve the face recognition problem based on a set of testing images. Our framework can handle the case where there is no pose overlap between the training set and the query set. The main techniques used in this framework are manifold alignment, face normalization, and discriminant learning. Experiments on different databases show that our system outperforms several state-of-the-art methods.