
    Gradient-orientation-based PCA subspace for novel face recognition

    Face recognition is an interesting and challenging problem that has been widely studied in the fields of pattern recognition and computer vision. It has many applications, such as biometric authentication and video surveillance. In the past decade, several methods for face recognition were proposed; however, these methods suffer from pose and illumination variations. To address these problems, this paper proposes a novel methodology for recognizing face images. Since image gradients are invariant to illumination and pose variations, the proposed approach uses gradient orientation to handle these effects. The Schur decomposition is used for matrix decomposition, and the Schur values and Schur vectors are then extracted for subspace projection. We call this subspace projection of face features Schurfaces; it is numerically stable and able to handle defective matrices. The Hausdorff distance is used with the nearest-neighbor classifier to measure the similarity between different faces. Experiments are conducted on the Yale and ORL face databases. The results show that the proposed approach is highly discriminant and achieves promising accuracy for face recognition compared with state-of-the-art approaches.
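    The following is a minimal sketch of the pipeline the abstract describes: gradient-orientation features, a Schur decomposition of a small covariance matrix for the "Schurfaces" subspace, and nearest-neighbor matching under a Hausdorff-style distance. The Turk–Pentland small-covariance trick and the (index, coefficient) set construction used for the Hausdorff comparison are assumptions for illustration, not details taken from the paper.

    ```python
    import numpy as np
    from scipy.linalg import schur
    from scipy.spatial.distance import directed_hausdorff

    def gradient_orientation(img):
        """Per-pixel gradient orientation in radians (central differences)."""
        gy, gx = np.gradient(img.astype(float))
        return np.arctan2(gy, gx)

    def schurfaces_subspace(train_imgs, k=20):
        """Build a k-dimensional 'Schurfaces' subspace from training images.

        Images are replaced by flattened gradient-orientation maps; the small
        (n_train x n_train) covariance of these vectors is factored with a Schur
        decomposition, and the Schur vectors with the largest Schur values are
        mapped back to pixel space to span the subspace.
        """
        X = np.stack([gradient_orientation(im).ravel() for im in train_imgs])
        mean = X.mean(axis=0)
        Xc = X - mean
        cov = Xc @ Xc.T / len(X)             # small covariance (assumption)
        T, Z = schur(cov)                     # T: Schur form, Z: Schur vectors
        order = np.argsort(np.abs(np.diag(T)))[::-1][:k]
        basis = Xc.T @ Z[:, order]            # lift back to image space
        basis /= np.linalg.norm(basis, axis=0, keepdims=True)
        return mean, basis

    def project(img, mean, basis):
        """Project one image onto the Schurfaces subspace."""
        return (gradient_orientation(img).ravel() - mean) @ basis

    def hausdorff_nn(probe_feat, gallery_feats, gallery_labels):
        """Nearest-neighbor classification under a Hausdorff-style distance.

        Each k-dimensional feature vector is viewed as a set of
        (index, coefficient) points so that SciPy's directed_hausdorff applies;
        the paper's exact set construction may differ.
        """
        def as_points(f):
            return np.column_stack([np.arange(len(f)), f])
        def dist(a, b):
            pa, pb = as_points(a), as_points(b)
            return max(directed_hausdorff(pa, pb)[0], directed_hausdorff(pb, pa)[0])
        d = [dist(probe_feat, g) for g in gallery_feats]
        return gallery_labels[int(np.argmin(d))]
    ```

    For a symmetric covariance matrix the Schur decomposition coincides with the eigendecomposition, so this sketch behaves like PCA on orientation features while retaining the numerical robustness of the Schur form for non-normal or defective matrices.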

    On the Subspace of Image Gradient Orientations

    We introduce the notion of Principal Component Analysis (PCA) of image gradient orientations. Because image data are typically corrupted by noise that is substantially non-Gaussian, traditional PCA of pixel intensities very often fails to reliably estimate the low-dimensional subspace of a given data population. We show that replacing intensities with gradient orientations and the $\ell_2$ norm with a cosine-based distance measure offers, to some extent, a remedy to this problem. Our scheme requires the eigen-decomposition of a covariance matrix and is as computationally efficient as standard $\ell_2$ PCA. We demonstrate some of its favorable properties on robust subspace estimation.
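    A minimal sketch of this idea, under the common reading that the cosine-based measure corresponds to embedding each gradient orientation θ on the unit circle as e^{jθ} and eigen-decomposing the resulting complex covariance; the mean-subtraction and normalization details below are assumptions, not stated in the abstract.

    ```python
    import numpy as np

    def igo_pca(images, k=10):
        """Sketch of PCA on image gradient orientations (IGO-PCA).

        Each pixel's orientation theta is embedded as exp(j*theta), so the inner
        product of two embeddings is a sum of cos(theta_i - phi_i) terms, i.e. a
        cosine-based similarity rather than an l2 distance on intensities. The
        subspace comes from an eigen-decomposition of the complex covariance,
        exactly as in standard PCA.
        """
        feats = []
        for img in images:
            gy, gx = np.gradient(img.astype(float))
            theta = np.arctan2(gy, gx)
            feats.append(np.exp(1j * theta).ravel())
        Z = np.stack(feats)                      # n_images x n_pixels, complex
        Zc = Z - Z.mean(axis=0)                  # centering is an assumption
        cov = Zc @ Zc.conj().T / len(Z)          # small Hermitian covariance
        evals, evecs = np.linalg.eigh(cov)       # Hermitian eigen-decomposition
        order = np.argsort(evals)[::-1][:k]
        basis = Zc.conj().T @ evecs[:, order]    # lift back to pixel space
        basis /= np.linalg.norm(basis, axis=0, keepdims=True)
        return basis

    def cosine_similarity(a_img, b_img):
        """Mean cosine of per-pixel orientation differences between two images."""
        ga = np.arctan2(*np.gradient(a_img.astype(float)))
        gb = np.arctan2(*np.gradient(b_img.astype(float)))
        return np.mean(np.cos(ga - gb))
    ```

    The eigen-decomposition is of an n_images × n_images Hermitian matrix, so the cost is comparable to standard $\ell_2$ PCA on the same data, which is the efficiency claim made in the abstract.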