
    Distance-based discriminant analysis method and its applications

    This paper proposes a method for finding a discriminative linear transformation that enhances the data's degree of conformance to the compactness hypothesis and its inverse. The formulation relies on inter-observation distances only, which is shown to improve the performance of non-parametric and non-linear classifiers on benchmark and real-world data sets. The proposed approach is suitable for both binary and multiple-category classification problems and can be applied as a dimensionality reduction technique; in the latter case, the number of necessary discriminative dimensions can be determined exactly. Also considered is a kernel-based extension of the proposed discriminant analysis method, which removes the linearity assumption imposed on the sought transformation by the initial formulation. This extension makes the method applicable to non-linear classification problems and has the additional benefit of accommodating indefinite kernels.
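    As a rough illustration of the idea (not the paper's exact formulation), the Python sketch below learns a linear map that enlarges between-class pairwise squared distances relative to within-class ones by solving a generalized eigenproblem; the names (distance_based_discriminant, n_components) are hypothetical, and NumPy/SciPy are assumed to be available.

    import numpy as np
    from scipy.linalg import eigh

    def distance_based_discriminant(X, y, n_components=2):
        # Sketch: find directions maximizing the ratio of between-class to
        # within-class pairwise squared distances (a compactness-style criterion,
        # built from inter-observation distances only).
        n, d = X.shape
        S_within = np.zeros((d, d))
        S_between = np.zeros((d, d))
        for i in range(n):
            for j in range(i + 1, n):
                diff = (X[i] - X[j])[:, None]
                outer = diff @ diff.T
                if y[i] == y[j]:
                    S_within += outer
                else:
                    S_between += outer
        S_within += 1e-6 * np.eye(d)            # regularize the denominator matrix
        vals, vecs = eigh(S_between, S_within)  # generalized eigenproblem
        order = np.argsort(vals)[::-1]          # largest between/within ratio first
        return vecs[:, order[:n_components]]    # columns = projection directions

    Projecting the data as X @ W with the returned matrix and feeding the result to a nearest-neighbour classifier is one way to reproduce the kind of non-parametric evaluation the abstract describes.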

    Two-Dimensional Heteroscedastic Feature Extraction Technique for Face Recognition

    One limitation of vector-based LDA and its matrix-based extension is that they cannot deal with heteroscedastic data. In this paper, we present a novel two-dimensional feature extraction technique for face recognition that is capable of handling heteroscedastic data. The technique is a general form of two-dimensional linear discriminant analysis: it generalizes the interclass scatter matrix of two-dimensional LDA by applying the Chernoff distance as a measure of separation between every pair of clusters with the same index in different classes. By employing this distance, our method can capture the discriminatory information present in the differences between the covariance matrices of clusters in the dataset, while preserving the computational simplicity of eigenvalue-based techniques, which makes it well suited to high-dimensional applications such as face recognition. Experimental results on the CMU-PIE, AR and AT&T face databases demonstrate the effectiveness of our method in terms of classification accuracy.
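    For reference, the Chernoff distance used as the pairwise separation measure can be written, for two Gaussian clusters N(m1, S1) and N(m2, S2) and a weight s in (0, 1), as s(1-s)/2 * (m1-m2)^T [s*S1 + (1-s)*S2]^{-1} (m1-m2) + 1/2 * ln( |s*S1 + (1-s)*S2| / (|S1|^s |S2|^{1-s}) ). Below is a minimal NumPy sketch of just this measure, not the full 2D feature-extraction algorithm; the function name is hypothetical.

    import numpy as np

    def chernoff_distance(m1, S1, m2, S2, s=0.5):
        # Chernoff distance between the Gaussians N(m1, S1) and N(m2, S2);
        # with s = 0.5 it reduces to the Bhattacharyya distance.
        S = s * S1 + (1 - s) * S2                          # weighted covariance
        diff = m1 - m2
        quad = 0.5 * s * (1 - s) * diff @ np.linalg.solve(S, diff)
        _, logdet_S = np.linalg.slogdet(S)
        _, logdet_S1 = np.linalg.slogdet(S1)
        _, logdet_S2 = np.linalg.slogdet(S2)
        return quad + 0.5 * (logdet_S - s * logdet_S1 - (1 - s) * logdet_S2)

    Unlike the Euclidean distance between means, this measure grows when the two covariance matrices differ, which is exactly the heteroscedastic information the generalized scatter matrix is meant to capture.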

    Partial least squares discriminant analysis: A dimensionality reduction method to classify hyperspectral data

    The recent development of more sophisticated spectroscopic methods allows acquisition of high-dimensional datasets from which valuable information may be extracted using multivariate statistical analyses, such as dimensionality reduction and automatic classification (supervised and unsupervised). In this work, a supervised classification through partial least squares discriminant analysis (PLS-DA) is performed on the hyperspectral data. The obtained results are compared with those obtained by the most commonly used classification approaches.
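    A common recipe for PLS-DA (not necessarily the authors' exact pipeline) is to one-hot encode the class labels, regress them on the spectra with PLS, and assign each spectrum to the class with the largest predicted response. A minimal scikit-learn sketch, assuming X_train/X_test hold the hyperspectral observations and y_train the class labels (all names hypothetical):

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.preprocessing import LabelBinarizer

    def plsda_fit_predict(X_train, y_train, X_test, n_components=10):
        # PLS-DA: regress one-hot class indicators on the spectra, then
        # classify each test spectrum by its largest predicted response.
        lb = LabelBinarizer()
        Y_train = lb.fit_transform(y_train)
        if Y_train.shape[1] == 1:               # binary case: expand to two columns
            Y_train = np.hstack([1 - Y_train, Y_train])
        pls = PLSRegression(n_components=n_components)
        pls.fit(X_train, Y_train)
        Y_scores = pls.predict(X_test)          # continuous class scores
        return lb.classes_[np.argmax(Y_scores, axis=1)]

    The number of latent components is the main tuning parameter and is typically chosen by cross-validation before comparing against baseline classifiers.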
