
    Revisiting Classical Multiclass Linear Discriminant Analysis with a Novel Prototype-based Interpretable Solution

    Linear discriminant analysis (LDA) is a fundamental method for feature extraction and dimensionality reduction. Despite having many variants, classical LDA retains its own importance, as it is a keystone of human knowledge about statistical pattern recognition. For a dataset containing C clusters, the classical solution to LDA extracts at most C-1 features. Here, we introduce a novel solution to classical LDA, called LDA++, that yields C features, each interpretable as measuring similarity to one cluster. This novel solution bridges dimensionality reduction and multiclass classification. Specifically, we prove that, for homoscedastic Gaussian data and under some mild conditions, the optimal weights of a linear multiclass classifier also constitute an optimal solution to LDA. In addition, we show that LDA++ reveals some important new facts about LDA that remarkably change our understanding of classical multiclass LDA, 75 years after its introduction. We provide a complete numerical solution for LDA++ for the cases 1) when the scatter matrices can be constructed explicitly, 2) when constructing the scatter matrices is infeasible, and 3) for the kernel extension.
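
A minimal sketch of the classical baseline the abstract refers to, not of the paper's LDA++ method (which the abstract does not fully specify): classical multiclass LDA solves the generalized eigenproblem defined by the between-class and within-class scatter matrices, and because the between-class scatter has rank at most C-1, only C-1 discriminant features can be extracted for C clusters. The function and variable names below are illustrative, and the example assumes the within-class scatter is (pseudo-)invertible.

```python
import numpy as np

def classical_lda(X, y):
    """Classical multiclass LDA: return eigenvalues and discriminant
    directions of pinv(S_w) @ S_b, sorted by decreasing eigenvalue."""
    classes = np.unique(y)
    mean_total = X.mean(axis=0)
    d = X.shape[1]
    S_w = np.zeros((d, d))  # within-class scatter
    S_b = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mean_c = Xc.mean(axis=0)
        S_w += (Xc - mean_c).T @ (Xc - mean_c)
        diff = (mean_c - mean_total).reshape(-1, 1)
        S_b += len(Xc) * (diff @ diff.T)
    eigvals, eigvecs = np.linalg.eig(np.linalg.pinv(S_w) @ S_b)
    order = np.argsort(eigvals.real)[::-1]
    return eigvals.real[order], eigvecs.real[:, order]

# Toy data: C = 3 homoscedastic Gaussian clusters in 5 dimensions.
rng = np.random.default_rng(0)
C, n, d = 3, 100, 5
X = np.vstack([rng.normal(loc=3 * k, size=(n, d)) for k in range(C)])
y = np.repeat(np.arange(C), n)

eigvals, _ = classical_lda(X, y)
# Only the first C-1 = 2 eigenvalues are (numerically) nonzero,
# matching the classical limit of C-1 extracted features.
print(np.round(eigvals, 6))
```

This illustrates only the C-1 constraint of the classical solution; LDA++, as described in the abstract, instead yields C features, one per cluster.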