
    Sparse multinomial kernel discriminant analysis (sMKDA)

    Dimensionality reduction via canonical variate analysis (CVA) is important for pattern recognition and has been extended in various ways to permit more flexibility, e.g. by "kernelizing" the formulation. This can lead to over-fitting, usually ameliorated by regularization. Here, a method for sparse multinomial kernel discriminant analysis (sMKDA) is proposed, using a sparse basis to control complexity. It is based on the connection between CVA and least-squares, and uses forward selection via orthogonal least-squares to approximate a basis, generalizing a similar approach for binomial problems. Classification can be performed directly via minimum Mahalanobis distance in the canonical variates. sMKDA achieves state-of-the-art performance in terms of accuracy and sparseness on 11 benchmark datasets.
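    The final classification step the abstract describes, minimum Mahalanobis distance in the canonical-variate space, can be sketched as follows. This is a generic illustration, not the authors' sMKDA code; the function name and the toy data are invented for the example, and a shared covariance with precomputed class means is assumed.

    ```python
    import numpy as np

    def mahalanobis_classify(Z, class_means, cov):
        """Assign each row of Z (points in canonical-variate space) to the
        class whose mean is nearest in Mahalanobis distance under cov."""
        cov_inv = np.linalg.inv(cov)
        dists = []
        for mu in class_means:
            d = Z - mu  # (n, k) deviations from this class mean
            # squared Mahalanobis distance d^T cov^{-1} d, per row
            dists.append(np.einsum('ij,jk,ik->i', d, cov_inv, d))
        return np.argmin(np.stack(dists, axis=1), axis=1)

    # toy example: two well-separated class means, identity covariance
    Z = np.array([[0.1, 0.0], [2.9, 3.1]])
    means = [np.zeros(2), np.array([3.0, 3.0])]
    labels = mahalanobis_classify(Z, means, np.eye(2))
    ```

    With an identity covariance this reduces to nearest-mean classification; a non-spherical covariance reweights directions by their within-class scatter.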

    Sparse Discriminant Analysis

    Classification in high-dimensional feature spaces, where interpretation and dimension reduction are of great importance, is common in biological and medical applications. For these applications, standard methods such as microarrays, 1D NMR, and spectroscopy have become everyday tools for measuring thousands of features in samples of interest. Furthermore, the samples are often costly, and therefore many such problems have few observations in relation to the number of features. Traditionally, such data are analyzed by first performing a feature selection before classification. We propose a method which performs linear discriminant analysis with a sparseness criterion imposed, such that classification, feature selection, and dimension reduction are merged into one analysis. The sparse discriminant analysis is faster than traditional feature selection methods based on computationally heavy criteria such as Wilks' lambda, and the results are better with regard to classification rates and sparseness. The method is extended to mixtures of Gaussians, which is useful when e.g. biological clusters are present within each class. Finally, the methods proposed provide low-dimensional views of the discriminative directions.
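    The core idea, exploiting the LDA/least-squares connection and imposing a sparseness criterion, can be illustrated with a minimal two-class sketch: an l1-penalized regression of a +/-1 class-indicator target, solved by simple iterative soft-thresholding (ISTA). This is an illustrative stand-in, not the paper's algorithm; the function names, the penalty weight `lam`, and the toy data are assumptions for the example.

    ```python
    import numpy as np

    def soft_threshold(v, t):
        """Elementwise soft-thresholding, the proximal map of the l1 norm."""
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def sparse_lda_direction(X, t, lam=0.1, n_iter=500):
        """ISTA for min_b 0.5*||X b - t||^2 + lam*||b||_1, where t holds
        +/-1 class labels; a sparse b is a sparse discriminant direction."""
        beta = np.zeros(X.shape[1])
        step = 1.0 / np.linalg.norm(X, 2) ** 2  # 1 / Lipschitz constant
        for _ in range(n_iter):
            grad = X.T @ (X @ beta - t)
            beta = soft_threshold(beta - step * grad, step * lam)
        return beta

    # toy data: feature 0 carries the class signal, feature 1 is noise
    t = np.array([1.0, 1.0, 1.0, -1.0, -1.0, -1.0])
    X = np.column_stack([t, np.array([0.1, -0.1, 0.0, 0.1, -0.1, 0.0])])
    beta = sparse_lda_direction(X, t)  # weight on the noise feature is 0
    ```

    The penalty zeroes out the uninformative coordinate, so feature selection, dimension reduction, and the discriminant fit happen in a single optimization, which is the merging the abstract describes.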

    A direct approach for sparse quadratic discriminant analysis

    Quadratic discriminant analysis (QDA) is a standard tool for classification due to its simplicity and flexibility. Because the number of its parameters scales quadratically with the number of variables, however, QDA is not practical when the dimensionality is relatively large. To address this, we propose a novel procedure named DA-QDA for QDA in analyzing high-dimensional data. Formulated in a simple and coherent framework, DA-QDA aims to directly estimate the key quantities in the Bayes discriminant function, including quadratic interactions and a linear index of the variables for classification. Under appropriate sparsity assumptions, we establish consistency results for estimating the interactions and the linear index, and further demonstrate that the misclassification rate of our procedure converges to the optimal Bayes risk, even when the dimensionality is exponentially high with respect to the sample size. An efficient algorithm based on the alternating direction method of multipliers (ADMM) is developed for finding interactions, which is much faster than its competitor in the literature. The promising performance of DA-QDA is illustrated via extensive simulation studies and the analysis of four real datasets.
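    The Bayes discriminant function that DA-QDA targets is the classical per-class quadratic score, delta_k(x) = -0.5*log|Sigma_k| - 0.5*(x - mu_k)^T Sigma_k^{-1} (x - mu_k) + log(pi_k). A minimal sketch of that score (the classical low-dimensional formula, not the paper's sparse direct estimator) looks like this; the function names and toy parameters are assumptions for the example.

    ```python
    import numpy as np

    def qda_discriminant(x, mu, sigma, prior):
        """Classical Bayes/QDA score for one class: quadratic in x."""
        sigma_inv = np.linalg.inv(sigma)
        _, logdet = np.linalg.slogdet(sigma)
        d = x - mu
        return -0.5 * logdet - 0.5 * (d @ sigma_inv @ d) + np.log(prior)

    def qda_classify(x, params):
        """params: list of (mu, sigma, prior) tuples, one per class."""
        scores = [qda_discriminant(x, m, s, p) for m, s, p in params]
        return int(np.argmax(scores))

    # toy example: two Gaussian classes with identity covariances
    params = [(np.zeros(2), np.eye(2), 0.5),
              (np.array([3.0, 3.0]), np.eye(2), 0.5)]
    label_a = qda_classify(np.zeros(2), params)
    label_b = qda_classify(np.array([3.0, 3.0]), params)
    ```

    In high dimensions the explicit inverses and log-determinants above are exactly what becomes unreliable, which is why the paper estimates the quadratic interactions and linear index directly under sparsity assumptions instead.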