
    Analysis of the Inverse Multiquadric Kernel Function in the Kernel Direct Discriminant Analysis Method for Face Recognition

    Get PDF
    ABSTRACT: The Kernel Direct Discriminant Analysis (KDDA) algorithm is one of the algorithms used by pattern recognition experts for face feature extraction: the process of acquiring discriminative features that distinguish one face sample from another. The algorithm is derived from Direct Linear Discriminant Analysis (DLDA) and Generalized Discriminant Analysis (GDA); what sets KDDA apart is its use of a kernel function, which has proven able to handle the nonlinear problems encountered by linear face feature extraction algorithms such as Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA). In this final project, the performance of the KDDA algorithm is analyzed using the inverse multiquadric kernel function, comparing it in tests against other kernel functions (the polynomial kernel and the Gaussian RBF kernel) and representing the resulting accuracy in a regression equation. The test results show that the inverse multiquadric kernel function achieves accuracy levels of 30%-53% (2 samples), 43%-63% (3 samples), 50%-77% (4 samples), 58%-95% (5 samples), 54%-98% (6 samples), and 65%-100% (7 samples). The number of samples correlates strongly with accuracy (0.904), compared with 0.028 for the variable σ and -0.056 for the variable c. The regression coefficients for the number of samples (9.893) and for σ (159.680) increase the accuracy level, while the coefficient for c (-0.611) reduces face recognition accuracy under the KDDA method.
    Keywords: KDDA, Inverse Multiquadric, Polynomial, Gaussian RBF
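The three kernel functions compared in the abstract have standard closed forms; a minimal sketch (the parameter names c, σ, and d follow common convention, and everything apart from the formulas themselves is illustrative, not taken from the thesis):

```python
import numpy as np

def inverse_multiquadric(x, y, c=1.0):
    """Inverse multiquadric kernel: k(x, y) = 1 / sqrt(||x - y||^2 + c^2)."""
    return 1.0 / np.sqrt(np.sum((x - y) ** 2) + c ** 2)

def gaussian_rbf(x, y, sigma=1.0):
    """Gaussian RBF kernel: k(x, y) = exp(-||x - y||^2 / (2 sigma^2))."""
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))

def polynomial(x, y, d=2, c=1.0):
    """Polynomial kernel: k(x, y) = (x . y + c)^d."""
    return (np.dot(x, y) + c) ** d

def gram_matrix(X, kernel, **params):
    """Kernel (Gram) matrix over the rows of X, the basic object
    a kernel method such as KDDA operates on."""
    n = X.shape[0]
    K = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            K[i, j] = kernel(X[i], X[j], **params)
    return K
```

Swapping the kernel argument of `gram_matrix` is all that is needed to rerun a kernel method under each of the three functions being compared.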

    Direct kernel biased discriminant analysis: a new content-based image retrieval relevance feedback algorithm

    Get PDF
    In recent years, a variety of relevance feedback (RF) schemes have been developed to improve the performance of content-based image retrieval (CBIR). Given user feedback information, the key to an RF scheme is how to select a subset of image features to construct a suitable dissimilarity measure. Among various RF schemes, biased discriminant analysis (BDA) based RF is one of the most promising. It is based on the observation that all positive samples are alike, while in general each negative sample is negative in its own way. However, to use BDA, the small sample size (SSS) problem is a big challenge, as users tend to give a small number of feedback samples. To explore solutions to this issue, this paper proposes a direct kernel BDA (DKBDA), which is less sensitive to SSS. An incremental DKBDA (IDKBDA) is also developed to speed up the analysis. Experimental results are reported on a real-world image collection to demonstrate that the proposed methods outperform the traditional kernel BDA (KBDA) and the support vector machine (SVM) based RF algorithms.
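The asymmetry behind BDA ("all positive samples are alike, each negative sample is negative in its own way") can be sketched as a generalized eigenproblem: maximize the scatter of negatives around the positive mean relative to the scatter of the positives themselves. The following is an illustrative linear (non-kernel) version only, with the `reg` parameter standing in for the regularization that guards against the SSS problem; it is not the paper's DKBDA algorithm:

```python
import numpy as np

def bda_direction(X_pos, X_neg, reg=1e-3):
    """Leading biased discriminant direction w, maximizing
    (w' Sn w) / (w' Sp w), where Sp is the scatter of positives
    around their mean and Sn the scatter of negatives around the
    POSITIVE mean (the bias that gives BDA its name)."""
    mu_pos = X_pos.mean(axis=0)
    Dp = X_pos - mu_pos              # positives around their own mean
    Dn = X_neg - mu_pos              # negatives around the positive mean
    Sp = Dp.T @ Dp + reg * np.eye(X_pos.shape[1])  # regularized for SSS
    Sn = Dn.T @ Dn
    # Generalized eigenproblem Sn w = lambda Sp w, solved via inv(Sp) Sn
    vals, vecs = np.linalg.eig(np.linalg.solve(Sp, Sn))
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    return w / np.linalg.norm(w)
```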

    A Simple Iterative Algorithm for Parsimonious Binary Kernel Fisher Discrimination

    Get PDF
    By applying recent results in optimization theory variously known as optimization transfer or majorize/minimize algorithms, an algorithm for binary kernel Fisher discriminant analysis is introduced that makes use of a non-smooth penalty on the coefficients to provide a parsimonious solution. The problem is converted into a smooth optimization that can be solved iteratively with no greater overhead than iteratively re-weighted least-squares. The result is simple, easily programmed, and is shown to perform, in terms of both accuracy and parsimony, as well as or better than a number of leading machine learning algorithms on two well-studied and substantial benchmarks.

    Quadratic Projection Based Feature Extraction with Its Application to Biometric Recognition

    Full text link
    This paper presents a novel quadratic projection based feature extraction framework, where a set of quadratic matrices is learned to distinguish each class from all other classes. We formulate quadratic matrix learning (QML) as a standard semidefinite programming (SDP) problem. However, the conventional interior-point SDP solvers do not scale well to the problem of QML for high-dimensional data. To address the scalability of QML, we develop an efficient algorithm, termed DualQML, based on the Lagrange duality theory, to extract nonlinear features. To evaluate the feasibility and effectiveness of the proposed framework, we conduct extensive experiments on biometric recognition. Experimental results on three representative biometric recognition tasks, including face, palmprint, and ear recognition, demonstrate the superiority of the DualQML-based feature extraction algorithm compared to the current state-of-the-art algorithms.

    Sparse multinomial kernel discriminant analysis (sMKDA)

    No full text
    Dimensionality reduction via canonical variate analysis (CVA) is important for pattern recognition and has been extended variously to permit more flexibility, e.g. by "kernelizing" the formulation. This can lead to over-fitting, usually ameliorated by regularization. Here, a method for sparse, multinomial kernel discriminant analysis (sMKDA) is proposed, using a sparse basis to control complexity. It is based on the connection between CVA and least-squares, and uses forward selection via orthogonal least-squares to approximate a basis, generalizing a similar approach for binomial problems. Classification can be performed directly via minimum Mahalanobis distance in the canonical variates. sMKDA achieves state-of-the-art performance in terms of accuracy and sparseness on 11 benchmark datasets.
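The final classification step the abstract describes, minimum Mahalanobis distance in the canonical variates, might look like the following sketch (the names and interface are illustrative, not taken from sMKDA's implementation):

```python
import numpy as np

def mahalanobis_classify(z, class_means, cov):
    """Assign the canonical-variate vector z to the class whose mean m
    is nearest in Mahalanobis distance:
        d(z, m)^2 = (z - m)' inv(cov) (z - m).
    With cov = I this reduces to nearest-mean Euclidean classification."""
    cov_inv = np.linalg.inv(cov)
    d2 = [float((z - m) @ cov_inv @ (z - m)) for m in class_means]
    return int(np.argmin(d2))
```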

    Adaptive Graph via Multiple Kernel Learning for Nonnegative Matrix Factorization

    Full text link
    Nonnegative Matrix Factorization (NMF) has been continuously evolving in several areas like pattern recognition and information retrieval. It factorizes a matrix into a product of two low-rank nonnegative matrices that yield a parts-based, linear representation of nonnegative data. Recently, Graph regularized NMF (GrNMF) was proposed to find a compact representation which uncovers the hidden semantics and simultaneously respects the intrinsic geometric structure. In GrNMF, an affinity graph is constructed from the original data space to encode the geometrical information. In this paper, we propose a novel idea which engages a Multiple Kernel Learning approach in refining the graph structure that reflects the factorization of the matrix and the new data space. The GrNMF is improved by utilizing the graph refined by the kernel learning, and then a novel kernel learning method is introduced under the GrNMF framework. Our approach shows encouraging results in comparison to state-of-the-art clustering algorithms like NMF, GrNMF, and SVD.
    Comment: This paper has been withdrawn by the author due to the terrible writing.
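The basic factorization V ≈ WH that all of these NMF variants build on is commonly fit with the Lee-Seung multiplicative updates; a minimal sketch of plain NMF under the Frobenius objective, without the graph-regularization term that GrNMF adds:

```python
import numpy as np

def nmf(V, r, n_iter=500, eps=1e-9, seed=0):
    """Factor a nonnegative matrix V (n x m) as V ~ W H with
    W (n x r) and H (r x m) nonnegative, via Lee-Seung multiplicative
    updates minimizing ||V - WH||_F. The updates preserve nonnegativity
    because every factor in them is nonnegative."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, r)) + eps
    H = rng.random((r, m)) + eps
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # update H with W fixed
        W *= (V @ H.T) / (W @ H @ H.T + eps)   # update W with H fixed
    return W, H
```

A graph-regularized variant modifies the H update with an extra term built from the graph Laplacian; the sketch above is only the unregularized core.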