
    Building Combined Classifiers

    This chapter covers different approaches that may be taken when building an ensemble method, illustrated by specific examples of each approach from research conducted by the authors. A method called Negative Correlation Learning illustrates a decision-level combination approach in which the individual classifiers are trained co-operatively. The model-level combination paradigm is illustrated via a tree combination method. Finally, another variant of the decision-level paradigm, with individuals trained independently rather than co-operatively, is discussed as applied to churn prediction in the telecommunications industry.
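To make the decision-level, co-operatively trained paradigm concrete, here is a minimal sketch of Negative Correlation Learning for an ensemble of linear regressors, using the commonly cited simplified gradient (f_i - y) - λ(f_i - f̄); the data, ensemble size, and hyperparameters are illustrative assumptions, not taken from the chapter:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (assumed for illustration)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=200)

M, lam, lr = 5, 0.5, 0.05                 # ensemble size, NCL penalty, step size
W = rng.normal(scale=0.1, size=(M, 3))    # one linear model per ensemble member

for _ in range(500):
    F = X @ W.T                            # (n, M) individual member outputs
    f_bar = F.mean(axis=1, keepdims=True)  # ensemble (decision-level) average
    # Simplified NCL gradient: error term minus a penalty that pushes
    # each member away from the ensemble mean (encouraging diversity)
    G = (F - y[:, None]) - lam * (F - f_bar)
    W -= lr * (G.T @ X) / len(X)

pred = (X @ W.T).mean(axis=1)              # combine decisions by simple averaging
mse = np.mean((pred - y) ** 2)
```

With the penalty weight λ between 0 and 1, the members co-operate: each is trained against the whole ensemble's error rather than only its own.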

    Hybrid Committee Classifier for a Computerized Colonic Polyp Detection System

    We present a hybrid committee classifier for computer-aided detection (CAD) of colonic polyps in CT colonography (CTC). The classifier involved an ensemble of support vector machines (SVMs) and neural networks (NNs) for classification, a progressive search algorithm for selecting the set of features used by the SVMs, and a floating search algorithm for selecting the features used by the NNs. A total of 102 quantitative features were calculated for each polyp candidate found by a prototype CAD system. Three features were selected for each of 7 SVM classifiers, which were then combined to form an SVM committee classifier. Similarly, features (numbering from 10 to 20) were selected for 11 NN classifiers, which were likewise combined to form an NN committee classifier. Finally, a hybrid committee classifier was defined by combining the outputs of both the SVM and NN committees. The method was tested on CTC scans (supine and prone views) of 29 patients, in terms of the partial area under a free-response receiver operating characteristic (FROC) curve (AUC). Our results showed that the hybrid committee classifier performed best for the prone scans and was comparable to the other classifiers for the supine scans.
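The two-committee structure can be sketched as follows. This is a rough stand-in under my own assumptions: synthetic data replaces the CTC features, and random feature subsets replace the paper's progressive and floating search algorithms; only the committee sizes (7 SVMs on 3 features, 11 NNs on ~10 features) and the final averaging of committee outputs follow the abstract:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

# Synthetic stand-in for the 102 quantitative polyp-candidate features
X, y = make_classification(n_samples=400, n_features=20,
                           n_informative=10, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

rng = np.random.default_rng(0)

def committee(model_factory, n_members, n_feats):
    """Train n_members models, each on a random feature subset
    (a placeholder for the paper's search-based feature selection)."""
    members = []
    for _ in range(n_members):
        feats = rng.choice(X.shape[1], size=n_feats, replace=False)
        members.append((model_factory().fit(Xtr[:, feats], ytr), feats))
    return members

def committee_score(members, X):
    # Committee output = mean of member probability outputs
    return np.mean([m.predict_proba(X[:, f])[:, 1] for m, f in members], axis=0)

svm_committee = committee(lambda: SVC(probability=True, random_state=0), 7, 3)
nn_committee = committee(lambda: MLPClassifier(hidden_layer_sizes=(8,),
                                               max_iter=1000, random_state=0), 11, 10)

# Hybrid committee: average the two committee scores
hybrid = 0.5 * (committee_score(svm_committee, Xte)
                + committee_score(nn_committee, Xte))
acc = np.mean((hybrid > 0.5) == yte)
```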

    Ensembles de classificadores para bases de dados desbalanceadas: uma abordagem baseada em amostragem evolucionária

    In many practical classification problems, the data set used to induce the classifier is significantly imbalanced. This occurs when the number of examples of a given class is much lower than that of the other class(es). Imbalanced data sets can compromise the performance of most classical classification algorithms, since these assume a balanced distribution of examples across the classes. On the other hand, in different application scenarios, the strategy of combining several classifiers into structures known as ensembles has proven quite effective, yielding predictive accuracy that is stable and often superior to that obtained by a single classifier. In this context, this work proposes a new approach for dealing with imbalanced data sets, which uses ensembles of classifiers induced from balanced samples of the original data set. To that end, a multi-objective genetic algorithm is employed, which evolves the combination of examples composing the balanced samples, taking into account both the diversity and the area under the ROC curve (AUC) of the classifiers induced from these samples. (FAPES)
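The core idea of training ensemble members on balanced samples can be sketched briefly. Note the simplification: plain random undersampling of the majority class stands in for the paper's multi-objective genetic algorithm, and the data is a synthetic two-class toy set:

```python
import numpy as np
from sklearn.metrics import roc_auc_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Imbalanced toy data: 500 majority (class 0) vs 50 minority (class 1)
X = np.vstack([rng.normal(loc=0.0, size=(500, 2)),
               rng.normal(loc=1.5, size=(50, 2))])
y = np.array([0] * 500 + [1] * 50)

maj, mino = np.where(y == 0)[0], np.where(y == 1)[0]

# Each member is trained on a balanced sample: all minority examples plus
# an equal-size random undersample of the majority class (a placeholder
# for the evolutionary sample selection described in the abstract).
members = []
for _ in range(15):
    sel = np.concatenate([mino, rng.choice(maj, size=len(mino), replace=False)])
    members.append(DecisionTreeClassifier(max_depth=3, random_state=0)
                   .fit(X[sel], y[sel]))

# Ensemble score = average of member probabilities; evaluate by AUC,
# the metric the abstract optimizes alongside diversity
scores = np.mean([m.predict_proba(X)[:, 1] for m in members], axis=0)
auc = roc_auc_score(y, scores)
```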

    BaggingLMS: A bagging-based linear fusion with least-mean-square error update for regression

    The merits of linear decision fusion in multiple-learner systems have been widely accepted, and their practical applications are rich in the literature. In this paper we present a new linear decision fusion strategy named Bagging.LMS, which takes advantage of the least-mean-square (LMS) algorithm to update the fusion parameters in Bagging ensemble systems. In regression experiments on four synthetic and two benchmark data sets, we compared this method with the Bagging-based Simple Average and Adaptive Mixture of Experts ensemble methods. The empirical results show that the Bagging.LMS method may significantly reduce the regression errors relative to the other two types of Bagging ensembles, which indicates the superiority of the suggested Bagging.LMS method.
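The fusion idea can be sketched in a few lines: fit bagged members, then run an LMS pass over the member predictions to adapt the fusion weights away from simple averaging. This is a rough sketch under assumptions of my own (linear least-squares members, a single-pass LMS with step size mu, synthetic data), not the authors' exact procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (assumed for illustration)
X = rng.normal(size=(300, 4))
y = X @ np.array([2.0, -1.0, 0.5, 3.0]) + 0.2 * rng.normal(size=300)

# Bagging: fit each linear member by least squares on a bootstrap resample
M = 10
coefs = []
for _ in range(M):
    idx = rng.integers(0, len(X), len(X))
    c, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
    coefs.append(c)
F = X @ np.array(coefs).T        # (n, M) member predictions

# LMS update of the fusion weights: w += mu * e * f  (stochastic gradient
# on the squared fusion error), starting from simple averaging
w = np.full(M, 1.0 / M)
mu = 0.001
for f_row, target in zip(F, y):
    e = target - w @ f_row
    w += mu * e * f_row

fused = F @ w
mse = np.mean((fused - y) ** 2)
```

The step size mu must be small relative to the power of the member predictions for the LMS recursion to remain stable; the value here is an illustrative choice.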