
Building Combined Classifiers

By Mark Eastwood and Bogdan Gabrys


This chapter covers different approaches that may be taken when building an ensemble method, through studying specific examples of each approach from research conducted by the authors. A method called Negative Correlation Learning illustrates a decision-level combination approach in which the individual classifiers are trained co-operatively. The model-level combination paradigm is illustrated via a tree combination method. Finally, another variant of the decision-level paradigm, with individuals trained independently rather than co-operatively, is discussed as applied to churn prediction in the telecommunications industry.
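The decision-level, co-operative training idea behind Negative Correlation Learning can be sketched briefly. In NCL each ensemble member minimises its own squared error plus a penalty that pushes its output away from the ensemble mean, which in the common formulation yields the per-member gradient term (f_i - y) - lambda (f_i - fbar). The sketch below is a minimal, hypothetical illustration on a toy regression task using fixed random features and linear output weights; the data, sizes, and hyperparameters (M, H, lam, lr) are assumptions for demonstration, not the authors' experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (hypothetical): y = sin(x) plus noise.
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)

M, H, lam, lr = 5, 20, 0.5, 0.05  # members, features, NCL lambda, step size

def make_member():
    W = rng.normal(size=(1, H))   # fixed random hidden projection
    b = rng.normal(size=H)
    w = np.zeros(H)               # trainable output weights
    return W, b, w

members = [make_member() for _ in range(M)]

def hidden(W, b):
    return np.tanh(X @ W + b)     # (n, H) feature matrix for one member

for _ in range(300):
    feats = [hidden(W, b) for W, b, _ in members]
    preds = np.array([phi @ w for phi, (_, _, w) in zip(feats, members)])
    fbar = preds.mean(axis=0)     # decision-level combination: the ensemble mean
    for i, (phi, (W, b, w)) in enumerate(zip(feats, members)):
        # NCL update: squared-error gradient minus a correlation penalty
        # that pushes each member's output away from the ensemble mean.
        delta = (preds[i] - y) - lam * (preds[i] - fbar)
        w -= lr * phi.T @ delta / len(y)

final = np.mean([hidden(W, b) @ w for W, b, w in members], axis=0)
mse = float(np.mean((final - y) ** 2))
print(round(mse, 3))
```

With lam = 0 each member is trained independently on its own error, recovering the independently-trained decision-level variant the chapter also discusses; lam > 0 couples the members through the shared ensemble mean.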

Topics: aintel, csi
Publisher: EXIT Publishing House
Year: 2008

