
Inducing safer oblique trees without costs

By S Vadera


Decision tree induction has been widely studied and applied. In safety applications, such as determining whether a chemical process is safe or whether a person has a medical condition, the cost of misclassification in one class is significantly higher than in the other. Several authors have tackled this problem by developing cost-sensitive decision tree learning algorithms, or by changing the distribution of training examples to bias the learning process so as to take account of costs. A prerequisite for applying such algorithms is the availability of misclassification costs. Although this may be possible for some applications, obtaining reasonable cost estimates is not easy in the area of safety. This paper presents a new algorithm for applications where the costs of misclassification cannot be quantified, although the cost of misclassification in one class is known to be significantly higher than in another. The algorithm utilizes linear discriminant analysis to identify oblique relationships between continuous attributes, and then carries out an appropriate modification to ensure that the resulting tree errs on the side of safety. The algorithm is evaluated against one of the best-known cost-sensitive algorithms (ICET), a well-known oblique decision tree algorithm (OC1), and an algorithm that utilizes robust linear programming.
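The idea described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's algorithm: it assumes Fisher's linear discriminant for the oblique direction, and it assumes the "safety modification" means shifting the split threshold so that no training example of the higher-cost (unsafe) class ends up on the safe side of the split. The function names and the specific shifting rule are the author's-abstract-inspired assumptions, not taken from the paper itself.

```python
import numpy as np

def lda_direction(X, y):
    # Fisher's linear discriminant direction: w = Sw^{-1} (mu1 - mu0),
    # where Sw is the pooled within-class scatter.
    X0, X1 = X[y == 0], X[y == 1]
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    Sw = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)
    return np.linalg.solve(Sw, mu1 - mu0)

def safe_oblique_split(X, y, unsafe_class=1):
    # Project examples onto the LDA direction; the conventional
    # threshold is the midpoint of the projected class means.
    w = lda_direction(X, y)
    z = X @ w
    midpoint = 0.5 * (z[y == 0].mean() + z[y == 1].mean())
    # Hypothetical safety modification: shift the threshold so that every
    # training example of the unsafe class falls on the unsafe side,
    # trading extra false alarms for fewer costly misses.
    z_unsafe = z[y == unsafe_class]
    if z_unsafe.mean() > midpoint:
        threshold = min(midpoint, z_unsafe.min())
    else:
        threshold = max(midpoint, z_unsafe.max())
    return w, threshold
```

With linearly separable data the shift leaves the midpoint unchanged; the modification only moves the threshold when some unsafe examples would otherwise cross to the safe side.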

Topics: QA75, QA, other
Publisher: Wiley Blackwell Publishing
