
    Generalized Entropy For Splitting On Numerical Attributes In Decision Trees

    Decision trees are well known for their training efficiency and their interpretable knowledge representation. They learn patterns by combining a greedy search with a divide-and-conquer strategy, where the greedy search relies on an evaluation criterion applied to the candidate splits at each node. Although many such criteria have been studied, none offers a significant improvement over the classical splitting approaches introduced in the early decision-tree literature. This paper presents a new evaluation rule for selecting candidate splits in decision tree classifiers. The experiments show that this new evaluation rule reduces the size of the resulting tree while maintaining the tree's accuracy.
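    For context, the following is a minimal Python sketch of the classical entropy-based (information-gain) evaluation of candidate splits on a single numerical attribute, the kind of baseline criterion the abstract refers to. It is not the paper's generalized entropy rule, whose exact form is not given in this abstract; the function names (entropy, best_numeric_split) and the NumPy implementation are illustrative assumptions only.

        import numpy as np

        def entropy(labels):
            # Shannon entropy of a class-label array (classical criterion,
            # not the paper's generalized entropy).
            _, counts = np.unique(labels, return_counts=True)
            p = counts / counts.sum()
            return -np.sum(p * np.log2(p))

        def best_numeric_split(values, labels):
            # Greedy, per-node scan over candidate thresholds on one
            # numerical attribute; returns the threshold with the highest
            # information gain.
            order = np.argsort(values)
            values, labels = values[order], labels[order]
            parent = entropy(labels)
            n = len(labels)
            best_gain, best_threshold = 0.0, None
            # Candidate thresholds: midpoints between consecutive distinct values.
            for i in range(1, n):
                if values[i] == values[i - 1]:
                    continue
                left, right = labels[:i], labels[i:]
                gain = parent - (len(left) / n) * entropy(left) \
                              - (len(right) / n) * entropy(right)
                if gain > best_gain:
                    best_gain = gain
                    best_threshold = (values[i - 1] + values[i]) / 2.0
            return best_threshold, best_gain

    A decision-tree learner would call such a routine for every numerical attribute at a node and split on the attribute-threshold pair with the best score; the paper's contribution is a replacement for the scoring function itself.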