A System for Induction of Oblique Decision Trees
This article describes a new system for induction of oblique decision trees.
This system, OC1, combines deterministic hill-climbing with two forms of
randomization to find a good oblique split (in the form of a hyperplane) at
each node of a decision tree. Oblique decision tree methods are tuned
especially for domains in which the attributes are numeric, although they can
be adapted to symbolic or mixed symbolic/numeric attributes. We present
extensive empirical studies, using both real and artificial data, that analyze
OC1's ability to construct oblique trees that are smaller and more accurate
than their axis-parallel counterparts. We also examine the benefits of
randomization for the construction of oblique decision trees. Comment: See
http://www.jair.org/ for an online appendix and other files accompanying this
article.
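As a rough illustration of the hill-climbing-plus-randomization idea (a sketch, not the OC1 implementation, which uses its own impurity measures and perturbation schedule), the following code searches for a single oblique split by perturbing one hyperplane coefficient at a time and restarting from random hyperplanes; all function names and parameter values are illustrative assumptions.

```python
import numpy as np

def gini(y):
    # Gini impurity of a label vector; 0 for a pure node.
    if len(y) == 0:
        return 0.0
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def split_impurity(X, y, w):
    # Weighted Gini impurity of the partition induced by the oblique
    # hyperplane w: points with X @ w[:-1] + w[-1] > 0 go to one side.
    side = X @ w[:-1] + w[-1] > 0
    n = len(y)
    return (side.sum() * gini(y[side]) + (~side).sum() * gini(y[~side])) / n

def oblique_split(X, y, restarts=5, iters=200, step=0.5, seed=0):
    # Deterministic hill-climbing with random restarts: perturb one
    # coefficient at a time, keep the change only if impurity drops.
    rng = np.random.default_rng(seed)
    d = X.shape[1] + 1              # attribute weights plus a bias term
    best_w, best_imp = None, np.inf
    for _ in range(restarts):
        w = rng.normal(size=d)      # random initial hyperplane
        imp = split_impurity(X, y, w)
        for _ in range(iters):
            j = rng.integers(d)     # coefficient to perturb this step
            cand = w.copy()
            cand[j] += rng.normal(scale=step)
            cand_imp = split_impurity(X, y, cand)
            if cand_imp < imp:
                w, imp = cand, cand_imp
        if imp < best_imp:
            best_w, best_imp = w, imp
    return best_w, best_imp
```

On numeric data separable only by a diagonal boundary, a single oblique hyperplane found this way can reach zero impurity where any axis-parallel split cannot, which is the motivation for oblique trees being smaller than their axis-parallel counterparts.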
Decision support methods in diabetic patient management by insulin administration: neural network vs. induction methods for knowledge classification
Diabetes mellitus is now recognised as a major worldwide
public health problem. At present, about 100
million people are registered as diabetic patients. Many
clinical, social and economic problems occur as a
consequence of insulin-dependent diabetes. Treatment
attempts to prevent or delay complications by applying
‘optimal’ glycaemic control. Therefore, there is a
continuous need for effective monitoring of the patient.
Given the popularity of decision tree learning
algorithms as well as neural networks for knowledge
classification, which is further used for decision
support, this paper examines their relative merits by
applying one algorithm from each family to a medical
problem: that of recommending a particular diabetes
regime. For the purposes of this study, OC1, a
descendant of Quinlan’s ID3 algorithm, was chosen as
the decision tree learning algorithm, and a generating-
shrinking algorithm for learning arbitrary
classifications as the neural network algorithm. These
systems were trained on 646 cases derived from two
countries in Europe and were tested on 100 cases
that were different from the original 646 cases.
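The study's data are not public, but the tree-vs-network comparison it describes can be sketched on synthetic data. Here scikit-learn's DecisionTreeClassifier and MLPClassifier are stand-ins for OC1 and the generating-shrinking network (not the implementations used in the paper), and the 746 synthetic cases merely mirror the 646-training / 100-test split.

```python
# Hypothetical stand-ins: an axis-parallel tree and a multilayer
# perceptron replace OC1 and the generating-shrinking network.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier

# 746 synthetic cases, split 646 for training and 100 for testing,
# mirroring the study's design (three classes stand in for regimes).
X, y = make_classification(n_samples=746, n_features=10, n_informative=5,
                           n_classes=3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=100,
                                          random_state=0)

tree = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000,
                    random_state=0).fit(X_tr, y_tr)

print(f"tree accuracy: {tree.score(X_te, y_te):.2f}")
print(f"net  accuracy: {net.score(X_te, y_te):.2f}")
```

Holding out a fixed test set that the learners never see during training, as the study does, is what makes the two accuracy figures comparable.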
Differential Evolution Algorithm in the Construction of Interpretable Classification Models
In this chapter, the application of a differential evolution-based approach to induce oblique decision trees (DTs) is described. This type of decision tree uses a linear combination of attributes to build oblique hyperplanes dividing the instance space. Oblique decision trees are more compact and accurate than traditional univariate decision trees. On the other hand, as differential evolution (DE) is an efficient evolutionary algorithm (EA) designed to solve optimization problems with real-valued parameters, and since finding an optimal hyperplane is a hard computational task, this metaheuristic (MH) is chosen to conduct an intelligent search for a near-optimal solution. Two methods are described in this chapter: one implementing a recursive partitioning strategy to find the most suitable oblique hyperplane for each internal node of a decision tree, and the other conducting a global search for a near-optimal oblique decision tree. A statistical analysis of the experimental results suggests that these methods show better performance as decision tree induction procedures in comparison with other supervised learning approaches.
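A minimal sketch of the recursive-partitioning variant's core step follows, assuming a standard DE/rand/1/bin scheme over hyperplane coefficients with Gini impurity as the fitness; the names and hyperparameter values are illustrative, not the chapter's actual settings.

```python
import numpy as np

def gini(y):
    # Gini impurity of a label vector; 0 for a pure node.
    if len(y) == 0:
        return 0.0
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def split_impurity(X, y, w):
    # Fitness to minimize: weighted Gini impurity of the partition
    # induced by the oblique hyperplane w (weights plus bias).
    side = X @ w[:-1] + w[-1] > 0
    n = len(y)
    return (side.sum() * gini(y[side]) + (~side).sum() * gini(y[~side])) / n

def de_oblique_split(X, y, pop_size=20, gens=60, F=0.8, CR=0.9, seed=0):
    # DE/rand/1/bin over real-valued hyperplane coefficients.
    rng = np.random.default_rng(seed)
    d = X.shape[1] + 1
    pop = rng.normal(size=(pop_size, d))
    fit = np.array([split_impurity(X, y, w) for w in pop])
    for _ in range(gens):
        for i in range(pop_size):
            # Mutation: combine three distinct individuals (rand/1).
            idx = [j for j in range(pop_size) if j != i]
            a, b, c = pop[rng.choice(idx, 3, replace=False)]
            mutant = a + F * (b - c)
            # Binomial crossover, guaranteeing one gene from the mutant.
            cross = rng.random(d) < CR
            cross[rng.integers(d)] = True
            trial = np.where(cross, mutant, pop[i])
            # Greedy selection: the trial replaces the target if no worse.
            f = split_impurity(X, y, trial)
            if f <= fit[i]:
                pop[i], fit[i] = trial, f
    best = fit.argmin()
    return pop[best], fit[best]
```

In the recursive-partitioning method this search would be rerun at every internal node on that node's subset of instances; the chapter's global variant instead encodes a whole tree's hyperplanes as one candidate solution.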