Decision trees (Quinlan 1993; Breiman et al. 1984)
are commonly built by recursive partitioning. A univariate
(single-attribute) split is chosen for the root
of the tree using some criterion (e.g., mutual information,
gain ratio, or the Gini index). The data are then divided
according to the test, and the process repeats
recursively for each child. After a full tree is built, a
pruning step is executed to reduce the tree's size.
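As an illustration of the recursive-partitioning scheme described above, the following is a minimal ID3-style sketch in Python: it selects the univariate split that maximizes mutual information (information gain) and recurses on each child. It is a simplification, not C4.5 itself; the gain-ratio correction, continuous attributes, and the pruning step are omitted, and the dictionary-based tree representation is an assumption made for brevity.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr):
    """Mutual information between attribute `attr` and the class label."""
    n = len(labels)
    groups = {}
    for row, y in zip(rows, labels):
        groups.setdefault(row[attr], []).append(y)
    remainder = sum(len(g) / n * entropy(g) for g in groups.values())
    return entropy(labels) - remainder

def build_tree(rows, labels, attrs):
    """Recursively partition `rows` on the highest-gain attribute.

    Leaves are majority-class labels; internal nodes are dicts with
    the chosen attribute and one child per observed attribute value.
    """
    # Stop when the node is pure or no attributes remain to test.
    if len(set(labels)) == 1 or not attrs:
        return Counter(labels).most_common(1)[0][0]
    best = max(attrs, key=lambda a: information_gain(rows, labels, a))
    node = {"attr": best, "children": {}}
    for value in set(row[best] for row in rows):
        sub = [(r, y) for r, y in zip(rows, labels) if r[best] == value]
        node["children"][value] = build_tree(
            [r for r, _ in sub], [y for _, y in sub], attrs - {best}
        )
    return node
```

For example, on a toy dataset where the label is determined entirely by attribute `a`, the split criterion selects `a` at the root and each child becomes a pure leaf:

```python
rows = [{"a": 0, "b": 0}, {"a": 0, "b": 1}, {"a": 1, "b": 0}, {"a": 1, "b": 1}]
labels = ["no", "no", "yes", "yes"]
tree = build_tree(rows, labels, {"a", "b"})
# tree == {"attr": "a", "children": {0: "no", 1: "yes"}}
```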
In the experiments, we compared our results with
C4.5 (Quinlan 1993), a state-of-the-art decision-tree
induction algorithm.