Alternating decision trees are induced using a real-valued formulation of AdaBoost [14]. At each boosting iteration, three nodes are added to the tree: a splitter node that attempts to split sets of instances into pure subsets, and two prediction nodes, one for each of the splitter node's subsets. The position of this new splitter node is determined by examining all prediction nodes and choosing the position that yields the globally best improvement of the purity score.
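The iteration described above can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: it assumes binary labels in {-1, +1}, represents each existing prediction node by its precondition (a boolean function of an instance), and uses the standard ADTree Z purity score and half-log-odds prediction values; all function and variable names are hypothetical.

```python
import math

def weight_sums(instances, weights, cond):
    """Total weight of positive / negative instances satisfying cond."""
    wp = sum(w for (x, y), w in zip(instances, weights) if y > 0 and cond(x))
    wn = sum(w for (x, y), w in zip(instances, weights) if y < 0 and cond(x))
    return wp, wn

def boost_iteration(instances, weights, preconditions, tests, eps=1e-9):
    """One boosting iteration: try attaching each candidate test under each
    existing prediction node (precondition), keep the pair with the lowest
    Z purity score, and return the new rule plus reweighted instances."""
    best = None
    for pre in preconditions:
        for test in tests:
            # Weights of the two subsets the splitter would create.
            wp1, wn1 = weight_sums(instances, weights,
                                   lambda x: pre(x) and test(x))
            wp2, wn2 = weight_sums(instances, weights,
                                   lambda x: pre(x) and not test(x))
            w_rest = sum(weights) - (wp1 + wn1 + wp2 + wn2)
            # Z score: small when both subsets are pure.
            z = 2 * (math.sqrt(wp1 * wn1) + math.sqrt(wp2 * wn2)) + w_rest
            if best is None or z < best[0]:
                best = (z, pre, test, wp1, wn1, wp2, wn2)
    _, pre, test, wp1, wn1, wp2, wn2 = best
    # Prediction values of the two new prediction nodes (half log-odds).
    a = 0.5 * math.log((wp1 + eps) / (wn1 + eps))
    b = 0.5 * math.log((wp2 + eps) / (wn2 + eps))
    def rule(x):
        if not pre(x):
            return 0.0
        return a if test(x) else b
    # AdaBoost reweighting: scale each weight by exp(-y * r(x)), so
    # correctly predicted instances lose weight.
    new_weights = [w * math.exp(-y * rule(x))
                   for (x, y), w in zip(instances, weights)]
    return rule, new_weights
```

A usage sketch: with a single root precondition (always true) and a few candidate threshold tests, the iteration picks the test that best separates the classes and returns a signed prediction rule for instances reaching it.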