The best learning algorithms for AUC on the seven test problems
are boosted full decision trees, bagged decision trees,
neural nets, and SVMs. Surprisingly, maximum margin methods
such as SVMs and boosted decision trees yield excellent
AUC performance. We had not expected that maximizing the
margin to a decision boundary would provide a good basis for
ordering cases that fall far from those boundaries. We were
able to obtain good AUC performance with each learning
algorithm by thoroughly tuning each algorithm's parameters.
Nevertheless, KNN, plain decision trees (including smoothed
probabilistic trees), and boosted stumps usually did not yield
AUC performance that was competitive with the best models.
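Since AUC rewards a correct ordering of cases rather than calibrated distances to a decision boundary, it can be computed directly as the fraction of positive/negative pairs that a model's scores rank correctly. The sketch below illustrates this pairwise definition; the labels and scores are hypothetical values chosen for illustration, not data from the seven test problems.

```python
def auc(labels, scores):
    """AUC as the probability that a randomly chosen positive case
    is scored above a randomly chosen negative case (ties count 0.5)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    # Count correctly ordered positive/negative pairs.
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical example: one positive case (score 0.4) is ranked
# below a negative case (score 0.6), so 8 of 9 pairs are correct.
labels = [1, 1, 0, 1, 0, 0]
scores = [0.9, 0.7, 0.6, 0.4, 0.3, 0.1]
print(auc(labels, scores))  # 8/9 ≈ 0.889
```

This pairwise form makes clear why a model can achieve high AUC even when its scores are poorly calibrated: only the relative ordering of cases matters.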