We present an empirical comparison of the AUC performance
of seven supervised learning methods: SVMs, neural
nets, decision trees, k-nearest neighbor, bagged trees, boosted
trees, and boosted stumps. Overall, boosted trees have the
best average AUC performance, followed by bagged trees, neural
nets and SVMs. We then present an ensemble selection
method that yields even better AUC. Ensembles are built
with forward stepwise selection: at each step, the model that
most improves ensemble AUC is added. The proposed method
builds ensembles that outperform the best individual model
on all seven test problems.
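The forward stepwise selection described above can be sketched as follows. This is a minimal illustration, not the paper's exact procedure: the pairwise AUC computation, the score-averaging combiner, and the fixed step count are simplifying assumptions (since AUC is rank-based, averaging the selected models' scores is equivalent to summing them).

```python
def auc(labels, scores):
    """Wilcoxon-Mann-Whitney AUC: probability that a random positive
    example is scored above a random negative one (ties count 0.5)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def ensemble_select(labels, model_scores, steps=10):
    """Greedy forward selection with replacement: at each step, add the
    model (possibly again) that maximizes AUC of the summed scores.
    `model_scores` maps a model name to its per-example scores."""
    n = len(labels)
    total = [0.0] * n  # running sum of the selected models' scores
    chosen = []
    for _ in range(steps):
        best_name, best_auc = None, -1.0
        for name, scores in model_scores.items():
            candidate = [t + s for t, s in zip(total, scores)]
            a = auc(labels, candidate)
            if a > best_auc:
                best_auc, best_name = a, name
        chosen.append(best_name)
        total = [t + s for t, s in zip(total, model_scores[best_name])]
    return chosen, auc(labels, total)
```

Selection is with replacement, so a strong model can be added repeatedly, which effectively up-weights it in the average; the first model chosen is always the one with the best individual AUC.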
