Although this thesis focuses on combining neural networks, for completeness of this
section we feel it is necessary to mention that a few experiments have been done with
hybrid ensembles. Wang et al. [146] combined decision trees with neural networks; they
found that the system performed best when the neural networks outnumbered the decision
trees but at least one decision tree was present. Langdon [82] combines decision trees
with neural networks in an ensemble, and uses Genetic Programming to evolve a suitable
combination rule. Woods et al. [150] combine neural networks, k-nearest neighbour
classifiers, decision trees, and Quadratic Bayes classifiers in a single ensemble, then
use estimates of local accuracy in the feature space to choose one classifier to respond
to each new input pattern.
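The selection scheme of Woods et al. can be illustrated with a minimal sketch: for a new input, find its k nearest neighbours in a held-out validation set, score each ensemble member by its accuracy on those neighbours, and let the locally most accurate member classify the input. The function name, the use of Euclidean distance, and the representation of classifiers as prediction callables are our own assumptions for illustration, not details taken from [150].

```python
import numpy as np

def select_by_local_accuracy(x, classifiers, X_val, y_val, k=5):
    """Classify x with the ensemble member that is most accurate
    on the k validation points nearest to x (a sketch of
    selection by estimated local accuracy)."""
    # Euclidean distances from x to every validation point
    dists = np.linalg.norm(X_val - x, axis=1)
    nearest = np.argsort(dists)[:k]
    # Accuracy of each classifier in this local region
    local_acc = [np.mean(clf(X_val[nearest]) == y_val[nearest])
                 for clf in classifiers]
    best = int(np.argmax(local_acc))
    # Only the locally best classifier responds to the input
    return classifiers[best](x[None])[0]

# Toy illustration: two deliberately biased classifiers, each
# accurate in a different region of a one-dimensional feature space.
X_val = np.array([[0.0], [0.1], [0.2], [5.0], [5.1], [5.2]])
y_val = np.array([0, 0, 0, 1, 1, 1])
always_zero = lambda X: np.zeros(len(X), dtype=int)
always_one = lambda X: np.ones(len(X), dtype=int)

pred_low = select_by_local_accuracy(np.array([0.05]),
                                    [always_zero, always_one],
                                    X_val, y_val, k=3)   # -> 0
pred_high = select_by_local_accuracy(np.array([5.05]),
                                     [always_zero, always_one],
                                     X_val, y_val, k=3)  # -> 1
```

Because the choice is made independently for every query, different classifiers may answer in different regions of the feature space, which is the point of the scheme: no single combination rule is imposed globally.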
