Here we apply the boosting algorithm AdaBoost to generate
different trees [13]. It works by repeatedly running
a given weak learning algorithm (e.g., a simple decision
tree) on various distributions over the training data, and then
combining the classifiers produced by the weak learner
into a single composite classifier. It should be noted,
however, that no algorithm can guarantee to find all of the
good, distinct trees.
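
To make the reweighting loop concrete, the following is a minimal sketch in Python (not the exact procedure of [13]; the round count n_rounds, the stump-based weak learner, and the use of labels in {-1, +1} are illustrative assumptions). Each round trains a decision stump on the current distribution over the training data, up-weights the misclassified examples, and the resulting stumps are combined by a weighted vote:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_rounds=10):
    """Sketch of AdaBoost; assumes labels y are in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)                  # start from the uniform distribution
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)     # weak learner on current distribution
        pred = stump.predict(X)
        err = np.sum(w * (pred != y))        # weighted training error
        if err <= 0 or err >= 0.5:           # weak-learning assumption violated
            break
        alpha = 0.5 * np.log((1 - err) / err)  # weight of this classifier
        w *= np.exp(-alpha * y * pred)       # up-weight misclassified examples
        w /= w.sum()                         # renormalize to a distribution
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(stumps, alphas, X):
    """Composite classifier: weighted-majority vote of the weak learners."""
    votes = sum(a * s.predict(X) for a, s in zip(alphas, stumps))
    return np.sign(votes)
```

Calling adaboost_fit on a labelled sample and then adaboost_predict on new points yields the single composite classifier described above; because each round sees a different distribution, the individual stumps tend to differ from one another.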