Gradient Boosted Decision Tree (GBDT) is an additive decision tree
algorithm in which a series of boosted decision trees is
created; together the trees form a forest that acts as a single
predictive model [17]. It uses a decision tree as the weak learner
to build an additive predictive model on re-weighted data [17].
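The additive fitting loop can be made concrete with a minimal sketch, assuming squared-error loss so that each round's residuals serve as the re-weighted targets the next tree is fit to; the function names, hyperparameter values, and use of scikit-learn's DecisionTreeRegressor are illustrative assumptions, not details taken from [17].

    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    def boost_fit(X, y,
                  make_learner=lambda: DecisionTreeRegressor(max_depth=3),
                  n_rounds=100, learning_rate=0.1):
        # Additive boosting loop: each round fits a weak learner to the
        # residuals of the current model and adds its shrunken prediction.
        f0 = np.mean(y)                    # constant initial model
        pred = np.full(len(y), f0)
        ensemble = []
        for _ in range(n_rounds):
            learner = make_learner()       # fresh weak learner each round
            learner.fit(X, y - pred)       # residuals play the role of the
                                           # re-weighted targets
            pred += learning_rate * learner.predict(X)
            ensemble.append(learner)
        return f0, ensemble

    def boost_predict(f0, ensemble, X, learning_rate=0.1):
        # The trees form a forest that acts as one additive model:
        # the prediction is the sum of all weak learners' contributions.
        pred = np.full(X.shape[0], f0)
        for learner in ensemble:
            pred += learning_rate * learner.predict(X)
        return pred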
GBDT can also be described as a wrapper approach, in which a
decision tree is treated as the base learner (weak learner) and an
additive wrapper is applied around it. The advantages
of GBDT are: first, the base learner can be replaced by another
learner while the same wrapper is retained, as the sketch below
illustrates; second, unlike AdaBoost, no
initialization of the predictor variables is needed.
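As a sketch of the first advantage, the boost_fit wrapper above accepts any regressor factory, so the base learner can be swapped without changing the boosting loop itself; the toy data and the choice of LinearRegression as an alternative learner are hypothetical illustrations, not examples from [17].

    import numpy as np
    from sklearn.tree import DecisionTreeRegressor
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)
    X = rng.uniform(-3.0, 3.0, size=(200, 1))          # toy regression data
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)

    # Decision trees as the base learner (the GBDT case)...
    f0, forest = boost_fit(X, y, lambda: DecisionTreeRegressor(max_depth=3))

    # ...or a different base learner, with the wrapper itself unchanged.
    f0_lin, linear_ensemble = boost_fit(X, y, lambda: LinearRegression())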
The disadvantage of GBDT is that the boosting wrapper treats the
decision tree as a black box, so optimization is performed at the
level of individual trees rather than of the forest as a whole.