The formal
guarantees provided by the AdaBoost learning procedure are
quite strong. It has been proved in [15] that the training error of
the strong classifier approaches zero exponentially in the
number of rounds. Gentle AdaBoost takes a newton steps for
optimization.
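The exponential decrease of training error can be observed directly in a toy implementation. The sketch below is illustrative only (the 1-D dataset, the exhaustive decision-stump learner, and all names are our own choices, not from [15]): each round picks the stump with lowest weighted error, reweights the misclassified examples, and records the training error of the running strong classifier.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D dataset: positives lie inside (-0.5, 0.5), so no single
# stump is perfect but a boosted combination of stumps is.
X = rng.uniform(-1.0, 1.0, size=200)
y = np.where(np.abs(X) < 0.5, 1.0, -1.0)

def best_stump(X, y, w):
    """Exhaustively pick the threshold/sign stump with lowest weighted error."""
    best = None
    for thresh in np.unique(X):
        for sign in (1.0, -1.0):
            pred = np.where(X < thresh, sign, -sign)
            err = np.sum(w[pred != y])
            if best is None or err < best[0]:
                best = (err, thresh, sign)
    return best

def adaboost(X, y, rounds):
    n = len(X)
    w = np.full(n, 1.0 / n)   # initial distribution over examples
    F = np.zeros(n)           # running margin of the strong classifier
    errors = []
    for _ in range(rounds):
        err, thresh, sign = best_stump(X, y, w)
        err = max(err, 1e-12)                  # guard against log(0)
        alpha = 0.5 * np.log((1.0 - err) / err)
        pred = np.where(X < thresh, sign, -sign)
        w *= np.exp(-alpha * y * pred)         # up-weight mistakes
        w /= w.sum()
        F += alpha * pred
        errors.append(float(np.mean(np.sign(F) != y)))
    return errors

errors = adaboost(X, y, rounds=10)
print(errors)
```

Printing `errors` shows the training error of the combined classifier falling as rounds accumulate, consistent with the exponential bound cited above.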