In this system, Gentle AdaBoost, a variant of AdaBoost, is used both to select the features and to train the classifier. The formal guarantees provided by the AdaBoost learning procedure are quite strong: it has been proved that the training error of the strong classifier approaches zero exponentially in the number of rounds. Gentle AdaBoost optimizes the exponential loss by taking Newton steps, fitting each weak learner to the labels by weighted least squares.
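To make the Newton-step idea concrete, the following is a minimal sketch of Gentle AdaBoost with regression stumps, not the paper's implementation: each round fits a real-valued stump f_m by weighted least squares (the Newton step on the exponential loss) and reweights examples by exp(-y_i f_m(x_i)). All function names here are illustrative.

```python
import numpy as np

def fit_stump(X, y, w):
    """Weighted least-squares regression stump: f(x) = a if x[:, j] > thr
    else b, where a and b are the weighted means of y on each side."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            mask = X[:, j] > thr
            if mask.all() or not mask.any():
                continue
            a = np.average(y[mask], weights=w[mask])
            b = np.average(y[~mask], weights=w[~mask])
            err = np.sum(w * (y - np.where(mask, a, b)) ** 2)
            if err < best_err:
                best_err, best = err, (j, thr, a, b)
    return best

def gentle_adaboost(X, y, rounds=10):
    """Each round is one Newton step: fit f_m by weighted least squares,
    add it to the ensemble, then update w_i *= exp(-y_i f_m(x_i))."""
    w = np.full(len(y), 1.0 / len(y))
    stumps = []
    for _ in range(rounds):
        j, thr, a, b = fit_stump(X, y, w)
        f = np.where(X[:, j] > thr, a, b)
        stumps.append((j, thr, a, b))
        w *= np.exp(-y * f)   # misclassified examples gain weight
        w /= w.sum()          # renormalize the distribution
    return stumps

def predict(stumps, X):
    """Strong classifier: sign of the summed weak-learner outputs."""
    F = sum(np.where(X[:, j] > thr, a, b) for j, thr, a, b in stumps)
    return np.sign(F)
```

On a linearly separable toy problem, a handful of rounds already drives the training error close to zero, consistent with the exponential convergence guarantee cited above.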