Several approaches can be used to generate ensembles; the two most common techniques are Bagging and Boosting. In Bagging, we sample the training set with replacement and build a classifier on each bootstrap sample. In a bootstrap sample of size N drawn from N records, each record has probability 1 − (1 − 1/N)^N of being selected at least once; since (1 − 1/N)^N → 1/e as N grows, this probability converges to 1 − 1/e ≈ 0.632.
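As a minimal sketch of Bagging (not from the original text; the function names bagging_fit and bagging_predict, the scikit-learn decision-tree base learner, and integer class labels are all illustrative assumptions), the snippet below draws bootstrap samples, fits one classifier per sample, and combines predictions by majority vote:

```python
# Minimal Bagging sketch: one base classifier per bootstrap sample,
# combined by majority vote. Assumes X is a NumPy array and y holds
# non-negative integer class labels.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def bagging_fit(X, y, n_estimators=10, seed=0):
    rng = np.random.default_rng(seed)
    n = len(X)
    models = []
    for _ in range(n_estimators):
        # Sample N indices with replacement: each record lands in a given
        # bootstrap sample with probability 1 - (1 - 1/N)^N ~ 0.632.
        idx = rng.integers(0, n, size=n)
        models.append(DecisionTreeClassifier().fit(X[idx], y[idx]))
    return models

def bagging_predict(models, X):
    # Majority vote: for each record, take the most frequent predicted label.
    preds = np.stack([m.predict(X) for m in models])
    return np.array([np.bincount(col).argmax() for col in preds.T])
```

Because each bootstrap sample omits roughly 36.8% of the records, the base classifiers are trained on different data, and the majority vote can average away some of their individual errors.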
In Boosting, we use an iterative procedure to adaptively change the distribution of the training data, focusing more on previously misclassified records. Initially, all records are assigned equal weights. Unlike Bagging, the weights may change at the end of each boosting round: records that are wrongly classified have their weights increased, while records that are classified correctly have their weights decreased. An example of boosting is the AdaBoost algorithm.
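A minimal sketch of AdaBoost's reweighting loop is given below, under the common binary formulation with labels in {−1, +1}; the function names and the choice of depth-1 decision stumps as weak learners are illustrative assumptions, not details from the text:

```python
# Minimal AdaBoost sketch for binary labels in {-1, +1}. Each round fits
# a weak learner on the weighted data, then reweights: wrongly classified
# records gain weight, correctly classified ones lose weight.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_rounds=10):
    n = len(X)
    w = np.full(n, 1.0 / n)               # initially all weights are equal
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = max(np.sum(w * (pred != y)), 1e-10)  # weighted training error
        if err >= 0.5:                    # weak learner no better than chance
            break
        alpha = 0.5 * np.log((1 - err) / err)
        # y * pred is +1 for correct records and -1 for wrong ones, so this
        # shrinks the weights of correct records and grows those of wrong ones.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()                      # renormalize to a distribution
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(stumps, alphas, X):
    # Final classifier: sign of the alpha-weighted vote of the weak learners.
    scores = sum(a * s.predict(X) for a, s in zip(alphas, stumps))
    return np.sign(scores)
```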