Researcher: Shuo Wang and Xin Yao (2009) [55]
Contribution: Explores the impact of ensemble diversity on overall performance for imbalanced data sets
Techniques: Using UnderBagging, OverBagging, and SMOTEBagging
Achievement: Both overall performance (G-mean) and diversity degree improve
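The resampling step behind UnderBagging can be sketched as follows. This is a minimal pure-NumPy illustration, not the authors' implementation: it assumes binary labels 0/1 with class 1 as the minority, the function name `underbagging_samples` is hypothetical, and the training of a classifier on each bag is omitted.

```python
import numpy as np

def underbagging_samples(X, y, n_bags=5, rng=None):
    # UnderBagging idea: each bag pairs a bootstrap of the minority
    # class with an equally sized random subset of the majority class,
    # so every base learner sees a balanced training set.
    rng = np.random.default_rng(rng)
    minority = np.flatnonzero(y == 1)
    majority = np.flatnonzero(y == 0)
    bags = []
    for _ in range(n_bags):
        min_idx = rng.choice(minority, size=minority.size, replace=True)
        maj_idx = rng.choice(majority, size=minority.size, replace=False)
        idx = np.concatenate([min_idx, maj_idx])
        bags.append((X[idx], y[idx]))
    return bags

X = np.arange(40).reshape(20, 2)
y = np.array([0] * 15 + [1] * 5)          # imbalanced: 15 vs. 5
bags = underbagging_samples(X, y, n_bags=3, rng=0)
```

OverBagging and SMOTEBagging differ only in this resampling step: they grow the minority side of each bag (by random oversampling or SMOTE) instead of shrinking the majority side.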
Researcher: Putthiporn and Chidchanok (2013) [2]
Contribution: The location of the separating function in each class is defined only by the boundary data
Techniques: Using bootstrapping and an AdaBoost neural network
Achievement: The proposed method achieves higher accuracy on both minority and majority classes
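The bootstrapping stage of this pipeline can be sketched briefly. This is only the resampling half, assuming the usual draw-with-replacement bootstrap; the boundary-data selection and the AdaBoost neural-network stage are omitted, and the name `bootstrap_sets` is illustrative.

```python
import numpy as np

def bootstrap_sets(X, y, n_sets, rng=None):
    # Draw n_sets bootstrap resamples: each set samples n points
    # with replacement from the original n training points.
    rng = np.random.default_rng(rng)
    n = len(y)
    sets = []
    for _ in range(n_sets):
        idx = rng.integers(0, n, size=n)
        sets.append((X[idx], y[idx]))
    return sets

X = np.arange(12).reshape(6, 2)
y = np.array([0, 0, 0, 1, 1, 1])
sets = bootstrap_sets(X, y, n_sets=4, rng=1)
```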
Researcher: Seiffert et al. (2010) [45]
Contribution: A new hybrid sampling/boosting algorithm
Techniques: Using random undersampling, the synthetic minority over-sampling technique (SMOTE), and AdaBoost
Achievement: A new method called RUSBoost, which significantly outperforms SMOTEBoost
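The core of RUSBoost is to apply random undersampling before each boosting round while keeping AdaBoost's weights over the full training set. The sketch below is a simplified illustration under two stated assumptions: labels are in {-1, +1} with +1 the minority class, and the base learner is a one-feature threshold stump rather than the decision trees used in the paper.

```python
import numpy as np

def fit_stump(X, y, w):
    # Weighted one-feature threshold stump (stand-in base learner).
    best, best_err = None, np.inf
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f]):
            for sign in (1, -1):
                pred = np.where(X[:, f] <= t, sign, -sign)
                err = w[pred != y].sum()
                if err < best_err:
                    best_err, best = err, (f, t, sign)
    return best

def predict_stump(stump, X):
    f, t, sign = stump
    return np.where(X[:, f] <= t, sign, -sign)

def rusboost(X, y, n_rounds=10, rng=None):
    # Each round: randomly undersample the majority class (label -1)
    # down to the minority-class size, fit the weak learner on that
    # balanced subset, then do a standard AdaBoost weight update on
    # the full data set.
    rng = np.random.default_rng(rng)
    n = len(y)
    w = np.full(n, 1.0 / n)
    minority = np.flatnonzero(y == 1)
    majority = np.flatnonzero(y == -1)
    ensemble = []
    for _ in range(n_rounds):
        maj = rng.choice(majority, size=minority.size, replace=False)
        idx = np.concatenate([minority, maj])
        stump = fit_stump(X[idx], y[idx], w[idx])
        pred = predict_stump(stump, X)
        err = np.clip(w[pred != y].sum(), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        w = w * np.exp(-alpha * y * pred)
        w /= w.sum()
        ensemble.append((alpha, stump))
    return ensemble

def predict_ensemble(ensemble, X):
    score = sum(a * predict_stump(s, X) for a, s in ensemble)
    return np.where(score >= 0, 1, -1)

X = np.arange(10, dtype=float).reshape(-1, 1)
y = np.where(X[:, 0] >= 7, 1, -1)   # minority: the three largest points
model = rusboost(X, y, n_rounds=10, rng=0)
pred = predict_ensemble(model, X)
```

Because only the per-round training subset shrinks, the weight distribution over the whole data set is preserved between rounds, which is what distinguishes RUSBoost from simply undersampling once up front.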
Researcher: Dittman et al. (2015) [5]
Contribution: Investigates three data-sampling options (no data sampling, RUS with a 35:65 post-sampling class distribution ratio, and RUS with a 50:50 post-sampling class distribution ratio)
Techniques: Using random undersampling and random forest
Achievement: Data sampling does improve the classification performance of random forest
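The post-sampling class-distribution ratios studied here can be illustrated with a small sketch. This assumes binary labels 0/1 with class 1 as the minority; the helper name `rus_to_ratio` is hypothetical, and the random-forest training step is omitted.

```python
import numpy as np

def rus_to_ratio(y, minority_frac, rng=None):
    # Random undersampling (RUS): drop majority-class points (label 0)
    # until the minority class (label 1) makes up `minority_frac` of
    # the post-sampling data, e.g. 0.35 for a 35:65 ratio or 0.5 for
    # a 50:50 ratio. Returns the kept indices.
    rng = np.random.default_rng(rng)
    minority = np.flatnonzero(y == 1)
    majority = np.flatnonzero(y == 0)
    n_keep = int(round(minority.size * (1 - minority_frac) / minority_frac))
    keep = rng.choice(majority, size=min(n_keep, majority.size), replace=False)
    return np.sort(np.concatenate([minority, keep]))

y = np.array([1] * 10 + [0] * 90)         # 10:90 before sampling
idx_5050 = rus_to_ratio(y, 0.5, rng=0)    # keeps 10 minority + 10 majority
idx_3565 = rus_to_ratio(y, 0.35, rng=0)   # keeps 10 minority + ~19 majority
```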
Researcher: José A. Sáez et al. (2015) [10]
Contribution: Reduces the noise and makes the class boundaries more regular
Techniques: Using a re-sampling method with filtering
Achievement: A new method called SMOTE–IPF, which performs better than other re-sampling methods
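The SMOTE half of SMOTE–IPF can be sketched in a few lines. This is a minimal pure-NumPy interpolation sketch, not the authors' implementation: the iterative-partitioning noise filter (IPF) that follows SMOTE is omitted, and the function name `smote` is used only for illustration.

```python
import numpy as np

def smote(X_min, n_synthetic, k=5, rng=None):
    # SMOTE idea: create a synthetic minority sample by interpolating
    # between a minority point and one of its k nearest minority-class
    # neighbours, chosen and weighted at random.
    rng = np.random.default_rng(rng)
    n = X_min.shape[0]
    k = min(k, n - 1)
    # pairwise distances within the minority class
    d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    neighbours = np.argsort(d, axis=1)[:, :k]
    samples = []
    for _ in range(n_synthetic):
        i = rng.integers(n)
        j = neighbours[i, rng.integers(k)]
        gap = rng.random()                     # position along the segment
        samples.append(X_min[i] + gap * (X_min[j] - X_min[i]))
    return np.array(samples)

X_min = np.random.default_rng(42).random((8, 2))   # toy minority class
S = smote(X_min, n_synthetic=10, k=3, rng=0)
```

In SMOTE–IPF, a filtering pass is then applied to the oversampled data to remove noisy examples, which is what makes the class boundaries more regular.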