First of all, in the current study we compared the
ADASYN algorithm with the single decision tree and the SMOTE
algorithm [15] for performance assessment. This is mainly
because all of these methods are single-model based learning
algorithms. Statistically speaking, ensemble based learning algorithms
can improve the accuracy and robustness of learning
performance; thus, as a future research direction, the ADASYN
algorithm could be extended to integrate with ensemble
based learning algorithms. To do this, one would need to use
a bootstrap sampling technique to sample the original training
data sets, and then apply ADASYN to each sampled set to
train a hypothesis. Finally, a weighted combination voting rule
similar to that of AdaBoost.M1 [35], [36] can be used to combine
all decisions from different hypotheses for the final predicted
outputs. In such a situation, it would be interesting to compare the
performance of such a boosted ADASYN algorithm with that
of SMOTEBoost [16], DataBoost-IM [17], and other ensemble
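The bootstrap-plus-weighted-voting pipeline outlined above can be illustrated with a minimal, self-contained Python sketch. The function names (`adasyn_oversample`, `nearest_centroid`, `boosted_adasyn`) and the nearest-centroid base learner are illustrative assumptions, not part of the ADASYN paper or of AdaBoost.M1; a practical implementation would use a stronger base learner such as a decision tree.

```python
# Sketch only: combines ADASYN-style oversampling with bootstrap
# sampling and an AdaBoost.M1-style weighted vote. All helper
# names are hypothetical, not from [15], [35], or [36].
import math
import random
from collections import Counter

def knn_indices(X, i, k):
    """Indices of the k nearest neighbours of X[i] (squared Euclidean)."""
    order = sorted(range(len(X)),
                   key=lambda j: sum((a - b) ** 2 for a, b in zip(X[i], X[j])))
    return [j for j in order if j != i][:k]

def adasyn_oversample(X, y, minority, k=5, rng=random):
    """ADASYN-style oversampling: each minority point generates synthetic
    samples in proportion to the fraction of majority-class points among
    its k nearest neighbours (the adaptive density distribution)."""
    min_idx = [i for i, label in enumerate(y) if label == minority]
    G = (len(y) - len(min_idx)) - len(min_idx)   # synthetics needed to balance
    r = [sum(1 for j in knn_indices(X, i, k) if y[j] != minority) / k
         for i in min_idx]
    total = sum(r) or 1.0
    X_new, y_new = list(X), list(y)
    for i, ri in zip(min_idx, r):
        nn_min = [j for j in knn_indices(X, i, k) if y[j] == minority] or [i]
        for _ in range(round(G * ri / total)):
            j = rng.choice(nn_min)
            lam = rng.random()   # interpolate toward a minority neighbour
            X_new.append(tuple(a + lam * (b - a) for a, b in zip(X[i], X[j])))
            y_new.append(minority)
    return X_new, y_new

def nearest_centroid(X, y):
    """A deliberately simple base learner: one centroid per class."""
    cents = {label: tuple(sum(c) / len(c) for c in
                          zip(*[x for x, l in zip(X, y) if l == label]))
             for label in set(y)}
    return lambda x: min(cents, key=lambda l: sum((a - b) ** 2
                                                  for a, b in zip(x, cents[l])))

def boosted_adasyn(X, y, minority, n_rounds=5, seed=0):
    """Bootstrap the training set, apply ADASYN to each sample, train one
    hypothesis per sample, and combine them with a weighted vote whose
    weights follow the AdaBoost.M1 form log((1 - err) / err)."""
    rng = random.Random(seed)
    n, hyps = len(X), []
    for _ in range(n_rounds):
        idx = [rng.randrange(n) for _ in range(n)]   # bootstrap sample
        Xo, yo = adasyn_oversample([X[i] for i in idx],
                                   [y[i] for i in idx], minority, rng=rng)
        h = nearest_centroid(Xo, yo)
        err = sum(h(x) != t for x, t in zip(X, y)) / n
        err = min(max(err, 1e-6), 1 - 1e-6)          # avoid log(0)
        hyps.append((math.log((1 - err) / err), h))
    def predict(x):
        votes = Counter()
        for alpha, h in hyps:
            votes[h(x)] += alpha
        return votes.most_common(1)[0][0]
    return predict
```

Note that, unlike AdaBoost.M1 proper, this sketch draws uniform bootstrap samples rather than reweighting the training distribution between rounds; only the final combination rule borrows the AdaBoost.M1 weighting.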