Naive-Bayes (Good 1965; Langley, Iba, & Thompson
1992) uses Bayes rule to compute the probability
of each class given the instance, assuming the attributes
are conditionally independent given the label.
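Written out (in notation introduced here only for exposition), the posterior for a class $c$ given attribute values $x_1,\dots,x_n$ under this independence assumption is
$$P(c \mid x_1,\dots,x_n) \;\propto\; P(c)\prod_{j=1}^{n} P(x_j \mid c),$$
and the predicted class is the one that maximizes this product.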
The version of Naive-Bayes we use in our experiments
was implemented in MLC++ (Kohavi et
al. 1994). The data is pre-discretized using
an entropy-based algorithm (Fayyad & Irani 1993;
Dougherty, Kohavi, & Sahami 1995). The probabilities
are estimated directly from counts in the data
(without any corrections, such as Laplace or
m-estimates).
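The count-based estimation can be summarized in a short sketch. The following Python snippet (illustrative names only, not the MLC++ implementation) estimates the class priors and conditional probabilities as raw frequencies over the discretized attributes:

```python
from collections import Counter, defaultdict
from math import prod

def train(instances, labels):
    """instances: list of tuples of discrete attribute values; labels: class per instance."""
    n = len(labels)
    class_counts = Counter(labels)            # count(c)
    value_counts = defaultdict(Counter)       # count(attribute j = v, class c)
    for x, c in zip(instances, labels):
        for j, v in enumerate(x):
            value_counts[(c, j)][v] += 1
    priors = {c: class_counts[c] / n for c in class_counts}

    def cond_prob(c, j, v):
        # Raw frequency estimate; returns 0 for a value never seen with class c.
        return value_counts[(c, j)][v] / class_counts[c]

    return priors, cond_prob

def predict(priors, cond_prob, x):
    # Score each class by P(c) * prod_j P(x_j | c) and pick the maximum.
    scores = {c: p * prod(cond_prob(c, j, v) for j, v in enumerate(x))
              for c, p in priors.items()}
    return max(scores, key=scores.get)
```

Because no correction is applied, a single attribute value never observed with a class zeroes out that class's entire product, which is exactly the behavior implied by the uncorrected counts.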