There is of course a lot of prior work that has tried to improve the performance of naive Bayes. Usually these approaches address the main weakness of naive Bayes, the independence assumption, either explicitly, by directly estimating dependencies, or implicitly, by increasing the number of parameters that are estimated. Both approaches allow for a tighter fit to the training data.
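For concreteness, recall that naive Bayes assumes the attributes x_1, ..., x_n are conditionally independent given the class, so classification reduces to the familiar product rule (the notation here is generic and not taken from the works cited below):

\[
  \hat{y} \;=\; \arg\max_{y} \; P(y) \prod_{i=1}^{n} P(x_i \mid y).
\]

It is exactly this factorization that the methods discussed below try to relax without giving up its simplicity.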
Typically the independence assumption is relaxed in a way that still preserves the computational advantages of pure naive Bayes. Two such methods are tree-augmented naive Bayes (Friedman et al., 1997) and AODE (Webb et al., 2003). Both enable some attribute dependencies to be captured while remaining computationally efficient.
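As a rough sketch in our own notation (simplified; see the cited papers for the exact definitions), TAN gives each attribute at most one additional parent x_{p(i)} chosen from among the other attributes, whereas AODE averages over one-dependence estimators that each use a single attribute as a shared parent:

\[
  P_{\text{TAN}}(y \mid x) \;\propto\; P(y) \prod_{i=1}^{n} P(x_i \mid y, x_{p(i)}),
  \qquad
  P_{\text{AODE}}(y, x) \;\propto\; \frac{1}{n} \sum_{i=1}^{n} P(y, x_i) \prod_{j=1}^{n} P(x_j \mid y, x_i).
\]

In both cases each estimated conditional probability involves at most two attributes and the class, which is what keeps training and prediction cheap.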
Some alternative approaches try to transform the original problem into a form that allows for the correct treatment of some of the dependencies. Both semi-naive Bayes (Kononenko, 1991) and the Cartesian product method (Pazzani, 1996) are such transformation-based attempts at capturing pairwise dependencies.
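The underlying transformation can be sketched as follows (again in our own notation, not that of the cited papers): a pair of apparently dependent attributes X_i and X_j is replaced by a single joint attribute whose values range over the Cartesian product V_i \times V_j of their value sets, so that the pairwise dependency is modelled directly,

\[
  P(x_i, x_j \mid y) \;\approx\; P(x_{ij} \mid y), \qquad x_{ij} \in V_i \times V_j,
\]

after which standard naive Bayes is applied to the transformed attribute set.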