Bayesian classifiers such as Naive Bayes [11] or Tree Augmented Naive Bayes (TAN)
[7] have shown excellent performance in spite of their simplicity and the strong
independence assumptions underlying them.
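To make these assumptions concrete (the notation below is ours, not taken from [7] or [11]): Naive Bayes assumes that every attribute is independent of the others given the class, while TAN additionally allows each attribute to depend on at most one other attribute, so that the attribute dependencies form a tree. In both cases the class posterior factorizes as
\[
P_{\mathrm{NB}}(C \mid A_1,\dots,A_n) \;\propto\; P(C)\prod_{i=1}^{n} P(A_i \mid C),
\qquad
P_{\mathrm{TAN}}(C \mid A_1,\dots,A_n) \;\propto\; P(C)\prod_{i=1}^{n} P(A_i \mid C, A_{\pi(i)}),
\]
where $\pi(i)$ denotes the (single, possibly absent) attribute parent of $A_i$ in the tree.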
In our opinion, the TAN classifier, as presented in [7], has two weak points:
it does not take model uncertainty into account, and it lacks a theoretically
well-founded explanation for the softening of the induced model parameters
(see section 2.2).
In [3] an alternative classifier based on empirical local Bayesian model averaging
was proposed as a possible improvement regarding the first weak point. Furthermore,
in [4] the fact that decomposable distributions over TANs allow the
model averaging integral to be computed tractably was used to construct sstbmatan,
a classifier that takes model uncertainty into account in a theoretically
well-founded way and provides improved classification accuracy.
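As a sketch of the quantity involved (written here in generic Bayesian model averaging notation, which need not match that of [4]), the prediction averages the model-conditional class posteriors over the posterior on TAN structures and parameters given the data $\mathcal{D}$:
\[
P(C \mid A_1,\dots,A_n, \mathcal{D}) \;=\; \sum_{E} \int_{\Theta_E} P(C \mid A_1,\dots,A_n, E, \Theta_E)\, P(E, \Theta_E \mid \mathcal{D})\, d\Theta_E ,
\]
where $E$ ranges over tree structures and $\Theta_E$ over the corresponding parameters. Decomposability of the distribution over TANs is what makes this sum and integral tractable.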