The states correspond to each possible vote value. In the network, each item has a set of parent items that are its best predictors, and the conditional probability tables are represented by decision trees. The authors report better results for this model than for several nearest-neighbor implementations over a number of datasets.
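Conceptually, predicting a user's vote on a target item then amounts to reading off the conditional distribution encoded by that item's decision tree, given the user's votes on the parent items. The following Python sketch illustrates this idea on synthetic data; it is not the authors' implementation, and the vote scale, number of parents, and tree parameters are illustrative assumptions.

```python
# Minimal sketch: a decision tree plays the role of the conditional probability
# table P(target vote | parent votes) for one target item. Synthetic data only.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n_users = 200

# Synthetic votes (1-5) for two parent items and a correlated target item.
parent_votes = rng.integers(1, 6, size=(n_users, 2))
target_votes = np.clip(parent_votes.mean(axis=1).round().astype(int)
                       + rng.integers(-1, 2, size=n_users), 1, 5)

# The tree encodes the conditional probability table of the target item.
cpt_tree = DecisionTreeClassifier(max_depth=3).fit(parent_votes, target_votes)

# For a new user who voted 5 and 4 on the parent items, read the conditional
# distribution over vote values and take the most probable one.
new_user = np.array([[5, 4]])
probs = cpt_tree.predict_proba(new_user)[0]
predicted_vote = cpt_tree.classes_[np.argmax(probs)]
print(dict(zip(cpt_tree.classes_, probs.round(2))), predicted_vote)
```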
Hierarchical Bayesian networks have also been used in several settings as a way to add domain knowledge for information filtering [78]. One of the issues with hierarchical Bayesian networks, however, is that learning and updating the model becomes very expensive when it includes many users. Zhang and Koren [79] propose a variation of the standard Expectation-Maximization (EM) algorithm in order to speed up this process in the scenario of a content-based RS.
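To see where the cost comes from, consider a toy hierarchical model in which each user's parameter is drawn from a shared Gaussian prior. The sketch below is not the algorithm of [79]; the Gaussian model, noise variance, and iteration count are assumptions chosen for illustration. It only makes explicit that every standard EM iteration performs a per-user E-step before the shared prior can be updated in the M-step, which is precisely the cost that grows with the number of users.

```python
# Toy illustration of standard EM over a hierarchical user model:
# theta_u ~ N(mu, tau2) per user, ratings y ~ N(theta_u, sigma2).
# Each iteration revisits every user's data, so cost is linear in #users.
import numpy as np

rng = np.random.default_rng(1)
n_users, sigma2 = 1000, 1.0
true_thetas = rng.normal(3.5, 0.8, size=n_users)
# Each user has a different number of observed ratings.
ratings = [rng.normal(t, np.sqrt(sigma2), size=rng.integers(3, 20))
           for t in true_thetas]

mu, tau2 = 0.0, 1.0                      # initial shared prior
for _ in range(50):
    # E-step: posterior mean/variance of each user's parameter.
    post_means, post_vars = [], []
    for y in ratings:                    # touches *all* users every iteration
        prec = 1.0 / tau2 + len(y) / sigma2
        post_vars.append(1.0 / prec)
        post_means.append((mu / tau2 + y.sum() / sigma2) / prec)
    post_means, post_vars = np.array(post_means), np.array(post_vars)
    # M-step: update the shared prior from the user posteriors.
    mu = post_means.mean()
    tau2 = ((post_means - mu) ** 2 + post_vars).mean()

print(f"estimated prior: mu={mu:.2f}, tau^2={tau2:.2f}")
```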