This approach has several important properties. First, the
Gaussian Process is a Bayesian method. Thus, the integration
of this form of regression into the Bayesian framework
of model selection is natural and fairly straightforward.
This allows us to interpret the results of the learning
as posterior probabilities, and to assess the posterior probability
of various network structures (e.g., using methods
such as [9]). Second, the semi-parametric nature of the
prior allows us to learn many continuous functional dependencies.
This is crucial for exploratory data analysis, where
there is little prior knowledge about the form of the interactions
we may encounter in the data. In addition, the Gaussian Process
is biased to find functional dependencies among the
variables in the domain. This is a useful prior for domains
where we believe there is a direct causal dependency between
attributes.
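To make the bias toward functional dependencies concrete, the sketch below scores a candidate dependency X &rarr; Y by the Gaussian Process marginal likelihood of Y given X, and compares it against a model with no dependence on X. This is a simplified illustration, not the full Bayesian score developed later in the paper; the squared-exponential kernel, the fixed hyperparameter values, and the function names are our own assumptions for the example.

```python
import numpy as np

def rbf_kernel(x1, x2, length_scale=1.0, signal_var=1.0):
    """Squared-exponential covariance between two 1-D input vectors."""
    d = x1[:, None] - x2[None, :]
    return signal_var * np.exp(-0.5 * (d / length_scale) ** 2)

def gp_log_marginal_likelihood(x, y, noise_var=0.1):
    """Log evidence of y under a zero-mean GP prior on f(x) plus Gaussian noise.

    log p(y | x) = -1/2 y^T K^{-1} y - 1/2 log|K| - n/2 log(2*pi),
    with K = k(x, x) + noise_var * I, computed via a Cholesky factorization.
    """
    n = len(x)
    K = rbf_kernel(x, x) + noise_var * np.eye(n)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.sum(np.log(np.diag(L)))
            - 0.5 * n * np.log(2 * np.pi))

# Synthetic data in which Y is a noisy function of X.
rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, 40)
y = np.sin(x) + 0.1 * rng.standard_normal(40)

score_edge = gp_log_marginal_likelihood(x, y)              # model with edge X -> Y
score_empty = gp_log_marginal_likelihood(np.zeros(40), y)  # no dependence on X
```

Because Y really is a (noisy) function of X, the conditional model attains a higher marginal likelihood than the empty one, so a score-based structure search would prefer the edge.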
In the remainder of this paper we review the Bayesian approach
for learning Bayesian networks. We then review
the definition of the Gaussian process prior in this setting
and discuss how to combine the two to learn networks. Finally,
we validate our approach on a series of artificial examples
that test its generalization capabilities, and apply it to a few
real-life data problems.