We experimentally analyze the performance of the four ECL variants on the
non-linearly separable problem described above and on real-life propositional
and relational datasets. On the real-life datasets, ECL-LSDc is the best-performing
system. However, as expected, it is unable to solve the non-linearly
separable problem, while ECL-LSDf does solve it, although its fine-grained
initialization of inequalities sometimes leads the system to overfit the
training data.
In general, the results of the experiments indicate that initializing inequalities
using intervals obtained from the Fayyad & Irani algorithm and then refining
them during the learning process, in order to take possible attribute
interdependencies into account, provides a robust and effective technique for handling
continuous attributes in evolutionary ILP learning.
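To make the initialization step concrete, the following sketch illustrates the core of Fayyad & Irani's entropy-based approach: candidate cut points are midpoints between adjacent values whose class labels differ, and the cut minimizing the weighted class entropy is selected. This is an illustrative simplification (a single split, without the recursive MDL stopping criterion); the function and variable names are our own, not taken from the ECL system.

```python
from math import log2

def entropy(labels):
    """Class entropy of a list of class labels."""
    n = len(labels)
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return -sum((c / n) * log2(c / n) for c in counts.values())

def best_cut_point(values, labels):
    """Return the boundary cut point minimizing the weighted class entropy
    of the two induced partitions (one level of entropy-based discretization)."""
    pairs = sorted(zip(values, labels))
    best_cut, best_e = None, float("inf")
    for i in range(1, len(pairs)):
        # Only boundary points (where the class label changes) need be tried.
        if pairs[i - 1][1] == pairs[i][1]:
            continue
        cut = (pairs[i - 1][0] + pairs[i][0]) / 2
        left = [y for v, y in pairs if v <= cut]
        right = [y for v, y in pairs if v > cut]
        e = (len(left) * entropy(left) + len(right) * entropy(right)) / len(pairs)
        if e < best_e:
            best_cut, best_e = cut, e
    return best_cut
```

In the full algorithm this split is applied recursively to each partition until an MDL-based criterion halts the recursion; the resulting interval boundaries then initialize the inequalities that the evolutionary search subsequently refines.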
The rest of the paper is organized as follows. In the next section we describe
the two types of cut points used as boundaries of inequalities and the operators
for shifting inequality boundaries employed in mutation. Next, we briefly
overview the ECL system and its four extensions with discretization. In Section
4 we report and discuss the experimental results. Finally, in Section 5 we
conclude with a brief experimental comparison of the best ECL variant with
other inductive learning systems.