In this paper the heterogeneous distance function was used with a probabilistic neural network for
classification, which allowed very fast training at the cost of a large, static network. The function itself,
however, is appropriate for a wide range of basis function networks that use distance functions.
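To make the idea concrete, one simple heterogeneous distance can be sketched as follows: a range-normalized difference for linear attributes combined with the overlap metric (0 if equal, 1 otherwise) for nominal attributes. This is an illustrative assumption, not necessarily the exact per-attribute measures or normalization used in this paper; the attribute types, ranges, and values below are likewise hypothetical.

```python
import math

def heterogeneous_distance(x, y, attr_types, ranges):
    """Distance between instances x and y.

    attr_types: 'linear' or 'nominal' for each attribute.
    ranges: (max - min) for each linear attribute (None for nominal).
    """
    total = 0.0
    for a, (xa, ya) in enumerate(zip(x, y)):
        if xa is None or ya is None:
            d = 1.0                            # unknown values get maximal distance
        elif attr_types[a] == "nominal":
            d = 0.0 if xa == ya else 1.0       # overlap metric: ignores value ordering
        else:
            d = abs(xa - ya) / ranges[a]       # range-normalized linear difference
        total += d * d
    return math.sqrt(total)

# Example: one linear attribute (range 10.0) and one nominal attribute.
d = heterogeneous_distance([1.0, "red"], [3.0, "blue"],
                           ["linear", "nominal"], [10.0, None])
```

Because each per-attribute distance lies in [0, 1], linear and nominal attributes contribute on a comparable scale, which is the role the normalization factors play.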
Current research seeks to test the heterogeneous distance function on a variety of other models,
including various Radial Basis Function networks and instance-based machine learning systems. The
normalization factors are also being examined to determine whether they provide the best possible
normalization between linear and nominal attributes.
In addition, it appears that attributes tagged as “nominal” often exhibit some inherent ordering. It is
hypothesized that if the values of nominal attributes are randomly rearranged, the HRBF would perform
about the same (since it does not depend on the ordering of nominal values), but that the homogeneous RBF
would suffer a loss in accuracy. The validity of this hypothesis and the severity of any such loss are
currently being explored.
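The hypothesized behavior can be illustrated with a small sketch. Permuting the integer codes assigned to nominal values changes a homogeneous (Euclidean-style) distance, which depends on the codes, but leaves an overlap-based heterogeneous distance unchanged, since it only tests equality. The attribute values and code assignments below are hypothetical.

```python
values = ["red", "green", "blue", "yellow"]
original = {v: i for i, v in enumerate(values)}          # red=0, green=1, blue=2, yellow=3
permuted = {"red": 2, "green": 0, "blue": 3, "yellow": 1}  # a fixed, non-identity rearrangement

def euclidean_1d(code_map, a, b):
    return abs(code_map[a] - code_map[b])                # depends on the arbitrary codes

def overlap(a, b):
    return 0 if a == b else 1                            # ignores the codes entirely

d_orig = euclidean_1d(original, "red", "blue")           # changes under the permutation
d_perm = euclidean_1d(permuted, "red", "blue")
o_orig = overlap("red", "blue")                          # identical under any permutation
o_perm = overlap("red", "blue")
```

Under this sketch the homogeneous distance between "red" and "blue" shifts with the relabeling while the overlap distance does not, mirroring the predicted difference between the homogeneous RBF and the HRBF.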