In each instance the individual membership functions µAk(xk) may be adjusted during the training phase to achieve minimum classification error.
7 Training

Training is the distinctive feature of the somewhat heuristic model presented here. During the training stage, the standard deviation used in each membership function can be adjusted to improve overall performance on the "training" partition data. Increasing an individual standard deviation widens that exponential membership function. The function then departs from the true statistical probability function, but the choice of the normal probability function for the exponential was arbitrary, and such a function is not usually used in fuzzy logic anyway. The data represent physical attributes, and such attributes might be assumed to have normal distributions. However, the data were obtained from human observations that are quantized and hence not normally distributed.
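The Gaussian-shaped membership function described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names and sample values are assumptions, and the function is unnormalized (it peaks at 1.0 at the mean rather than integrating to 1, as is typical in fuzzy logic), so widening sigma simply raises the membership of off-center observations.

```python
import math

def gaussian_membership(x, mean, sigma):
    """Membership of observation x in a class (hypothetical sketch).

    Unlike a true normal density, the value is not scaled by
    1/(sigma*sqrt(2*pi)); membership is 1.0 at the mean.
    Increasing sigma widens the function, so observations away
    from the mean receive larger membership values.
    """
    return math.exp(-((x - mean) ** 2) / (2.0 * sigma ** 2))

# Widening sigma raises membership for an off-center observation:
narrow = gaussian_membership(5.0, 3.0, 1.0)
wide = gaussian_membership(5.0, 3.0, 2.0)
print(narrow, wide)  # the wider function gives the larger membership
```

During training, each parameter's sigma would be nudged and the classification error on the training partition re-evaluated, keeping adjustments that reduce the error.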
The unaided model performed quite well. However, performance was improved by adjusting the standard deviation of each individual parameter associated with each class.
The additive method of combining individual parameter memberships and the product method performed comparably in practice. It might seem that summing the squares of membership values would emphasize parameters with "good" membership; however, empirical work showed no significant difference.
An adjusting coefficient may be applied to each term when adding membership values. This yields a small improvement in some instances.
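The combination methods discussed above can be sketched as simple scoring functions. The function names, sample memberships, and per-parameter weights are illustrative assumptions; in the model, a sample would be assigned to the class whose combined score is highest.

```python
def additive_score(memberships, weights=None):
    """Sum of per-parameter memberships; optional per-parameter
    adjusting coefficients (hypothetical) weight each term."""
    if weights is None:
        weights = [1.0] * len(memberships)
    return sum(w * m for w, m in zip(weights, memberships))

def product_score(memberships):
    """Product of per-parameter memberships."""
    score = 1.0
    for m in memberships:
        score *= m
    return score

def squared_additive_score(memberships):
    """Summing squared memberships emphasizes parameters where
    membership is 'good' (close to 1)."""
    return sum(m * m for m in memberships)

# Illustrative memberships for one sample against one class:
ms = [0.9, 0.4, 0.7]
print(additive_score(ms), product_score(ms), squared_additive_score(ms))
```

As the text notes, the additive and product forms performed comparably in practice, and squaring brought no significant empirical difference.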
With some data mining models, overtraining is possible: the error percentage is very good for the training partition but markedly degraded for the validation set. This situation did not arise here.
8 Training Optimization

There are many parameters, and each one has an adjustable standard deviation. Adjusting them by eye is time consuming, so a genetic algorithm was used in an attempt to optimize this process. The genetic algorithm used was a Microsoft Excel add-in called xl bit (http://www.xlpert.com) [15].
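The genetic-algorithm approach can be sketched in general terms. The paper used the xl bit Excel add-in; the sketch below is a generic, assumed implementation with a toy error function standing in for the classification error on the training partition, and the population size, mutation rate, and target values are all illustrative.

```python
import random

random.seed(42)

def error(sigmas):
    """Stand-in for classification error as a function of the
    per-parameter standard deviations; a real run would score
    the training partition instead. Targets are illustrative."""
    targets = [1.2, 0.8, 2.0]
    return sum((s - t) ** 2 for s, t in zip(sigmas, targets))

def mutate(sigmas, rate=0.3):
    """Randomly perturb some sigmas, keeping them positive."""
    return [max(0.05, s + random.gauss(0, 0.2)) if random.random() < rate else s
            for s in sigmas]

def crossover(a, b):
    """Pick each sigma from one of the two parents."""
    return [random.choice(pair) for pair in zip(a, b)]

# Evolve a population of candidate sigma vectors.
pop = [[random.uniform(0.1, 3.0) for _ in range(3)] for _ in range(20)]
for gen in range(100):
    pop.sort(key=error)                 # rank by fitness (lower error is better)
    survivors = pop[:10]                # elitism: keep the best half
    children = [mutate(crossover(random.choice(survivors),
                                 random.choice(survivors)))
                for _ in range(10)]
    pop = survivors + children

best = min(pop, key=error)
print(best, error(best))
```

In the actual workflow, the add-in would drive the spreadsheet model directly, with each chromosome encoding the full set of standard deviations and fitness given by the training-partition error percentage.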