In the first neuron layer, the tan-sigmoid transfer function, a nonlinear transfer function, generated output values between -1 and 1 as the net input of the neuron ranged from negative to positive infinity [28]. The second neuron layer contained one linear transfer function that mapped the output values of the first layer to the two classification results, malignant breast cancer and benign disease, in the mammographic data. Neither the linear nor the nonlinear transfer function had local minima, since both were differentiable and monotonically increasing functions. This tended to prevent error minima from trapping the neural network classification model during training.
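The two-layer architecture described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the layer sizes, weights, and the input feature vector are hypothetical, and `tansig`/`purelin` follow the common definitions of the tan-sigmoid and linear transfer functions.

```python
import numpy as np

def tansig(n):
    # Tan-sigmoid transfer function: maps any real net input to (-1, 1).
    # Equivalent to np.tanh(n), written in the 2/(1 + e^{-2n}) - 1 form.
    return 2.0 / (1.0 + np.exp(-2.0 * n)) - 1.0

def purelin(n):
    # Linear transfer function used by the single output neuron.
    return n

def forward(x, W1, b1, W2, b2):
    # First layer: nonlinear tan-sigmoid units with outputs in (-1, 1).
    a1 = tansig(W1 @ x + b1)
    # Second layer: one linear unit; its output is thresholded to
    # designate the two classes (malignant vs. benign).
    return purelin(W2 @ a1 + b2)

# Hypothetical dimensions: 3 input features, 5 hidden units, 1 output.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((5, 3)) * 0.5
b1 = np.zeros(5)
W2 = rng.standard_normal((1, 5)) * 0.5
b2 = np.zeros(1)

x = np.array([0.2, -1.3, 0.7])  # one hypothetical feature vector
y = forward(x, W1, b1, W2, b2)  # scalar score; sign gives the class
```

Because `tansig` is differentiable and strictly increasing everywhere, its derivative never changes sign, which is the property the text invokes to argue that training is less prone to being trapped.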
A Back-Propagation