One of the most common neural network topologies is
the multilayer perceptron (MLP). Its robustness has been
demonstrated in many classification tasks [28-30]. An MLP is a
feedforward artificial neural network consisting of a
series of layers. Each neuron in the layers is connected
to the neurons of the subsequent layer, and neurons sum
up their inputs passing the results through activation
functions[31]. Figure 4 shows the MLP topology used in
this work. As shown in previous studies, a single
hidden layer containing a sufficient number of neurons can
satisfactorily handle the prediction of most complex
problems [32,33]. In this research, a single-hidden-layer
MLP topology was chosen and implemented using the
‘patternnet’ function in MATLAB (2012a, The MathWorks,
Inc., Natick, Massachusetts). This function uses the hyperbolic
tangent sigmoid transfer function (Equation (3)) to
compute a layer's output from its input; accordingly,
hyperbolic tangent sigmoid transfer functions were used
in both the hidden and output layers of the MLP network.
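To illustrate the forward pass described above, the following is a minimal Python sketch (not the authors' MATLAB implementation): each neuron sums its weighted inputs plus a bias and passes the result through the hyperbolic tangent sigmoid transfer function, which in MATLAB's `tansig` form is a = 2/(1 + e^(-2n)) - 1, mathematically equivalent to tanh(n). The layer sizes and random weights here are hypothetical placeholders.

```python
import numpy as np

def tansig(n):
    # Hyperbolic tangent sigmoid transfer function, in the form used by
    # MATLAB's tansig: a = 2 / (1 + exp(-2n)) - 1, equivalent to tanh(n).
    return 2.0 / (1.0 + np.exp(-2.0 * n)) - 1.0

def mlp_forward(x, W_hidden, b_hidden, W_out, b_out):
    # Hidden layer: each neuron sums its weighted inputs plus a bias,
    # then passes the result through the transfer function.
    h = tansig(W_hidden @ x + b_hidden)
    # Output layer: same transfer function, as in the topology described.
    return tansig(W_out @ h + b_out)

# Hypothetical example: 3 inputs, 5 hidden neurons, 2 outputs.
rng = np.random.default_rng(0)
x = rng.standard_normal(3)
y = mlp_forward(x,
                rng.standard_normal((5, 3)), rng.standard_normal(5),
                rng.standard_normal((2, 5)), rng.standard_normal(2))
print(y.shape)  # (2,)
```

Because both layers use the tanh sigmoid, every output component is bounded in (-1, 1), which is why target values are typically scaled into that range before training.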