The input is multiplied by the weights of the neural network, and as this process is repeated layer by layer, the output y is produced. This output y is generally quite different from the desired output o given in the training data. The network therefore takes the difference as its error, e = y - o, updates the weights of the output layer in proportion to this error, and then updates the weights of the hidden layer. The weights are thus updated in the direction opposite to the processing direction of the neural network, which is why the method is called the backpropagation algorithm.
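As a minimal sketch of this error-proportional update, assuming a single linear output unit and an illustrative learning rate lr (neither is specified in the source; all names and values are hypothetical):

```python
import numpy as np

# Illustrative values only; a single linear output unit is assumed.
x = np.array([0.5, 0.1, 0.8])   # input
w = np.array([0.2, -0.4, 0.3])  # weights of the output layer
o = 1.0                         # desired output from the training data
lr = 0.1                        # learning rate (assumed)

y = w @ x          # forward pass: input multiplied by the weights
e = y - o          # error e = y - o
w -= lr * e * x    # update the weights in proportion to the error
                   # (the gradient of the squared error for a linear unit)
```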
In other words, the processing of the neural network proceeds in the direction input layer → hidden layer → output layer, whereas the learning (weight update) proceeds in the direction output layer → hidden layer.
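A rough sketch of this two-directional flow follows, assuming sigmoid activations, a squared-error loss, and a single hidden layer for brevity (the four-layer structure described below would simply repeat the same backward step per layer; all sizes and values here are illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical sizes and values for illustration only.
rng = np.random.default_rng(0)
x = rng.random(4)        # 4 input values
W1 = rng.random((3, 4))  # weights: input layer -> hidden layer
W2 = rng.random((1, 3))  # weights: hidden layer -> output layer
o = np.array([1.0])      # desired output
lr = 0.1                 # learning rate (assumed)

# Forward pass: input layer -> hidden layer -> output layer.
h = sigmoid(W1 @ x)
y = sigmoid(W2 @ h)

# Backward pass: output layer -> hidden layer (the opposite direction).
e = y - o                        # error at the output, e = y - o
d2 = e * y * (1 - y)             # delta for the output layer
d1 = (W2.T @ d2) * h * (1 - h)   # delta propagated back to the hidden layer
W2 -= lr * np.outer(d2, h)       # update the output-layer weights first,
W1 -= lr * np.outer(d1, x)       # then the hidden-layer weights
```

Note how the deltas are computed from the output toward the input, mirroring the learning direction described above, while the activations h and y were computed in the opposite, forward direction.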
The example below uses four input values and a four-layer neural network; its composition is as follows.