For a computer to serve as a problem-solving machine, it must be told what steps to follow in order to solve the problem. An algorithm is a finite sequence of instructions, each of which has a clear meaning and can be performed with a finite amount of effort in a finite amount of time [1]. Algorithms are paramount in computer programming. Yet even a correct algorithm that produces the desired output may be of no practical use if the time and storage it needs to run to completion are intolerable.
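To illustrate the point about intolerable resources, consider the following sketch (a standard textbook example, not taken from the source): two algorithms that both compute Fibonacci numbers correctly, one in exponential time and one in linear time.

```python
def fib_naive(n):
    """Correct for every n >= 0, but takes exponential time
    because it recomputes the same subproblems repeatedly."""
    if n < 2:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)

def fib_linear(n):
    """Produces the same answer in linear time with constant storage."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a
```

Both functions are correct algorithms in the sense defined above; only the second remains usable as n grows, since `fib_naive` becomes impractically slow well before n reaches 50.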
To say that a problem is solvable algorithmically means, informally, that a computer program can be written that will produce the correct answer for any input if we let it run long enough and allow it as much storage space as it needs [2].
In an algorithm, instructions can be executed any number of times, provided the instructions themselves indicate repetition. However, no matter what the input values may be, an
algorithm terminates after executing a finite number of instructions. Thus, a program is an algorithm as long as it never enters an infinite loop on any input [2].
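Euclid's algorithm for the greatest common divisor is a classic illustration of this property (our example, not from the source): its loop repeats an input-dependent number of times, yet it terminates for every pair of non-negative inputs.

```python
def gcd(a, b):
    """Euclid's algorithm. The loop body repeats, but the second
    argument strictly decreases on every iteration, so the loop
    executes only a finite number of times for any input."""
    while b != 0:
        a, b = b, a % b
    return a
```

The termination argument is exactly the one the text requires: `b` is a non-negative integer that strictly decreases each pass, so it must reach zero after finitely many steps.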
An algorithm can be either correct or incorrect. A correct algorithm halts with the correct output, while an incorrect algorithm halts with an incorrect output or may not halt at all. An algorithm has five important features:
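The distinction can be made concrete with a small sketch (an illustrative example of ours, not from the source): two primality tests that both halt, where one is correct for every input and the other halts with a wrong output on some inputs.

```python
def is_prime_correct(n):
    """Halts with the correct answer for every integer n >= 0."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def is_prime_incorrect(n):
    """Always halts, but is an incorrect algorithm: it omits the
    n < 2 check, so it wrongly reports 0 and 1 as prime."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True
```

The second function agrees with the first on all inputs above 1, which is precisely why such errors are easy to miss; correctness requires the right output for *every* input, not merely for typical ones.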