Given a specific problem to solve, the input to the GA is a set
of candidate solutions to that problem, called genomes, and a metric
called a fitness function that returns a quantitative measure of the quality of a
genome. GA, through a series of iterations, called generations, tries to
improve the genomes through genetic operations such as crossover
and mutation. The main advantage of GA is that it does not require
gradient information and is therefore capable of solving nonlinear
problems with multiple local optima. Further, GA has been demonstrated
to produce substantial improvements over random and local
to produce substantial improvement over random and local
search methods [11]. Since GA evolves its solutions over many iterations,
it tends to be computationally expensive, though it remains faster
than a brute-force search. In practice, it was found that
evolving 30 genomes over 100 generations yielded stable solutions
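To make this loop concrete, the following minimal Python sketch shows one possible GA of this form. The toy fitness function, single-point crossover, Gaussian mutation, and truncation selection are illustrative assumptions rather than the specific operators used in this work; only the population size (30 genomes) and the number of generations (100) follow the values quoted above.

import random

# Hypothetical fitness function: the GA only needs a quantitative score
# per genome, not gradient information. Here we maximise a toy objective.
def fitness(genome):
    return -sum((x - 0.5) ** 2 for x in genome)

def crossover(a, b):
    # Single-point crossover between two parent genomes (illustrative choice).
    point = random.randint(1, len(a) - 1)
    return a[:point] + b[point:]

def mutate(genome, rate=0.1):
    # Perturb each gene with a small probability (Gaussian mutation, assumed).
    return [x + random.gauss(0, 0.1) if random.random() < rate else x
            for x in genome]

def evolve(pop_size=30, generations=100, genome_len=5):
    # Initial population of random genomes.
    population = [[random.random() for _ in range(genome_len)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Rank genomes by fitness and keep the better half as parents.
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]
        # Refill the population with mutated offspring of random parent pairs.
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    # Return the fittest genome after the final generation.
    return max(population, key=fitness)

best = evolve()
print(fitness(best))

The ranking-and-refill scheme above is only one way to realise selection; any scheme that favours fitter genomes while preserving diversity fits the description in the text.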