Another interesting feature of GAs is that they are inherently parallel. Instead of refining a single point using local gradient information, they evolve a population of candidate solutions in which each individual represents a solution that can be evaluated independently of the others. Their application to large-scale optimization problems can therefore be implemented easily on parallel machines, resulting in a significant reduction of the required computation time.
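To make this concrete, the following Python sketch evaluates a GA population with a pool of worker processes; the sphere-function objective, the encoding of individuals, and the number of workers are illustrative assumptions rather than part of the original formulation.

# Minimal sketch (assumed details, not a prescribed implementation) showing why
# GA fitness evaluation parallelizes naturally: each individual is scored
# independently of the others.
from multiprocessing import Pool
import random

def fitness(individual):
    # Placeholder objective (sphere function); a real application supplies its own.
    return -sum(x * x for x in individual)

def evaluate_population(population, workers=4):
    # Independent individuals can be farmed out to parallel workers.
    with Pool(processes=workers) as pool:
        return pool.map(fitness, population)

if __name__ == "__main__":
    population = [[random.uniform(-5, 5) for _ in range(10)] for _ in range(100)]
    scores = evaluate_population(population)
    print("best fitness:", max(scores))

Because the only shared state is the population itself, the speed-up scales almost linearly with the number of workers whenever the fitness evaluation dominates the run time.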
GAs are generally regarded as offline optimization algorithms because of the large amount of CPU time they require to converge to an optimal solution. Their performance can be improved significantly, however, when the basic genetic operators are combined with suitable problem-specific operators.
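The sketch below illustrates one such combination: a child produced by the basic crossover and mutation operators is refined by a problem-specific local-improvement step. The hill-climbing operator and the continuous encoding are assumptions chosen purely for illustration; the text does not prescribe a particular hybrid operator.

import random

def fitness(ind):
    # Illustrative objective (sphere function); replace with the real problem.
    return -sum(x * x for x in ind)

def crossover(a, b):
    point = random.randrange(1, len(a))               # single-point crossover
    return a[:point] + b[point:]

def mutate(ind, rate=0.1, step=0.5):
    return [x + random.gauss(0, step) if random.random() < rate else x for x in ind]

def local_improve(ind, tries=5, step=0.1):
    # Problem-specific operator: a few greedy hill-climbing moves around the child.
    best, best_f = ind, fitness(ind)
    for _ in range(tries):
        cand = [x + random.gauss(0, step) for x in best]
        if fitness(cand) > best_f:
            best, best_f = cand, fitness(cand)
    return best

parents = [[random.uniform(-5, 5) for _ in range(10)] for _ in range(2)]
child = local_improve(mutate(crossover(*parents)))    # basic operators + local step
print("child fitness:", fitness(child))

In such a hybrid scheme the genetic operators maintain global exploration while the problem-specific step accelerates convergence toward good solutions, which is the usual motivation for combining the two.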