In the three algorithms discussed above, one of the
key differences is the mechanism used to produce a new
population of solutions by perturbing solutions from the
old population. These different mechanisms generate
populations with different balances between intensification
and diversification. This dynamic behavior of the
population can be deduced from the basic perturbation
method used to create new solutions. This section
discusses the three algorithms with respect to these two
aspects, intensification and diversification, taking each
algorithm in turn.
Suppose that the same solution representation is
used and the initial population is exactly the same. It
should be noted that all evolutionary algorithms may
require a decoding process and checking of constraints
to ensure that the solutions are feasible.
For GA, the solutions are ranked based on the fitness
values. The parents are selected based on probabilities
that favor individuals with better fitness. The crossover
operation produces offspring with parts taken from
the parents and the solutions are more likely to be similar
to the parents. Based on this observation, GA tends
to generate solutions that are more likely to cluster
around several “good” solutions in the population. The diversification aspect of GA is accomplished through
the mutation operation that injects some “difference”
into the solutions from time to time. The run time of
GA also grows non-linearly with the population size
because of the sorting required for ranking.
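The GA mechanism described above can be sketched as follows. This is a minimal illustrative sketch, not the specific GA discussed in the text: it assumes real-valued solution vectors, a fitness function where higher is better, roulette-wheel selection, single-point crossover, and a small `mutation_rate` parameter (all assumptions for illustration).

```python
import random

def ga_step(population, fitness, mutation_rate=0.05):
    """One generation of a simple real-coded GA (illustrative sketch).

    Parents are selected with probabilities that favor better fitness,
    crossover builds offspring from parts of both parents
    (intensification), and mutation occasionally injects a random
    perturbation (diversification).
    """
    # Weight individuals by fitness (assumed positive, higher is better).
    scores = [fitness(ind) for ind in population]
    total = sum(scores)
    weights = [s / total for s in scores]

    new_population = []
    while len(new_population) < len(population):
        # Fitness-proportionate (roulette-wheel) selection of two parents.
        p1, p2 = random.choices(population, weights=weights, k=2)
        # Single-point crossover: the child inherits parts of both
        # parents, so it tends to resemble them.
        cut = random.randrange(1, len(p1))
        child = p1[:cut] + p2[cut:]
        # Mutation injects some "difference" from time to time.
        if random.random() < mutation_rate:
            i = random.randrange(len(child))
            child[i] += random.gauss(0, 0.1)
        new_population.append(child)
    return new_population
```

Because selection repeatedly favors the same high-fitness parents, offspring concentrate around a few "good" solutions, which is exactly the clustering behavior noted above.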
For PSO, a new swarm of particles is generated via
the velocity and position update equations, so every new
particle may differ substantially from its predecessor.
Also, since the mechanism is based on floating-point
arithmetic, it can generate any value within the solution
space; that is, the density of solutions within the solution
space may be much higher than that of solutions generated
via GA. In other words, the solutions can lie much
closer to each other than solutions in GA.
In addition, the best particle in the swarm exerts a one-way
influence over all the remaining solutions in the
population. This often leads to premature clustering around
the best particle, especially when the fitness gaps are large.
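The update mechanism can be sketched with the standard PSO velocity and position equations. This is an illustrative sketch only; the inertia weight `w` and acceleration constants `c1`, `c2` are conventional defaults assumed here, not values taken from the text.

```python
import random

def pso_step(positions, velocities, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
    """One PSO update (illustrative sketch).

    v' = w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x);  x' = x + v'

    Every particle is pulled toward its personal best (pbest) and
    toward the single global best (gbest): the one-way influence the
    best particle exerts over the whole swarm.
    """
    new_pos, new_vel = [], []
    for x, v, p in zip(positions, velocities, pbest):
        nv = [w * vi
              + c1 * random.random() * (pi - xi)   # cognitive pull
              + c2 * random.random() * (gi - xi)   # social pull to gbest
              for xi, vi, pi, gi in zip(x, v, p, gbest)]
        nx = [xi + vi for xi, vi in zip(x, nv)]
        new_pos.append(nx)
        new_vel.append(nv)
    return new_pos, new_vel
```

Because the update is floating-point arithmetic, the new positions can take any value in the continuous solution space rather than recombinations of existing coordinates.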
Similar to PSO, since the mechanism that generates
new solutions in DE is also based on floating-point
arithmetic, the exploration ability of the population
might be comparable to that of PSO, but the diversification is
better because the best solution in the population does
not exert any influence on the other solutions in the
population. Furthermore, the mutant vector is always a
solution that is not from the original population; therefore,
the crossover operation in DE is always between a
solution from the population and a newly generated one.
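This mechanism can be sketched with the classic DE/rand/1/bin scheme. The sketch assumes real-valued vectors; `F` (scale factor) and `CR` (crossover rate) are conventional defaults assumed for illustration, not values from the text.

```python
import random

def de_trial(population, i, F=0.8, CR=0.9):
    """Build one DE/rand/1/bin trial vector (illustrative sketch).

    The mutant r1 + F*(r2 - r3) is a brand-new point that is never a
    member of the current population, and the best solution plays no
    special role, so no single individual dominates the search.
    """
    n = len(population[0])
    # Pick three distinct individuals, all different from target i.
    candidates = [j for j in range(len(population)) if j != i]
    r1, r2, r3 = random.sample(candidates, 3)
    base, a, b = population[r1], population[r2], population[r3]
    mutant = [base[d] + F * (a[d] - b[d]) for d in range(n)]
    # Binomial crossover between the target (from the population) and
    # the mutant (newly generated); dimension jrand always comes from
    # the mutant so the trial cannot simply copy the target.
    jrand = random.randrange(n)
    target = population[i]
    return [mutant[d] if (random.random() < CR or d == jrand) else target[d]
            for d in range(n)]
```

Note that the crossover here is always between an existing population member and a freshly generated mutant, which is the structural difference from GA crossover highlighted above.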
For any evolutionary algorithm, the solutions
gradually cluster around one or more "good" solutions
as the search evolves. This clustering can be seen as the
convergence of the population toward a particular solution.
If the population clusters too quickly, it may
stagnate, and any further improvement becomes less
likely.
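One simple way to monitor this clustering, assuming real-valued solution vectors, is to track the average pairwise distance between population members over the generations; a hypothetical helper might look like:

```python
import math

def avg_pairwise_distance(population):
    """Average Euclidean distance over all pairs of solutions: a
    simple scalar proxy for population diversity. A value near zero
    signals the clustering/stagnation behavior described above."""
    n = len(population)
    total, pairs = 0.0, 0
    for i in range(n):
        for j in range(i + 1, n):
            d = math.sqrt(sum((a - b) ** 2
                              for a, b in zip(population[i], population[j])))
            total += d
            pairs += 1
    return total / pairs if pairs else 0.0
```

A rapid drop of this measure early in the run is one practical indicator that the population has converged prematurely.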