Worst-case, best-case and average-case analysis of sorting algorithms
The worst case occurs in a sorting algorithm when the elements to be sorted are in reverse order, and the best case occurs when the elements are already sorted. The average case occurs when the data are partially sorted or randomly distributed in the list (William and William, 2002). The average case may not be easy to determine, in that it may not be apparent what constitutes an 'average' input. Attention is therefore usually concentrated on finding the worst-case running time for any input of size n, for the following reasons (Thomas et al., 2003); a short illustration of these cases follows the list:
i.) The worst-case running time of an algorithm is an upper bound
on the running time for any input. Knowing it gives us a guarantee
that the algorithm will never take any longer. We need not make
some educated guess about the running time and hope that it never
gets much worse.
ii.) For some algorithms, the worst-case occurs fairly often. For
example, in searching a database for a particular piece of
information, the searching algorithm’s worst-case will often occur
when the information is not present in the database. In some
searching applications, searches for absent information may be
frequent.
iii.) The “average-case” is often roughly as bad as the worst case.
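As a concrete illustration of these cases (not taken from the paper itself), the following Python sketch counts the comparisons and swaps performed by an early-exit Bubble Sort on an already-sorted list (best case), a reverse-ordered list (worst case) and a random permutation (an 'average' input). The function name bubble_sort_counts and the input size n = 64 are illustrative choices only.

```python
import random

def bubble_sort_counts(data):
    """Early-exit Bubble Sort; returns (comparisons, swaps)."""
    a = list(data)
    comparisons = swaps = 0
    n = len(a)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            comparisons += 1
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swaps += 1
                swapped = True
        if not swapped:              # already sorted: stop after one pass
            break
    return comparisons, swaps

n = 64
best = list(range(n))                 # already sorted: n - 1 comparisons, 0 swaps
worst = list(range(n, 0, -1))         # reverse order: n(n-1)/2 comparisons and swaps
average = random.sample(range(n), n)  # random permutation ('average' input)

for label, inp in (("best", best), ("worst", worst), ("average", average)):
    print(label, bubble_sort_counts(inp))
```

On the reversed input every comparison triggers a swap, so both counts reach the quadratic bound n(n-1)/2, while the sorted input is detected in a single linear pass.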
Analysis of the proposed algorithm
Generally, the running time of a sorting algorithm is proportional to
the number of comparisons that the algorithm uses, to the number
of times items are moved or exchanged, or both (Robert, 1998).
The approach used in this paper is to measure the number of comparisons and exchanges carried out by each algorithm (Batcher's Sort, Bitonic Sort and Oyelami's Sort) in the worst-case scenario.
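This counting approach can be reproduced with an instrumented compare-exchange primitive. The sketch below is an illustration under stated assumptions rather than the authors' measurement code: it wraps every compare-exchange of a textbook recursive Bitonic Sort (input length a power of two) so that comparisons and exchanges are tallied, and uses a reverse-ordered list to stand in for the worst-case scenario described above. The same counters can be attached to Batcher's Odd-Even Merge Sort or to the proposed algorithm in the same way.

```python
def bitonic_sort_counts(data):
    """Textbook recursive Bitonic Sort (len(data) must be a power of two),
    instrumented to count comparisons and exchanges."""
    a = list(data)
    counts = {"comparisons": 0, "exchanges": 0}

    def compare_exchange(i, j, ascending):
        counts["comparisons"] += 1
        out_of_order = a[i] > a[j] if ascending else a[i] < a[j]
        if out_of_order:
            a[i], a[j] = a[j], a[i]
            counts["exchanges"] += 1

    def merge(lo, n, ascending):
        if n > 1:
            m = n // 2
            for i in range(lo, lo + m):
                compare_exchange(i, i + m, ascending)
            merge(lo, m, ascending)
            merge(lo + m, m, ascending)

    def sort(lo, n, ascending):
        if n > 1:
            m = n // 2
            sort(lo, m, True)        # first half ascending
            sort(lo + m, m, False)   # second half descending -> bitonic sequence
            merge(lo, n, ascending)  # merge the bitonic sequence

    sort(0, len(a), True)
    return a, counts

# Reverse-ordered input stands in for the worst-case scenario.
result, counts = bitonic_sort_counts(list(range(16, 0, -1)))
print(result)
print(counts)
```

Because Bitonic Sort is a sorting network, its comparison count depends only on the input size n; the input order affects only the number of exchanges.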
RESULTS AND DISCUSSION
Table 1 shows the results obtained. From the results in Table 1, the proposed algorithm requires fewer comparisons and swaps than both Batcher's Odd-Even Sort and Bitonic Sort. The results also show that as the size of the input increases, the proposed algorithm becomes relatively more efficient, since both Batcher's Odd-Even Sort and Bitonic Sort do not scale well to large inputs. The implication is that the proposed algorithm is faster and therefore more efficient, and it is recommended in particular for sorting large inputs.
Conclusion
Bubble Sort is not regarded as a good algorithm because it is a quadratic-time sorting algorithm. However, efforts have been made to improve its performance. With Bidirectional Bubble Sort, the average number of comparisons is slightly reduced, and Batcher's Sort, like Shellsort, performs significantly better than Bidirectional Bubble Sort by carrying out comparisons in a novel way so that no propagation of exchanges is necessary. This paper has further improved on Batcher's Sort.