In many situations, data structures are subject to a sequence of operations rather than a single operation. Within such a sequence, one operation may perform modifications that affect the run time of the operations that follow it. One way of assessing the worst case run time of the entire sequence is to add up the worst case efficiencies of the individual operations, but this may produce an excessively large and unrealistic bound on the actual run time. A more realistic approach is amortized analysis, which finds the average complexity of a worst case sequence of operations. By analyzing sequences of operations rather than isolated operations, amortized analysis takes into account the interdependence between operations and their results. For example, if an array is sorted and only a very few new elements are then added, re-sorting the array should be much faster than sorting it the first time because, after the additions, the array is nearly sorted; putting its elements back in perfect order is quicker than sorting a completely disorganized array. Without taking this correlation into account, the run time of the two sorting operations would be counted as twice the worst case efficiency. Amortized analysis, on the other hand, recognizes that the second sorting is far from a worst case situation, so the combined complexity of the two sorting operations is much less than double the worst case complexity. Consequently, the average cost of the worst case sequence of sorting, a few insertions, and sorting again is lower according to amortized analysis than according to worst case analysis, which disregards the fact that the second sorting is applied to an array already processed by the first.
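The sorting example can be made concrete with a routine whose cost drops sharply on nearly sorted input. The sketch below is illustrative only: it assumes insertion sort as the sorting routine, counts key comparisons as a rough proxy for run time, and uses an arbitrary array size and number of insertions.

```cpp
#include <cstdlib>
#include <iostream>
#include <vector>

// Insertion sort: O(n^2) comparisons in the worst case, but close to O(n)
// when the array is already nearly sorted, since each element moves only a
// short distance. The comparison counter makes the difference visible.
long insertionSort(std::vector<int>& a) {
    long comparisons = 0;
    for (std::size_t i = 1; i < a.size(); ++i) {
        int tmp = a[i];
        std::size_t j = i;
        while (j > 0 && (++comparisons, a[j - 1] > tmp)) {
            a[j] = a[j - 1];
            --j;
        }
        a[j] = tmp;
    }
    return comparisons;
}

int main() {
    const int n = 10000;
    std::vector<int> a(n);
    for (int i = 0; i < n; ++i)
        a[i] = std::rand();

    long first = insertionSort(a);   // sorting a completely disorganized array

    for (int i = 0; i < 5; ++i)      // a few new elements are added
        a.push_back(std::rand());

    long second = insertionSort(a);  // re-sorting a nearly sorted array

    std::cout << "first sort:  " << first  << " comparisons\n"
              << "second sort: " << second << " comparisons\n";
    // The second count is on the order of n, not n^2, so charging the worst
    // case to both sorts grossly overstates the cost of the whole sequence.
    return 0;
}
```

Under these assumptions, the first sort performs on the order of n^2/4 comparisons, whereas the second performs only a few multiples of n, which is the disparity that amortized analysis exploits when averaging over the whole sequence.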