Discrete Mathematics for Computer Science


308 CHAPTER 5 Analysis of Algorithms


Proof. This proof is left as an exercise for the reader. ∎

One might ask how much more efficient BubbleSort can be made. Further improve-
ments are possible. For example, if for an entire pass no calls to Swap have been made, then
the list is in order, and BubbleSort may stop. This modification makes some varieties of
BubbleSort very efficient for lists that are nearly in order before the sorting starts. However,
BubbleSort is inherently slow for arbitrary lists. More complicated analyses show that:

* If a list of N elements is ordered as in Example 1(a), any BubbleSort must make at least
  N • (N - 1)/2 calls to Swap. So, although such algorithms run faster on "nice" input,
  they run no faster on the "worst" input.
* If one considers all possible original orderings of a list of N distinct objects, then on
  average, any BubbleSort must make N • (N - 1)/4 calls to Swap, so the average is no
  better than half the worst-case time.
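The early-exit optimization and the swap counts above can be sketched in Python. This is a minimal illustration, not the text's own code; in particular, it assumes that the "worst" ordering of Example 1(a) is the reversed list (the example itself is not shown on this page):

```python
def bubble_sort_swaps(data):
    """Sort a copy of data with BubbleSort and return the number of Swap calls.

    Includes the optimization described above: if an entire pass makes
    no calls to Swap, the list is in order and sorting stops early.
    """
    a = list(data)
    swaps = 0
    for pass_end in range(len(a) - 1, 0, -1):
        swapped = False
        for i in range(pass_end):
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]  # the key operation: Swap
                swaps += 1
                swapped = True
        if not swapped:  # no swaps in a full pass: the list is sorted
            break
    return swaps

# A reversed list of N = 5 elements forces N * (N - 1) / 2 = 10 swaps;
# an already-sorted list costs one pass and zero swaps.
print(bubble_sort_swaps([5, 4, 3, 2, 1]))  # 10
print(bubble_sort_swaps([1, 2, 3, 4, 5]))  # 0
```

The early exit helps only the comparison count on nearly-sorted input; the number of Swap calls is fixed by the input's disorder, which is why no variant of BubbleSort can beat N • (N - 1)/2 swaps on the worst input.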

We discussed earlier one standard way to count how many key operations are per-
formed. For the moment, the reader just needs to be aware that this is not the only way.
Note that in our discussions of the complexity of selection (see Section 5.3.2) and of rep-
etition (see Section 5.3.3), we concluded only that the number of steps taken by the entire
algorithm was at most something. Some other argument might tell us that the number is,
indeed, far less. Return to the discussion of repetition: sometimes it turns out that although
in the i-th time through a loop we might execute a large number F_i(B_i) of key operations,
most of the times through the loop, on any given input data set, we execute far fewer than
F_i(B_i) key operations. In this case, the total time taken by the repetition may be far lower
than the bound that we calculated. We will not discuss this issue further at this point.
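A standard illustration of this phenomenon (not from the text, but in the same spirit) is the "next greater element" computation below. The per-iteration bound F_i(B_i) on the key operation, a pop, grows like i, so summing the bounds suggests roughly n(n - 1)/2 pops; yet each index is pushed and popped at most once, so the true total is at most n:

```python
def next_greater(values):
    """For each element, find the next strictly greater element to its right.

    In the i-th iteration the while loop could, in principle, pop up to i
    elements, so the per-iteration bound on the key operation (a pop) is
    about i.  But every index is pushed once and popped at most once, so
    the total number of pops over the whole run is at most len(values).
    """
    result = [None] * len(values)
    stack = []  # indices still waiting for a greater element
    pops = 0
    for i, v in enumerate(values):
        while stack and values[stack[-1]] < v:
            result[stack.pop()] = v  # key operation: one pop
            pops += 1
        stack.append(i)
    return result, pops

res, pops = next_greater([2, 1, 5, 3, 4, 8])
print(res)   # [5, 5, 8, 4, 8, None]
print(pops)  # 5 pops in total, far below the quadratic-looking bound
```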

5.3.5 Time Complexity of an Algorithm

After the preliminaries that show us how functions correspond to a way of measuring the
number of key operations executed in an algorithm, we return to the original goal of mea-
suring the time complexity of an algorithm. We will again use BubbleSort as an example,
both because it is easy to analyze and because the reader will be able to understand the
analysis without having to understand more complex code. The goal is to measure time
consumption not as a function of the particular input but, rather, as a function of the size
of the input.
Time complexity is measured by the number of times that the key operations are per-
formed, but how is the size of the input measured? For sorting algorithms, and for many
other algorithms, the size is usually taken to be the number of input data, that is, the
number of data to be sorted.
For a fixed input size n and a fixed set of key operations, we need to consider all pos-
sible input sets of that size. We must count the number of times that key operations are
executed by the algorithm on each possible set of size n. If there is a maximum possi-
ble (largest) number of steps, then that number is called the worst-case behavior of the
algorithm on input size n. If for each input size n there is a maximum possible num-
ber worst(n) of steps, then the function worst is called the worst-case complexity of
the algorithm. For example, the partially optimized BubbleSort presented earlier makes
at most n • (n - 1)/2 comparisons and at worst n • (n - 1)/2 calls to Swap.
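For small input sizes, the worst-case and average-case swap counts can be checked by brute force. The sketch below (an illustration, not the text's method) uses the fact that BubbleSort's number of Swap calls equals the number of inversions in the input, and enumerates every ordering of n = 4 distinct objects:

```python
from itertools import permutations

def swap_count(perm):
    """Number of Swap calls BubbleSort makes on perm.

    This equals the number of inversions: pairs (i, j) with i < j
    but perm[i] > perm[j], since each Swap removes exactly one inversion.
    """
    return sum(1
               for i in range(len(perm))
               for j in range(i + 1, len(perm))
               if perm[i] > perm[j])

n = 4
counts = [swap_count(p) for p in permutations(range(n))]

print(max(counts))                # worst case: n * (n - 1) / 2 = 6
print(sum(counts) / len(counts))  # average:    n * (n - 1) / 4 = 3.0
```

The average comes out to exactly half the worst case because, over all orderings, each of the n(n - 1)/2 pairs of elements is inverted in exactly half of the permutations.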