

Analysis of Algorithms


If you are given the choice of cleaning a room with a toothbrush or a broom, you probably would
choose the broom. Using a broom sounds like less work than using a toothbrush. True, if the
room is located in a dollhouse, it may be easier to use the toothbrush, but in general a broom is
the faster way to clean. If you are given the choice of adding numbers together with a pencil and
paper or a calculator, you would probably choose the calculator because it is usually less work. If
you are given the choice of walking or driving to a meeting, you would probably choose to drive;
it sounds like less work.
What do these examples have in common? And what do they have to do with computer
science? In each of the settings mentioned, one of the choices seems to involve significantly less
work. Measuring the amount of work precisely is difficult in each case, however, because there
are unknowns. How large is the room? How many numbers must be added? How far away is the
meeting? In each case, the unknown information relates to the size of the problem. With an
especially small problem (for example, adding 2 plus 2), our original guess at which approach to
take (using the calculator) might be wrong. Of course, our intuition is usually correct, because
most problems are reasonably large.
In computer science, we need a way to measure the amount of work done by an algorithm
relative to the size of a problem, because usually more than one algorithm is available to solve
any given problem. We often must choose the most efficient algorithm—that is, the algorithm
that does the least work for a problem of a given size.
The amount of work involved in executing an algorithm relative to the size of the problem is
the algorithm’s complexity. We would like to be able to look at an algorithm and determine its
complexity. Then we could take two algorithms that perform the same task and determine
which completes the task faster (requires less work).
How do we measure the amount of work required to execute an algorithm? We use the total
number of steps executed as a measure of work. One statement, such as an assignment,
may require only one step; another statement, such as a loop, may require many steps.
We define a step as any operation roughly equivalent in complexity to a comparison, an
I/O operation, or an assignment.
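For example, each statement in the following short fragment (a hypothetical sketch, not a program from this text) counts as roughly one step:

int total = 0;                     // assignment: one step
total = total + 5;                 // assignment: one step
System.out.println(total);         // output operation: one step

Executed one after another, these three simple statements amount to three steps.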
Given an algorithm with just a sequence of simple statements (no branches or
loops), the number of steps performed is directly related to the number of statements.
When we introduce branches, however, it becomes possible to skip some statements in
the algorithm. Branches allow us to subtract steps without physically removing them
from the algorithm because only one branch executes at a time. Because we usually
want to express work in terms of the worst-case scenario, we use the number of steps
in the longest branch.
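To see how a branch affects the count, consider this sketch (the variable names and values are made up for illustration); only one branch runs, so a worst-case count uses the comparison plus the longer branch:

int score = 42;                    // assignment: one step
int failures = 0;                  // assignment: one step
char grade;
if (score >= 60) {                 // comparison: one step
    grade = 'P';                   // shorter branch: one step
} else {
    grade = 'F';                   // one step
    failures = failures + 1;       // one step
    System.out.println("Fail");    // longer branch: three steps in all
}

In the worst case, the if statement contributes one step for the comparison plus three steps for the longer branch, not the steps of both branches combined.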
Now consider the effect of a loop. If a loop repeats a sequence of 15 simple statements 10
times, it performs 150 steps. Loops allow us to multiply the work done in an algorithm without
physically adding statements.
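The same idea can be sketched in code (again, a made-up fragment): a loop body containing two simple statements, executed 10 times, contributes roughly 2 × 10 = 20 steps, plus the bookkeeping of the loop itself:

int sum = 0;                                  // one step
for (int count = 1; count <= 10; count++) {   // comparison and increment on each pass
    sum = sum + count;                        // one step per pass
    System.out.println(sum);                  // one step per pass
}

Just as in the 15-statement example above, the work of the loop body is multiplied by the number of repetitions.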
Now that we have a measure for the work done in an algorithm, we can compare algorithms.
Suppose, for example, that Algorithm A always executes 3,124 steps and Algorithm B always
does the same task in 1,321 steps. Then we can say that Algorithm B is more efficient—that is, it
takes fewer steps to accomplish the same task.

Complexity  A measure of the effort expended by the computer in performing a
computation, relative to the size of the computation