Now suppose that for each of the N outer loop iterations the inner loop performs N steps (L = N). Here the formula for the total steps is

(S2 × N × N) + (S1 × N) + S0

or

(S2 × N^2) + (S1 × N) + S0

Because N^2 grows much faster than N (for large values of N), the inner loop term (S2 × N^2) accounts for the majority of steps executed and the work done. Thus the corresponding execution time is essentially proportional to N^2. Mathematicians call this type of formula quadratic. If we have a doubly nested loop, where each loop depends on N, then the complexity expression is

(S3 × N^3) + (S2 × N^2) + (S1 × N) + S0

and the work and time are proportional to N^3 whenever N is reasonably large. Such a formula is called cubic.
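To see where these formulas come from in actual code, here is a minimal C++ sketch (our illustration, not an example from the text) that counts the steps performed by a loop with one nested inner loop. The names N, s0, s1, and s2 are hypothetical stand-ins for the problem size and the constant step counts S0, S1, and S2; adding one more level of nesting would produce the cubic formula in the same way.

#include <iostream>

int main()
{
    const long N  = 100;   // problem size (assumed value for this demonstration)
    const long s0 = 3;     // steps executed outside any loop (S0)
    const long s1 = 2;     // steps executed once per outer-loop iteration (S1)
    const long s2 = 1;     // steps executed once per inner-loop iteration (S2)

    long steps = s0;
    for (long i = 0; i < N; i++)          // outer loop: N iterations
    {
        steps += s1;
        for (long j = 0; j < N; j++)      // inner loop: N iterations per outer pass
            steps += s2;
    }

    // The count matches the quadratic formula (S2 × N × N) + (S1 × N) + S0.
    std::cout << "counted steps: " << steps << '\n';
    std::cout << "formula value: " << (s2 * N * N) + (s1 * N) + s0 << '\n';
    return 0;
}

With N set to 100, both output lines print 10,203; a third loop level would make the leading term proportional to N^3.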
The following table shows the number of steps required for each increase in the exponent of N, where N is a size factor for the problem, such as the number of input values.
N         N^0           N^1          N^2                  N^3
          (Constant)    (Linear)     (Quadratic)          (Cubic)
1         1             1            1                    1
10        1             10           100                  1,000
100       1             100          10,000               1,000,000
1,000     1             1,000        1,000,000            1,000,000,000
10,000    1             10,000       100,000,000          1,000,000,000,000
100,000   1             100,000      10,000,000,000       1,000,000,000,000,000
As you can see, each time the exponent increases by 1, the number of steps is multiplied by an additional order of magnitude (factor of 10). That is, if N is made 10 times greater, the work involved in an N^2 algorithm increases by a factor of 100, and the work involved in an N^3 algorithm increases by a factor of 1,000. To put this idea in more concrete terms, an algorithm with a doubly nested loop, in which each loop depends on the number of data values, takes 1,000 steps for 10 input values and 1 quadrillion steps for 100,000 values. On a computer that executes 1 billion instructions per second, the latter case would take more than 10 days to run.
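The growth pattern in the table is easy to reproduce. The short C++ sketch below (again our illustration, with an assumed upper limit of 100,000) prints N, N^2, and N^3 for each power-of-ten value of N, so the factor-of-100 and factor-of-1,000 jumps can be read directly off the output.

#include <iomanip>
#include <iostream>

int main()
{
    std::cout << std::setw(8)  << "N"
              << std::setw(16) << "N^2"
              << std::setw(22) << "N^3" << '\n';

    // Powers of ten from 1 to 100,000, matching the rows of the table above.
    for (long long n = 1; n <= 100000; n *= 10)
    {
        std::cout << std::setw(8)  << n
                  << std::setw(16) << n * n
                  << std::setw(22) << n * n * n << '\n';
    }
    return 0;
}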
The table also shows that the steps outside the innermost loop account for an insignificant portion of the total number of steps as N gets bigger. Because the innermost loop dominates the total time, we classify the complexity of an algorithm according to the highest order of N that appears in its complexity expression, called the order of magnitude, or simply the order, of that expression. Thus we talk about algorithms having “order N squared complexity” (or “cubed,” and so on), or we describe them with what is called Big-O notation. In Big-O notation, we express the complexity by putting the highest-order term in parentheses with a capital O in front. For example, O(1) is constant time; O(N) is linear time; O(N^2) is quadratic time; and O(N^3) is cubic time.
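As a final illustration (ours, with hypothetical function names), the C++ functions below fall into each of these orders; the classification comes straight from the loop structure, exactly as described above.

#include <cstddef>
#include <iostream>
#include <vector>

// O(1): constant time -- the same few steps no matter how large v is.
int firstElement(const std::vector<int>& v)
{
    return v.empty() ? 0 : v[0];
}

// O(N): linear time -- a single loop over the N elements.
long sumAll(const std::vector<int>& v)
{
    long sum = 0;
    for (int x : v)
        sum += x;
    return sum;
}

// O(N^2): quadratic time -- a loop nested inside a loop, N * N iterations.
long countPairs(const std::vector<int>& v)
{
    long count = 0;
    for (std::size_t i = 0; i < v.size(); i++)
        for (std::size_t j = 0; j < v.size(); j++)
            count++;
    return count;
}

// O(N^3): cubic time -- a doubly nested loop (three levels), N * N * N iterations.
long countTriples(const std::vector<int>& v)
{
    long count = 0;
    for (std::size_t i = 0; i < v.size(); i++)
        for (std::size_t j = 0; j < v.size(); j++)
            for (std::size_t k = 0; k < v.size(); k++)
                count++;
    return count;
}

int main()
{
    std::vector<int> data{1, 2, 3, 4};
    std::cout << firstElement(data) << ' ' << sumAll(data) << ' '
              << countPairs(data) << ' ' << countTriples(data) << '\n';
    return 0;
}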