3.13 Growth Rates Review
Note: This section is work in progress
Two functions f(n) and g(n) have different growth rates if, as n goes to infinity, their ratio either goes to infinity or goes to zero.
Figure: Growth rates
Two views of a graph illustrating the growth rates for six equations. The bottom view shows in detail the lower-left portion of the top view. The horizontal axis represents input size. The vertical axis can represent time, space, or any other measure of cost.
Where does go on here?
Exact equations relating program operations to running time require machine-dependent constants. Sometimes, the equation for exact running time is complicated to compute. Usually, we are satisfied with knowing an approximate growth rate.
Example: Given two algorithms with growth rates c1·n and c2·n^2, do we need to know the values of c1 and c2?
Consider c1·n and c2·n^2. PROVE that c2·n^2 must eventually become (and remain) bigger.
Proof by Contradiction: Assume there are some values for constants c1, c2, and n0 such that, for all values of n > n0, c1·n > c2·n^2. Then, c1/(c2·n) > 1. But, as n grows, what happens to c1/(c2·n)? It goes to zero.
Since c2·n grows toward infinity while c1 stays fixed, the ratio cannot remain above 1, so the assumption must be false. Conclusion: In the limit, as n goes to infinity, constants don't matter. Limits are the typical way to prove that one function grows faster than another.
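The point of the proof can be checked numerically. A minimal sketch, with made-up constants c1 and c2 chosen deliberately to favor the linear algorithm:

```python
# Hypothetical constants, deliberately chosen to favor the linear algorithm.
c1, c2 = 1000.0, 0.001

def linear_cost(n):
    return c1 * n

def quadratic_cost(n):
    return c2 * n * n

# Solving c1*n = c2*n^2 gives the crossover point n = c1/c2.
crossover = c1 / c2  # 1,000,000 here

# Below the crossover the quadratic algorithm is actually cheaper...
assert quadratic_cost(crossover / 10) < linear_cost(crossover / 10)
# ...but past it the quadratic cost dominates, and the gap only widens.
assert quadratic_cost(crossover * 10) > linear_cost(crossover * 10)
```

No matter how lopsided the constants, the crossover point always exists; constants only move it.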
Here are some useful observations.
Since n^2 grows faster than n,
- 2^(n^2) grows faster than 2^n. (Take antilog of both sides.)
- n^4 grows faster than n^2. (Square both sides.)
- n grows faster than sqrt(n). (sqrt(n)^2 = n. Replace n with sqrt(n).)
- 2 log n grows no slower than log n. (Take log of both sides. Log “flattens” growth rates.)
Since 2^n grows faster than n^2,
- (2^n)! grows faster than (n^2)!. (Apply factorial to both sides.)
- 2^(2^n) grows faster than 2^(n^2). (Take antilog of both sides.)
- 4^n grows faster than n^4. (Square both sides.)
- 2^(n/2) grows faster than n. (Take square root of both sides.)
- n grows no slower than 2 log n. (Take log of both sides. Actually, it grows faster, since n/log n goes to infinity.)
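These relationships can be spot-checked numerically. The sketch below is a sanity check, not a proof: when f grows strictly faster than g, the sampled ratio f(n)/g(n) should keep increasing.

```python
import math

# Each pair is (faster, slower), taken from the observations above.
pairs = [
    (lambda n: n ** 4, lambda n: n ** 2),    # square both sides of n^2 vs n
    (lambda n: n, lambda n: math.sqrt(n)),   # replace n with sqrt(n)
    (lambda n: 4 ** n, lambda n: n ** 4),    # square both sides of 2^n vs n^2
]
for fast, slow in pairs:
    ratios = [fast(n) / slow(n) for n in (10, 20, 40)]
    # The ratio keeps increasing, consistent with a faster growth rate.
    assert ratios[0] < ratios[1] < ratios[2]

# By contrast, 2*log n vs log n: the ratio is a constant 2 ("log flattens").
assert all(2 * math.log(n) / math.log(n) == 2 for n in (10, 100, 1000))
```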
If f(n) grows faster than g(n), then must sqrt(f(n)) grow faster than sqrt(g(n))? Yes.
Must log f(n) grow faster than log g(n)? Not necessarily. For example, n^2 grows faster than n, yet log(n^2) = 2 log n is within a constant factor of log n, that is, the growth rate is the same!
log n is related to n in exactly the same way that n is related to 2^n.
2^(log n) = n.
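A quick numeric check of that identity (Python, base-2 logs):

```python
import math

# 2^(log2 n) recovers n, which is the sense in which log n stands to n
# exactly as n stands to 2^n.
for n in (8, 1024, 10**6):
    assert math.isclose(2 ** math.log2(n), n)
```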
3.13.1 Asymptotic Notation
Name | Notation | Definition |
---|---|---|
Little-O | f(n) = o(g(n)) | lim (n → ∞) f(n)/g(n) = 0 |
Big-O | f(n) = O(g(n)) | f(n) ≤ c·g(n) for some constant c > 0 and all n > n0 |
Theta | f(n) = Θ(g(n)) | f(n) = O(g(n)) and f(n) = Ω(g(n)) |
Big-Omega | f(n) = Ω(g(n)) | f(n) ≥ c·g(n) for some constant c > 0 and all n > n0 |
Little-Omega | f(n) = ω(g(n)) | lim (n → ∞) f(n)/g(n) = ∞ |
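The limit-based reading of these definitions can be probed empirically. The sketch below is a heuristic, not a proof: it samples f(n)/g(n) at growing n and guesses which relation holds (the thresholds 0.01 and 100 are arbitrary assumptions).

```python
def probe(f, g, ns=(10**3, 10**4, 10**5, 10**6)):
    """Guess the asymptotic relation of f to g from sampled ratios."""
    ratios = [f(n) / g(n) for n in ns]
    if ratios[-1] < 0.01 and ratios[-1] < ratios[0]:
        return "little-o"      # ratio heading toward 0
    if ratios[-1] > 100 and ratios[-1] > ratios[0]:
        return "little-omega"  # ratio heading toward infinity
    return "theta (maybe)"     # ratio looks bounded away from 0 and infinity

assert probe(lambda n: n, lambda n: n * n) == "little-o"
assert probe(lambda n: n * n, lambda n: n) == "little-omega"
assert probe(lambda n: 3 * n + 7, lambda n: n) == "theta (maybe)"
```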
I prefer “f(n) is in O(g(n))” to “f(n) = O(g(n))”. While n = O(n^2) and n^2 = O(n^2), n ≠ n^2.
Note: Big-O does not say how good an algorithm is – only how bad it can be.
If Algorithm A is O(n^2) and Algorithm B is O(n^2), is A better than B? Perhaps... but perhaps better analysis will show that A is Θ(n) while B is Θ(n^2).
Order notation has practical limits. Notation: O vs. Ω vs. Θ.
O(g(n)) gives only an upper bound on growth.
Ω(g(n)) gives only a lower bound on growth.
Θ(g(n)) gives a tight bound: both O(g(n)) and Ω(g(n)).
Statement: Resource requirements for Algorithm A grow slower than resource requirements for Algorithm B.
Is Algorithm A better than Algorithm B?
Potential problems:
- How big must the input be?
- Some growth rate differences are trivial. Example: log n vs. n^0.1 (base-2 logs). If n is 10^12, then log n is about 40 while n^0.1 is about 16, even though n^0.1 grows faster than log n. n must be enormous (like 10^18) for n^0.1 to be bigger than log n.
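As an illustration of a trivial growth-rate difference, take the pair log2(n) vs. n^0.1: the power function wins asymptotically, but a simple doubling search shows the crossover sits at an astronomical input size.

```python
import math

# n^0.1 eventually beats log2(n), but only for enormous n.
# Doubling search for the crossover (start at 16 to skip the
# tiny-n region where n^0.1 briefly exceeds log2(n)).
n = 16
while n ** 0.1 <= math.log2(n):
    n *= 2

assert n ** 0.1 > math.log2(n)
assert n > 10**15  # far beyond any input a real program will ever see
```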
It is not always practical to reduce an algorithm’s growth rate. “Practical” here means that the constant factors might become much higher when we shave off a minor asymptotic term.
Shaving a factor of n reduces the cost by a factor of a million for an input size of a million. Shaving a factor of log log n saves only a factor of 4-5.
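To make those numbers concrete (assuming n of one million and base-2 logs):

```python
import math

n = 10**6

# Improving an n^2 algorithm to n removes a factor of n: a millionfold saving.
assert (n * n) / n == n

# Improving n*log(log n) to n removes only a factor of log log n: about 4.3.
saving = math.log2(math.log2(n))
assert 4 < saving < 5
```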
There is the concept of a “practicality window”. In general:
1. We have limited time to solve a problem, and
2. input can only get so big before the computer chokes.
Fortunately, algorithm growth rates are USUALLY well behaved, so order notation gives practical indications. “Practical” is the keyword. We use asymptotics because they provide a simple model that usually mirrors reality, which is useful to simplify our thinking.