3.13 Growth Rates Review

Note: This section is work in progress

Two functions of $n$ have different growth rates if, as $n$ goes to infinity, their ratio either goes to infinity or goes to zero.
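As a quick illustration, here is a minimal Python sketch (not part of the original text) of that ratio test:

```python
# A minimal sketch: compare growth rates by watching the ratio f(n)/g(n)
# as n grows. If the ratio blows up, f grows faster; if it goes to zero,
# g grows faster; if it levels off at a constant, the rates are the same.

def ratios(f, g, ns):
    return [(n, f(n) / g(n)) for n in ns]

ns = [10, 100, 1_000, 10_000, 100_000]

# n^2 vs. 3n: the ratio n^2/(3n) = n/3 goes to infinity.
for n, r in ratios(lambda n: n**2, lambda n: 3 * n, ns):
    print(f"n = {n:>6}: n^2 / 3n = {r:g}")
```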

Figure: Growth rates for six equations. Two views of the same graph; the bottom view shows in detail the lower-left portion of the top view. The horizontal axis represents input size. The vertical axis can represent time, space, or any other measure of cost.

Where does $(1.618)^n$ go on here?

Exact equations relating program operations to running time require machine-dependent constants. Sometimes, the equation for exact running time is complicated to compute. Usually, we are satisfied with knowing an approximate growth rate.

Example: Given two algorithms with growth rates $c_1 n$ and $c_2 2^{n!}$, do we need to know the values of $c_1$ and $c_2$?
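No: whatever the constants are, the $2^{n!}$ term takes over almost immediately. A quick check, with deliberately lopsided (and entirely arbitrary) constants:

```python
import math

# Even a huge c1 and a tiny c2 cannot stop c2 * 2^(n!) from overtaking c1 * n.
c1, c2 = 1_000_000.0, 1e-6
for n in range(1, 6):  # stop at n = 5; 2^(6!) already overflows a float
    lhs, rhs = c1 * n, c2 * 2.0 ** math.factorial(n)
    print(f"n = {n}: c1*n = {lhs:.3g}, c2*2^(n!) = {rhs:.3g}, overtaken = {lhs < rhs}")
```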

Consider $n^2$ and $3n$. PROVE that $n^2$ must eventually become (and remain) bigger.

Proof by Contradiction: Assume there are values for constants $r$ and $s$ such that, for all values of $n$, $n^2 < rn + s$. Dividing through by $n$ gives $n < r + s/n$. But as $n$ grows, what happens to $s/n$? It goes to zero.

Since the left side $n$ grows toward infinity while the right side approaches the constant $r$, the inequality must eventually fail, so the assumption is false. Conclusion: In the limit, as $n \rightarrow \infty$, constants don't matter. Limits are the typical way to prove that one function grows faster than another.
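A concrete check of the $n^2$ vs. $3n$ claim (a small sketch; the helper name is my own):

```python
def crossover(f, g, limit=10_000):
    """Return the first n in [1, limit] with f(n) > g(n), else None."""
    return next((n for n in range(1, limit + 1) if f(n) > g(n)), None)

n0 = crossover(lambda n: n**2, lambda n: 3 * n)
print(n0)  # 4: and since n^2 / (3n) = n/3 keeps growing, n^2 stays bigger
assert all(n**2 > 3 * n for n in range(n0, 10_000))
```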

Here are some useful observations.

Since $n^2$ grows faster than $n$, $2^{n^2}$ grows faster than $2^n$.

Since $n!$ grows faster than $2^n$, $\log(n!)$ grows faster than $\log(2^n) = n$.

If $f$ grows faster than $g$, then must $\sqrt{f}$ grow faster than $\sqrt{g}$? Yes.

Must $\log f$ grow faster than $\log g$? No. $\log n \approx \log n^2$ within a constant factor (since $\log n^2 = 2 \log n$), that is, the growth rate is the same!

$\log n$ is related to $n$ in exactly the same way that $n$ is related to $2^n$.

$2^{\log n} = n$.
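A quick numeric sanity check of that inverse relationship (base 2; a throwaway sketch):

```python
import math

# log2 and 2** are inverses: 2^(log2 n) == n, and log2(2^n) == n.
print(2 ** math.log2(1024))   # 1024.0
print(math.log2(2 ** 20))     # 20.0

# So log n sits below n exactly as n sits below 2^n:
# exponentiating log n gives n, and exponentiating n gives 2^n.
for n in [4, 16, 64]:
    print(n, math.log2(n), 2**n)
```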

3.13.1 Asymptotic Notation

  • Little-O: $f(n) \in o(g(n))$ means $|f(n)| < k \cdot g(n)$ for every constant $k > 0$, for all sufficiently large $n$.
  • Big-O: $f(n) \in O(g(n))$ means $|f(n)| \leq k \cdot g(n)$ for some constant $k > 0$, for all sufficiently large $n$.
  • Theta: $f(n) \in \Theta(g(n))$ means $k_1 \cdot g(n) \leq |f(n)| \leq k_2 \cdot g(n)$ for some constants $k_1, k_2 > 0$, for all sufficiently large $n$.
  • Big-Omega: $f(n) \in \Omega(g(n))$ means $f(n) \geq k \cdot g(n)$ for some constant $k > 0$, for all sufficiently large $n$.
  • Little-Omega: $f(n) \in \omega(g(n))$ means $f(n) > k \cdot g(n)$ for every constant $k > 0$, for all sufficiently large $n$.
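To make the quantifiers concrete, here is a hedged sketch (the helper is hypothetical, not a standard API) that tests a Big-O claim over a finite range. It can falsify a claim but, being finite, can never prove one:

```python
def holds_up_to(f, g, k, n0, limit=100_000):
    """Check |f(n)| <= k * g(n) for all n in [n0, limit]."""
    return all(abs(f(n)) <= k * g(n) for n in range(n0, limit + 1))

# 3n + 7 is in O(n): witness constants k = 4, n0 = 7 work.
print(holds_up_to(lambda n: 3 * n + 7, lambda n: n, k=4, n0=7))    # True

# n^2 is not in O(n): no constant works; k = 1000 already fails by n = 1001.
print(holds_up_to(lambda n: n * n, lambda n: n, k=1000, n0=1))     # False
```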

I prefer "$f \in O(n^2)$" to "$f = O(n^2)$". While $n \in O(n^2)$ and $n^2 \in O(n^2)$, $O(n) \neq O(n^2)$.

Note: Big-O does not say how good an algorithm is, only how bad it can be.

If $\mathcal{A} \in O(n)$ and $\mathcal{B} \in O(n^2)$, is $\mathcal{A}$ better than $\mathcal{B}$? Perhaps... but perhaps better analysis will show that $\mathcal{A} \in \Theta(n)$ while $\mathcal{B} \in \Theta(\log n)$.

Order Notation has practical limits. Notation: $\log n^2$ $(= 2 \log n)$ vs. $\log^2 n$ $(= (\log n)^2)$ vs. $\log \log n$. (Logs here are base 2.)

$\log 16^2 = 2 \log 16 = 8$.

$\log^2 16 = 4^2 = 16$.

$\log \log 16 = \log 4 = 2$.
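The same three values, checked numerically (a quick sketch, logs base 2):

```python
import math

n = 16
print(math.log2(n**2))          # 8.0  = 2 * log2(16)
print(math.log2(n) ** 2)        # 16.0 = 4^2
print(math.log2(math.log2(n)))  # 2.0  = log2(4)
```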

Statement: Resource requirements for Algorithm $\mathcal{A}$ grow slower than resource requirements for Algorithm $\mathcal{B}$.

Is $\mathcal{A}$ better than $\mathcal{B}$?

Potential problems:

  • How big must the input be?
  • Some growth rate differences are trivial. Example: $\Theta(\log^2 n)$ vs. $\Theta(n^{1/10})$. If $n$ is $10^{12}$ $(\approx 2^{40})$, then $\log^2 n \approx 1600$ while $n^{1/10} = 16$, even though $n^{1/10}$ grows faster than $\log^2 n$. $n$ must be enormous (like $2^{150}$) for $n^{1/10}$ to be bigger than $\log^2 n$; see the sketch after this list.
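Here is that comparison as a sketch, including a hunt for the crossover point. Writing $n = 2^k$, it searches from $k = 2$ upward (skipping the tiny-$n$ region where $n^{1/10}$ is also briefly ahead) and lands in the ballpark quoted above:

```python
import math

n = 10**12
print(math.log2(n) ** 2)  # ~1589: log^2 n
print(n ** 0.1)           # ~15.8: n^(1/10)

# Smallest k >= 2 with 2^(k/10) > k^2, i.e. where n^(1/10) finally overtakes.
k = 2
while 2 ** (k / 10) <= k ** 2:
    k += 1
print(k)  # 144: n must be around 2^144 before n^(1/10) wins for good
```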

It is not always practical to reduce an algorithm's growth rate. "Practical" here means that the constants might become much larger when we shave off the minor asymptotic growth.

Shaving a factor of $n$ reduces cost by a factor of a million for an input size of a million. Shaving a factor of $\log \log n$ saves only a factor of 4-5.
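The arithmetic behind those two savings, for an input size of a million:

```python
import math

n = 1_000_000
print(n)                        # shaving a factor of n: a million-fold saving
print(math.log2(math.log2(n))) # ~4.3: shaving log log n saves a factor of 4-5
```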

There is the concept of a “Practicality Window”. In general, (1) we have limited time to solve a problem, and (2) input can only get so big before the computer chokes. Fortunately, algorithm growth rates are USUALLY well behaved, so that Order Notation gives practical indications. “Practical” is the keyword. We use asymptotics because they provide a simple model that usually mirrors reality. This is useful to simplify our thinking.