3.6 Lower Bounds and Θ Notation

3.6.1 Lower Bounds

Big-Oh notation describes an upper bound. In other words, big-Oh notation states a claim about the greatest amount of some resource (usually time) that is required by an algorithm for some class of inputs of size n (typically the worst such input, the average of all possible inputs, or the best such input).

Similar notation is used to describe the least amount of a resource that an algorithm needs for some class of input. Like big-Oh notation, this is a measure of the algorithm’s growth rate. Like big-Oh notation, it works for any resource, but we most often measure the least amount of time required. And again, like big-Oh notation, we are measuring the resource required for some particular class of inputs: the worst-, average-, or best-case input of size n.

The lower bound for an algorithm (or a problem, as explained later) is denoted by the symbol Ω, pronounced “big-Omega” or just “Omega”. The following definition for Ω is symmetric with the definition of big-Oh.

For T(n) a non-negatively valued function, T(n) is in set Ω(g(n)) if there exist two positive constants c and n₀ such that T(n) ≥ cg(n) for all n > n₀.

Example: Quadratic algorithm

Assume T(n) = c₁n² + c₂n for constants c₁, c₂ > 0. Then,

c_1 n^2 + c_2 n \geq c_1 n^2

for all n > 1. So, T(n) ≥ cn² for c = c₁ and n₀ = 1. Therefore, T(n) is in Ω(n²) by the definition.

It is also true that the equation of the example above is in Ω(n). However, as with big-Oh notation, we wish to get the “tightest” (for Ω notation, the largest) bound possible. Thus, we prefer to say that this running time is in Ω(n²).
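The inequality above is easy to check numerically. The sketch below uses arbitrary illustrative constants (c₁ = 3, c₂ = 5 are our choices, not fixed by the text) and confirms that T(n) never drops below c₁n² past n₀ = 1:

```python
# Numeric sanity check of the lower-bound argument, with arbitrary
# illustrative constants c1 = 3 and c2 = 5.

def T(n, c1=3, c2=5):
    return c1 * n**2 + c2 * n

# T(n) >= c1 * n^2 holds for every n > n0 = 1 (checked here up to 10^4).
assert all(T(n) >= 3 * n**2 for n in range(2, 10_001))
```

Of course, a finite check is not a proof; the algebra above is what establishes the bound for all n > 1.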

Recall the sequential search algorithm to find a value K within an array of integers. In the average and worst cases this algorithm is in Ω(n), because in both the average and worst cases we must examine at least cn values (where c is 1/2 in the average case and 1 in the worst case).
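For reference, here is a minimal sketch of sequential search (the function name and signature are our own, not fixed by the text):

```python
def sequential_search(arr, key):
    """Return the index of key in arr, or -1 if key is absent.

    The worst case examines all n elements; a successful search
    examines about n/2 on average -- the source of the Omega(n) bound.
    """
    for i, value in enumerate(arr):
        if value == key:
            return i
    return -1
```

For example, `sequential_search([7, 3, 9, 4], 9)` returns 2, after examining three elements.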

3.6.1.1 Alternative definition for Ω

An alternate (non-equivalent) definition for Ω is

T(n) is in the set Ω(g(n)) if there exists a positive constant c such that T(n) ≥ cg(n) for an infinite number of values of n.

This definition says that for an “interesting” number of cases, the algorithm takes at least cg(n) time. Note that this definition is not symmetric with the definition of big-Oh. For g(n) to be a lower bound, this definition does not require that T(n) ≥ cg(n) for all values of n greater than some constant. It only requires that this happen often enough, in particular that it happen for an infinite number of values of n. Motivation for this alternate definition can be found in the following example.

Assume a particular algorithm has the following behavior:

\mathbf{T}(n) = \left\{ \begin{array}{ll} n & \mbox{for all odd}\ n \geq 1\\ n^2/100 & \mbox{for all even}\ n \geq 0 \end{array} \right.

From this definition, n²/100 ≥ (1/100)n² for all even n ≥ 0. So, T(n) ≥ cn² for an infinite number of values of n (i.e., for all even n) with c = 1/100. Therefore, T(n) is in Ω(n²) by this definition.

For this equation for T(n), it is true that all inputs of size n take at least cn time. But an infinite number of inputs of size n take cn² time, so we would like to say that the algorithm is in Ω(n²). Unfortunately, using our first definition will yield a lower bound of Ω(n) because it is not possible to pick constants c and n₀ such that T(n) ≥ cn² for all n > n₀. The alternative definition does result in a lower bound of Ω(n²) for this algorithm, which seems to fit common sense more closely. Fortunately, few real algorithms or computer programs display the pathological behavior of this example. Our first definition for Ω generally yields the expected result.
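A direct transcription of the piecewise T(n) makes this behavior easy to inspect (the function name T is simply the symbol from the equation above):

```python
def T(n):
    # Piecewise cost from the example: linear for odd n, quadratic for even n.
    return n if n % 2 == 1 else n**2 / 100

# Odd inputs stay linear while even inputs grow quadratically, so
# T(n) >= (1/100) * n^2 holds for infinitely many n (all even n),
# but fails for arbitrarily large odd n -- no cutoff n0 works.
assert all(T(n) >= n**2 / 100 for n in range(0, 1000, 2))
assert T(1001) == 1001 and T(1000) == 10000.0
```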

As you can see from this discussion, asymptotic bounds notation is not a law of nature. It is merely a powerful modeling tool used to describe the behavior of algorithms.

3.6.2 Theta Notation

The definitions for big-Oh and Ω give us ways to describe the upper bound for an algorithm (if we can find an equation for the maximum cost of a particular class of inputs of size n) and the lower bound for an algorithm (if we can find an equation for the minimum cost for a particular class of inputs of size n). When the upper and lower bounds are the same within a constant factor, we indicate this by using Θ (big-Theta) notation. An algorithm is said to be Θ(h(n)) if it is in O(h(n)) and it is in Ω(h(n)). Note that we drop the word “in” for Θ notation, because there is a strict equality between two equations with the same Θ. In other words, if f(n) is Θ(g(n)), then g(n) is Θ(f(n)).

Because the sequential search algorithm is both in O(n) and in Ω(n) in the average case, we say it is Θ(n) in the average case.

Given an algebraic equation describing the time requirement for an algorithm, the upper and lower bounds always meet. That is because in some sense we have a perfect analysis for the algorithm, embodied by the running-time equation. For many algorithms (or their instantiations as programs), it is easy to come up with the equation that defines their runtime behavior. The analysis for most commonly used algorithms is well understood and we can almost always give a Θ analysis for them. However, the class of NP-Complete problems all have no definitive Θ analysis, just some unsatisfying big-Oh and Ω analyses. Even some “simple” programs are hard to analyze. Nobody currently knows the true upper or lower bounds for the following code fragment.

while n > 1:
    if n % 2 == 1:      # n is odd
        n = 3 * n + 1
    else:
        n = n // 2      # n is even: halve it (integer division)
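This is the famous Collatz (3n + 1) iteration. Wrapped as a function that counts loop iterations (collatz_steps is our own helper name, not from the text), it can at least be run and observed, even though no one can bound its running time in general:

```python
def collatz_steps(n):
    """Count iterations of the 3n+1 loop until n reaches 1."""
    steps = 0
    while n > 1:
        if n % 2 == 1:
            n = 3 * n + 1
        else:
            n = n // 2
        steps += 1
    return steps
```

For example, `collatz_steps(6)` returns 8, following the sequence 6, 3, 10, 5, 16, 8, 4, 2, 1. Whether the loop terminates for every positive n is an open problem.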

While some textbooks and programmers will casually say that an algorithm is “order of” or “big-Oh” of some cost function, it is generally better to use Θ notation rather than big-Oh notation whenever we have sufficient knowledge about an algorithm to be sure that the upper and lower bounds indeed match. OpenDSA modules use Θ notation in preference to big-Oh notation whenever our state of knowledge makes that possible. Limitations on our ability to analyze certain algorithms may require use of big-Oh or Ω notations. In rare occasions when the discussion is explicitly about the upper or lower bound of a problem or algorithm, the corresponding notation will be used in preference to Θ notation.

3.6.3 Classifying Functions

Given functions f(n) and g(n) whose growth rates are expressed as algebraic equations, we might like to determine if one grows faster than the other. The best way to do this is to take the limit of the two functions as n grows towards infinity,

\lim_{n \rightarrow \infty} \frac{f(n)}{g(n)}

If the limit goes to ∞, then f(n) is in Ω(g(n)) because f(n) grows faster. If the limit goes to zero, then f(n) is in O(g(n)) because g(n) grows faster. If the limit goes to some constant other than zero, then f(n) = Θ(g(n)) because both grow at the same rate.
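The limit test can be approximated numerically by sampling the ratio f(n)/g(n) at increasing values of n. This is only a heuristic sketch (the `ratios` helper is our own), not a substitute for evaluating the limit analytically:

```python
import math

def ratios(f, g, ns=(10**3, 10**6, 10**9)):
    """Sample f(n)/g(n) at a few large n to guess the limit's behavior."""
    return [f(n) / g(n) for n in ns]

# n^2 versus 2 n log n: the sampled ratio keeps growing, which is
# consistent with the limit going to infinity, i.e. n^2 in Omega(2 n log n).
r = ratios(lambda n: n**2, lambda n: 2 * n * math.log(n))
assert r[0] < r[1] < r[2]
```

A ratio that shrinks toward zero would instead suggest f(n) is in O(g(n)), and one that settles near a nonzero constant would suggest Θ.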

Example: Comparing two functions

If f(n) = n² and g(n) = 2n log n, is f(n) in O(g(n)), Ω(g(n)), or Θ(g(n))? Since

\frac{n^2}{2n\log n} = \frac{n}{2\log n}

we easily see that

\lim_{n \rightarrow \infty} \frac{n^2}{2n\log n} = \infty

because n grows faster than 2 log n. Thus, n² is in Ω(2n log n).

3.6.4 Practice questions: Lower bounds

Answer TRUE or FALSE.

Big-Theta notation (Θ) defines an equivalence relation on the set of functions.

  • An equivalence relation is a relation that is reflexive, symmetric, and transitive.
  • Big-Theta notation is like =.
  • = is an equivalence relation.

Answer TRUE or FALSE.

The Sequential Search algorithm is Θ(n²).

  • Recall that Θ(n²) means that all program inputs (beyond a certain, small size) run within a constant factor of n².
  • Theta means that the program is not growing too much slower, nor too much faster, than the claimed growth rate.
  • Sequential search grows much slower than n².

Determine the proper relationship between the following pair of functions.

\begin{eqnarray} f(n) &=& \log n^2 \\ g(n) &=& \log n + 5 \end{eqnarray}

  • if lim f(n)/g(n) → 0, then f(n) is in O(g(n)).
  • if lim f(n)/g(n) → constant, then f(n) is Θ(g(n)).
  • if lim f(n)/g(n) → ∞, then f(n) is in Ω(g(n)).

Determine the proper relationship between the following pair of functions.

\begin{eqnarray} f(n) &=& \sqrt n \\ g(n) &=& \log n^2 \end{eqnarray}

  • f(n) is Θ(g(n))
  • f(n) is in O(g(n))
  • if lim f(n)/g(n) → 0, then f(n) is in O(g(n)).
  • if lim f(n)/g(n) → constant, then f(n) is Θ(g(n)).
  • if lim f(n)/g(n) → ∞, then f(n) is in Ω(g(n)).

Determine the proper relationship between the following pair of functions.

\begin{eqnarray} f(n) &=& \log^2 n \\ g(n) &=& \log n \end{eqnarray}

  • if lim f(n)/g(n) → 0, then f(n) is in O(g(n)).
  • if lim f(n)/g(n) → constant, then f(n) is Θ(g(n)).
  • if lim f(n)/g(n) → ∞, then f(n) is in Ω(g(n)).

Determine the proper relationship between the following pair of functions.

\begin{eqnarray} f(n) &=& n \\ g(n) &=& \log^2 n \end{eqnarray}

  • if lim f(n)/g(n) → 0, then f(n) is in O(g(n)).
  • if lim f(n)/g(n) → constant, then f(n) is Θ(g(n)).
  • if lim f(n)/g(n) → ∞, then f(n) is in Ω(g(n)).

Determine the proper relationship between the following pair of functions.

\begin{eqnarray} f(n) &=& n \log n + n \\ g(n) &=& \log n \end{eqnarray}

  • if lim f(n)/g(n) → 0, then f(n) is in O(g(n)).
  • if lim f(n)/g(n) → constant, then f(n) is Θ(g(n)).
  • if lim f(n)/g(n) → ∞, then f(n) is in Ω(g(n)).

Determine the proper relationship between the following pair of functions.

\begin{eqnarray} f(n) &=& \log n^2 \\ g(n) &=& (\log n)^2 \end{eqnarray}

  • if lim f(n)/g(n) → 0, then f(n) is in O(g(n)).
  • if lim f(n)/g(n) → constant, then f(n) is Θ(g(n)).
  • if lim f(n)/g(n) → ∞, then f(n) is in Ω(g(n)).

Determine the proper relationship between the following pair of functions.

\begin{eqnarray} f(n) &=& 10 \\ g(n) &=& \log 10 \end{eqnarray}

  • if lim f(n)/g(n) → 0, then f(n) is in O(g(n)).
  • if lim f(n)/g(n) → constant, then f(n) is Θ(g(n)).
  • if lim f(n)/g(n) → ∞, then f(n) is in Ω(g(n)).

Determine the proper relationship between the following pair of functions.

\begin{eqnarray} f(n) &=& 2^n \\ g(n) &=& 10 n^2 \end{eqnarray}

  • if lim f(n)/g(n) → 0, then f(n) is in O(g(n)).
  • if lim f(n)/g(n) → constant, then f(n) is Θ(g(n)).
  • if lim f(n)/g(n) → ∞, then f(n) is in Ω(g(n)).

Determine the proper relationship between the following pair of functions.

\begin{eqnarray} f(n) &=& 2^n \\ g(n) &=& n \log n \end{eqnarray}

  • if lim f(n)/g(n) → 0, then f(n) is in O(g(n)).
  • if lim f(n)/g(n) → constant, then f(n) is Θ(g(n)).
  • if lim f(n)/g(n) → ∞, then f(n) is in Ω(g(n)).

Determine the proper relationship between the following pair of functions.

\begin{eqnarray} f(n) &=& 2^n \\ g(n) &=& 3^n \end{eqnarray}

  • if lim f(n)/g(n) → 0, then f(n) is in O(g(n)).
  • if lim f(n)/g(n) → constant, then f(n) is Θ(g(n)).
  • if lim f(n)/g(n) → ∞, then f(n) is in Ω(g(n)).

Determine the proper relationship between the following pair of functions.

\begin{eqnarray} f(n) &=& 2^n \\ g(n) &=& n^n \end{eqnarray}

  • if lim f(n)/g(n) → 0, then f(n) is in O(g(n)).
  • if lim f(n)/g(n) → constant, then f(n) is Θ(g(n)).
  • if lim f(n)/g(n) → ∞, then f(n) is in Ω(g(n)).

Which of these is the best lower bound for a growth rate of 5n + 3?

  • The simplifying rules tell us that we can drop constants and lower order terms from a polynomial that defines the growth rate.

For what value of k is √n = Θ(n^k)? (Answer in decimals, not fractions)

  • √n = n^0.5

If we know that algorithm X is Θ(n) in the average case, then what can we say about its TIGHTEST upper bound?

  • What is the definition of Θ(n)?
  • Being Θ(n) means that we know a tight bound (both the upper and the lower bounds match).
  • If X is Θ(n), then it is both O(n) and Ω(n).