Here are suggested solutions to the exercises about complexity, parts B and C.
(We assume that the dynamic array does not resize automatically after deletion, which is in fact how ArrayList behaves in Java.)
All cases have the same hypotheses. These give us a recurrence equation for the step count T(n) of each function; solving the recurrences, we have:
g1. By induction, we see that T(n) = n, so T(n) ∈ O(n) (linear complexity). The example is a recursive presentation of f1 above.
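The source of g1 is not reproduced here, but a hypothetical Python skeleton with the matching step count (assuming the recurrence T(0) = 0, T(n) = T(n − 1) + 1) could look like this:

```python
def g1_steps(n):
    """Hypothetical skeleton matching g1: constant work, then recurse on n - 1.
    The return value is the step count T(n)."""
    if n == 0:
        return 0
    return 1 + g1_steps(n - 1)  # T(n) = T(n - 1) + 1

# T(n) = n, as claimed in the solution:
assert all(g1_steps(n) == n for n in range(100))
```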
g2. We have T(0) = 0 and T(n + 1) = T(n) + (n + 1). An exact solution is T(n) = n(n+1)/2 (proved by induction), so T(n) ∈ O(n²) (quadratic complexity). Alternatively, one can reason T(n) = 1 + 2 + … + n ≤ n ∙ n = n², so T(n) ∈ O(n²). This is sharp because T(n) = 1 + 2 + … + n ≥ ⌈(n+1)/2⌉ + … + n ≥ n/2 ∙ n/2 = n²/4.
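A hypothetical sketch of the step count, assuming the recurrence behind g2 is T(n) = n + T(n − 1) with T(0) = 0 (which matches the closed form n(n+1)/2):

```python
def g2_steps(n):
    """Step count for a hypothetical quadratic skeleton:
    a linear pass of n units of work, then a recursive call on n - 1."""
    if n == 0:
        return 0
    return n + g2_steps(n - 1)  # T(n) = n + T(n - 1)

# The closed form n(n+1)/2 checks out on small inputs:
assert all(g2_steps(n) == n * (n + 1) // 2 for n in range(50))
```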
g3. This is the skeleton of binary search. The function computes T(n) = ⌊log₂(n)⌋ (how many times do we have to divide n by 2 until the result is less than 2?). Thus, T(n) ∈ O(log(n)) (logarithmic complexity).
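A hypothetical sketch of the halving pattern (not the original g3, just a loop with the same step count):

```python
import math

def g3_steps(n):
    """Hypothetical binary-search-shaped skeleton: constant work per step,
    halving n until it drops below 2. Returns the step count T(n)."""
    steps = 0
    while n >= 2:
        n //= 2
        steps += 1
    return steps

# T(n) = floor(log2(n)), as claimed:
assert all(g3_steps(n) == math.floor(math.log2(n)) for n in range(1, 1024))
```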
g4. This is the skeleton of quick select. We have T(1) = 1 and T(n) = n + T(n/2) for n > 1. For n = 2ᵐ, we have T(2ᵐ) = 2ᵐ + T(2ᵐ⁻¹) = 2ᵐ + 2ᵐ⁻¹ + T(2ᵐ⁻²) = … = 2ᵐ + … + 2 + T(1) = 2ᵐ⁺¹ − 1 (by induction or the formula for geometric sums), that is, T(n) = 2n − 1. For the general case, let n′ be the largest power of 2 bounded by n. T is increasing, so we have T(n′) ≤ T(n) ≤ T(2n′); since n/2 < n′ ≤ n, this gives n − 1 ≤ T(n) ≤ 4n, which gives T(n) ∈ O(n) (linear complexity), and the bound is tight.
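The closed form on powers of two can be sanity-checked with a hypothetical Python version of the recurrence (only the step count, not the original g4):

```python
def g4_steps(n):
    """Step count for a quick-select-shaped skeleton: a linear pass of
    n units of work, then recursion on one half."""
    if n == 1:
        return 1
    return n + g4_steps(n // 2)  # T(n) = n + T(n/2)

# On powers of two, T(n) = 2n - 1:
assert all(g4_steps(2 ** m) == 2 * 2 ** m - 1 for m in range(20))
```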
g5. We have T(0) = 1 and T(n + 1) = 2 T(n), so by induction we have T(n) = 2ⁿ ∈ O(2ⁿ) (exponential complexity with base 2).
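A hypothetical skeleton with this step count makes two recursive calls on n − 1 (the shape is an assumption; only the recurrence is taken from the solution):

```python
def g5_steps(n):
    """Step count for a skeleton that makes two recursive calls on n - 1,
    with T(0) = 1."""
    if n == 0:
        return 1
    return 2 * g5_steps(n - 1)  # T(n) = 2 T(n - 1)

# T(n) = 2**n, as claimed:
assert all(g5_steps(n) == 2 ** n for n in range(15))
```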
g6. This is the skeleton of merge sort. We have T(1) = 1 and T(n) = n + 2 T(n/2) for n > 1. For n = 2ᵐ, we have T(2ᵐ) = 2ᵐ + 2 T(2ᵐ⁻¹) = 2ᵐ + 2ᵐ + 4 T(2ᵐ⁻²) = … = 2ᵐ + … + 2ᵐ + 2ᵐ T(1) = 2ᵐ ∙ (m + 1), that is, T(n) = n (log₂(n) + 1). For the general case, let n′ be the largest power of 2 bounded by n. T is increasing, so we have T(n′) ≤ T(n) ≤ T(2n′), so n/2 (log₂(n/2) + 1) ≤ T(n) ≤ 2n (log₂(2n) + 1), that is, n/2 log₂(n) ≤ T(n) ≤ 2n (log₂(n) + 2), which gives T(n) ∈ O(n log(n)), and the bound is tight.
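Again, the closed form on powers of two can be checked with a hypothetical step-count version of the recurrence:

```python
def g6_steps(n):
    """Step count for a merge-sort-shaped skeleton: n units of linear work
    plus two recursive calls on halves."""
    if n == 1:
        return 1
    return n + 2 * g6_steps(n // 2)  # T(n) = n + 2 T(n/2)

# On powers of two n = 2**m, T(n) = n * (m + 1) = n * (log2(n) + 1):
assert all(g6_steps(2 ** m) == 2 ** m * (m + 1) for m in range(15))
```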
There’s no answer since this wasn’t really a question…
Some of the problems have answers or hints on the website, otherwise you’re on your own :).
The first statement is false. To see this, put f(n) = 2n and g(n) = n. We have 2n ∈ O(n), but not 2²ⁿ ∈ O(2ⁿ). Indeed, if 2²ⁿ ∈ O(2ⁿ) were to hold, there would be C such that 2²ⁿ ≤ C 2ⁿ for large enough n. But since 2²ⁿ = 2ⁿ ∙ 2ⁿ, that would mean 2ⁿ ∙ 2ⁿ ≤ C ∙ 2ⁿ for large enough n. Dividing both sides by 2ⁿ, we get 2ⁿ ≤ C for large enough n, which is false since 2ⁿ grows arbitrarily large.
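The key step can be sanity-checked numerically in Python:

```python
# With f(n) = 2n and g(n) = n, the ratio 2**f(n) / 2**g(n) equals 2**n,
# which is unbounded, so no constant C can satisfy 2**(2n) <= C * 2**n.
ratios = [2 ** (2 * n) // 2 ** n for n in range(10)]
assert ratios == [2 ** n for n in range(10)]
```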
The second statement is true. We are given C and n₀ such that f(n) ≤ C g(n) for n ≥ n₀. Since log is monotone, we have log(f(n)) ≤ log(C g(n)) = log(C) + log(g(n)) ≤ (D + 1) log(g(n)) for n ≥ n₀, where D is chosen so that log(C) ≤ D log(g(n)): if log(C) ≤ 0, we take D = 0; otherwise, we take D = log(C) / log(2), using the hypothesis that g(n) ≥ 2, so that D log(g(n)) ≥ D log(2) = log(C). Thus, log(f(n)) ∈ O(log(g(n))).
Reflexivity (f(n) ∈ O(f(n))) holds trivially: we have f(n) ≤ 1 ∙ f(n) for n ≥ 1. For transitivity, if f(n) is bounded by C g(n) for n ≥ n₀ and g(n) is bounded by D h(n) for n ≥ n₁, then f(n) is bounded by CD h(n) for n ≥ max(n₀, n₁). Thus, the relation defined by O forms a preorder (called the order-of-growth order).
If O(f(n)) ⊆ O(g(n)), then f(n) ∈ O(g(n)) follows from reflexivity: f(n) ∈ O(f(n)) ⊆ O(g(n)). Conversely, if f(n) ∈ O(g(n)), then O(f(n)) ⊆ O(g(n)) follows using the transitivity established above.
Let us construct f, g : ℕ → ℝ>0 such that neither f(n) ∈ O(g(n)) nor g(n) ∈ O(f(n)). There are many possible ideas; one is the following: let f(n) = n for even n and f(n) = 1 for odd n, and symmetrically let g(n) = 1 for even n and g(n) = n for odd n (setting f(0) = 1 so that the values stay positive).
The ratio f(n) / g(n) takes arbitrarily large values at arbitrarily large n. Thus, there cannot be C and n₀ such that f(n) / g(n) ≤ C for n ≥ n₀. This shows that f(n) ∈ O(g(n)) does not hold. An analogous argument shows that g(n) ∈ O(f(n)) does not hold.
From this example, we can build another one whose functions are increasing. The idea is to multiply f and g by the same fast-growing function h, i.e. set f’(n) = h(n) f(n) and g’(n) = h(n) g(n). Then the ratios of f’ and g’ are identical to those of f and g, so neither f’(n) ∈ O(g’(n)) nor g’(n) ∈ O(f’(n)) hold. Picking the factorial function h(n) = n! makes f’ and g’ increasing.
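To make this concrete, here is one hypothetical instantiation in Python: f is large on even inputs and trivial on odd ones, g is the mirror image, and scaling both by n! yields the increasing variant. The specific functions are an illustration of the idea, not necessarily the ones intended by the exercise.

```python
import math

def f(n):
    """Large on even n, trivial on odd n (f(0) set to 1 to stay positive)."""
    return max(n, 1) if n % 2 == 0 else 1

def g(n):
    """The mirror image of f: trivial on even n, large on odd n."""
    return n if n % 2 == 1 else 1

def f_inc(n):
    """Multiplying by n! makes the function increasing without changing ratios."""
    return math.factorial(n) * f(n)

def g_inc(n):
    return math.factorial(n) * g(n)

# Neither ratio stays bounded: f/g blows up on even n, g/f on odd n.
assert f(1000) / g(1000) == 1000
assert g(1001) / f(1001) == 1001
# The factorial-scaled versions are increasing (checked on a prefix):
assert all(f_inc(n) < f_inc(n + 1) for n in range(1, 20))
assert all(g_inc(n) < g_inc(n + 1) for n in range(1, 20))
```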
Assume f(2n) ∈ O(f(n)). That means there are C and n₀ such that f(2n) ≤ C f(n) for n ≥ n₀; for simplicity, assume n₀ = 1 and C ≥ 1 (enlarging C if necessary). By induction on m, we get that f(2ᵐ) ≤ Cᵐ f(1). Since f is increasing, for 2ᵐ⁻¹ < n ≤ 2ᵐ we have f(n) ≤ f(2ᵐ) ≤ Cᵐ f(1) = C ∙ Cᵐ⁻¹ f(1) ≤ C f(1) ∙ C^(log₂ n) = C f(1) ∙ nᵏ, using that C^(log₂ n) = n^(log₂ C).
So f(n) ∈ O(nᵏ) for k = log₂(C).
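The argument yields an explicit constant: f(n) ≤ C f(1) nᵏ with k = log₂(C). A quick numerical check of this bound, using the hypothetical example f(n) = n² (where f(2n) = 4 f(n), so C = 4 and k = 2):

```python
import math

def poly_bound_holds(f, C, n_max=1000):
    """Check the derived bound f(n) <= C * f(1) * n**k with k = log2(C),
    for 1 <= n <= n_max (small float tolerance for the power)."""
    k = math.log2(C)
    return all(f(n) <= C * f(1) * n ** k + 1e-9 for n in range(1, n_max + 1))

# f(n) = n**2 satisfies f(2n) = 4 f(n), so C = 4, k = 2, and f(n) <= 4 n**2.
assert poly_bound_holds(lambda n: n ** 2, 4)
```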