[Math] Algorithm using the cases of Master Theorem

algorithms

I have these solutions and I would like to check whether I am right, please.

The question is:

Prove asymptotic bounds for the following recursion relations. Tighter bounds will receive more marks. You may use the Master Theorem if it applies.

  1. $C(n) = 3 C(n/2) + n.$

    For this I went with case 1 of the Master Theorem, since $f(n) = n = O(n^{\log_2 3 - \epsilon})$, so that

    $$C(n) = \Theta(n^{\log_b a}) = \Theta(n^{\log_2 3}).$$

  2. $G(n) = G(n - 1) + 1/n.$

    In order to apply the Master Theorem I substitute $n = \log m$, so I got

    $$T(\log m) = T(\log(m/2)) + 1/n.$$

    At the end I got $a=1$, $b=2$, $f(n)=1/n$, so I apply the case $f(n) = O(n^{\log_b a - 1})$.

  3. $I(n) = I(n/2) + n/\log(n).$

    I know the difference is not polynomial, so the Master Theorem does not apply. I thought I might use the ratio $f(n)/n^{\log_b a} = (n/\log n)/n^{\log_b a}$, but I could not prove anything from it.

Best Answer

For the first one, $a = 3$, $b = 2$, $f(n) = n$. Since $f(n) = n = O(n^{\log_2 3 - \epsilon})$ for, e.g., $\epsilon = 0.1$, the first case of the Master Theorem applies. The complexity is therefore $\Theta(n^{\log_2 3}) \approx \Theta(n^{1.585})$, so you got the right answer.
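
As a quick numerical sanity check (not part of the proof), we can iterate the recurrence and watch the ratio $C(n)/n^{\log_2 3}$ settle toward a constant; here is a minimal Python sketch, assuming the base case $C(1) = 1$.

```python
import math

def C(n):
    # C(n) = 3*C(n/2) + n, with an assumed base case C(1) = 1.
    if n <= 1:
        return 1
    return 3 * C(n // 2) + n

# If C(n) = Theta(n^(log2 3)), this ratio should approach a constant.
for k in range(5, 25, 5):
    n = 2 ** k
    print(n, C(n) / n ** math.log2(3))
```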

Your work for the second is wrong: you apply the Master Theorem to $T(\log m)$ while its expression still contains the $n$ term. As I answered on SO before your question was closed there, we can see from the first few terms of this recurrence, $G(1) = 1$, $G(2) = 1 + 1/2$, $G(3) = 1 + 1/2 + 1/3$, $\ldots$, that this is just the harmonic series, plus some arbitrary constant. Since the $k$th partial sum of the harmonic series grows as $\Theta(\log k)$, this is what your answer must turn out to be: $G(n) = \Theta(\log n)$. If it were me, I'd simply find a proof of this fact and use that to prove the order of your recurrence.
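
To illustrate (a numerical check, not a proof), here is a minimal Python sketch assuming the base case $G(1) = 1$: the difference $G(n) - \ln n$ should flatten out near a constant (the Euler-Mascheroni constant, roughly $0.5772$) if $G(n) = \Theta(\log n)$.

```python
import math

# G(n) = G(n-1) + 1/n, with an assumed base case G(1) = 1.
g = 1.0
for n in range(2, 10**6 + 1):
    g += 1.0 / n
    if n % 200000 == 0:
        # G(n) - ln(n) should approach ~0.577.
        print(n, g - math.log(n))
```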

For the third one, it's a little tricky. Right away we know that $I(n) < J(n)$, where $J(n) = J(n/2) + n$. Since $J(n) = \Theta(n)$ by the Master Theorem, we can say that $I(n) = O(n)$. We may also gain some insight by observing that $I(n) > K(n)$ for large $n$, where $K(n) = K(n/2) + n^{1-\delta}$ for any small $\delta > 0$; since $K(n) = \Theta(n^{1-\delta})$ by the Master Theorem, we know $I(n) = \Omega(n^{1-\delta})$ for every $\delta > 0$ (remember, any polynomial, even $n^{\delta}$ for tiny $\delta$, grows faster than a logarithm).

Given this, the answer is probably a function of the form $n/\log^k(n)$. We can guess $k = 1$ and see whether we have enough to solve it:

Assume $I(n) \le cn/\log(n)$, with logs base $2$. Then is it the case that $I(2n) = I(n) + 2n/\log(2n) \le cn/\log(n) + 2n/\log(2n) \le 2cn/\log(2n)$? The last inequality needs $2n/\log(2n) \le cn(2/\log(2n) - 1/\log(n))$, i.e. $2 \le c(2 - \log(2n)/\log(n))$; since $\log(2n) = \log(n) + 1$, this is $2 \le c(1 - 1/\log(n))$.

As $n$ increases, the RHS increases asymptotically to $c$; in particular, for $n = 4$, we have $2 \le c(1 - 1/2) = c/2$, and the choice $c = 4$ works (and keeps working for larger $n$, since the RHS only grows).

So we have an even better bound than before, namely, $I(n) = O(n/\log(n))$. We can check whether, possibly, $I(n) = \Omega(n/\log(n))$ as well. If so, we're done. Otherwise, we might be able to keep going on $k$.

Now assume $I(n) \ge cn/\log(n)$. Is $I(2n) = I(n) + 2n/\log(2n) \ge cn/\log(n) + 2n/\log(2n) \ge 2cn/\log(2n)$? By the same algebra as before, the last step reduces to $2 \ge c(2 - \log(2n)/\log(n)) = c(1 - 1/\log(n))$.

Again, as $n$ goes to infinity, the RHS increases to $c$, so this time the inequality must survive in the limit: we need $c \le 2$. The choice $c = 2$ works for every $n$, since $2 \ge 2(1 - 1/\log(n))$ always holds, and we get $I(n) = \Omega(n/\log(n))$.

Since we have $I(n) = O(n/\log(n))$ and $I(n) = \Omega(n/\log(n))$, we know $I(n) = \Theta(n/\log(n))$, and we're done. Just to be clear, we have found that $2n/\log(n) \le I(n) \le 4n/\log(n)$ for $n \ge 4$.
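
As a final numerical sanity check (a sketch, not part of the proof), we can evaluate the recurrence directly and confirm the sandwich $2n/\log(n) \le I(n) \le 4n/\log(n)$; the Python below assumes the base case $I(2) = 2$ and logs base $2$.

```python
import math

def I(n):
    # I(n) = I(n/2) + n/log2(n), with an assumed base case I(2) = 2.
    if n <= 2:
        return 2.0
    return I(n // 2) + n / math.log2(n)

# Check 2n/log2(n) <= I(n) <= 4n/log2(n) on powers of two;
# the ratio I(n)/(n/log2(n)) should settle toward 2 from above.
for k in range(3, 25, 3):
    n = 2 ** k
    ratio = I(n) / (n / math.log2(n))
    print(n, 2.0 <= ratio <= 4.0, round(ratio, 4))
```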