[Math] Master Theorem $T(n) = 4T(n/2) + \lg n$

algorithms, computer science

In class today, we did the following problem: $T(n)=4T(n/2) + \lg n$

So by notation in CLRS, we have $a = 4$, $b = 2$, $f(n) = \lg n$. Thus,
$n^{\log_b a} = n^2$. My algorithms lecturer claimed that
it doesn't fit Case 1 of the Master Theorem because "$\lg n$ is not polynomially smaller than $n^2$".

By Case 1, I mean the one described in CLRS, which is
similar to the one described in the wiki page: "If $f(n) = O(n^{\log_b a - \epsilon})$ for some constant $\epsilon > 0$, then
$T(n) = \Theta(n^{\log_b a})$."

However, from what I can see, $\lg n = O(n^{2 - \epsilon})$ when $\epsilon = 1$. Doesn't that mean that it fits Case 1, then?
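To spell that bound out (my own one-line check, using $\log_b a = 2$ and the fact that $\lg n \le n$ for all $n \ge 1$):

$$\lg n \;\le\; n \;=\; n^{2-1}, \qquad\text{so}\qquad \lg n = O\!\left(n^{\log_b a - \epsilon}\right) \text{ with } \epsilon = 1.$$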

So my question is:

  • If I'm wrong, what did I miss?
  • If I'm right, what is an example that doesn't fit Case 1?

EDIT

Here's the statement of the Master Theorem copied from the book:
Let $a \geq 1$ and $b > 1$ be constants, let $f(n)$ be a function, and let $T(n)$ be defined on the nonnegative integers by the recurrence

$T(n) = aT(n/b) + f(n)$

where we interpret $n/b$ to mean either $\lfloor n/b \rfloor$ or
$\lceil n/b \rceil$. Then $T(n)$ has the following asymptotic bounds:

  1. If $f(n) = O(n^{\log_b a - \epsilon})$ for some constant $\epsilon > 0$, then
    $T(n) = \Theta(n^{\log_b a})$.

  2. If $f(n) = \Theta(n^{\log_b a})$, then $T(n) = \Theta(n^{\log_b a} \lg n)$.

  3. If $f(n) = \Omega(n^{\log_b a + \epsilon})$ for some constant $\epsilon > 0$, and if $a f(n/b) \leq c f(n)$ for some constant $c < 1$ and all sufficiently large $n$, then $T(n) = \Theta(f(n))$.

Best Answer

Yes, you are correct. Case 1 applies, and the solution is $T(n) = \Theta(n^2)$.
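As a quick numerical sanity check (not part of the original answer; the function name `T` and the base case $T(1) = 1$ are arbitrary choices), one can evaluate the recurrence for powers of two and watch $T(n)/n^2$ settle toward a constant, which is exactly what a $\Theta(n^2)$ bound predicts:

```python
from functools import lru_cache
from math import log2

# Illustrative check: evaluate T(n) = 4*T(n//2) + lg n with T(1) = 1
# and print T(n)/n^2, which should approach a constant if T(n) = Theta(n^2).

@lru_cache(maxsize=None)
def T(n: int) -> float:
    if n <= 1:
        return 1.0              # arbitrary positive base case
    return 4 * T(n // 2) + log2(n)

for k in range(1, 21):
    n = 2 ** k
    print(f"n = 2^{k:<2}  T(n)/n^2 = {T(n) / n ** 2:.6f}")
```

With this base case the ratio levels off near $1 + 4/9 \approx 1.44$, consistent with Case 1.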

Now consider $T(n) = 4T(n/2) + n^2$.

Here, Case 1 does not apply because $n^2$ is not $O(n^{2-\epsilon})$ for any positive $\epsilon$. (But Case 2 applies.)
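To spell that out (a quick check, using $\log_2 4 = 2$):

$$\frac{n^2}{n^{2-\epsilon}} = n^{\epsilon} \to \infty \quad \text{for every fixed } \epsilon > 0,$$

so $n^2$ grows faster than every $n^{2-\epsilon}$, while $n^2 = \Theta(n^{\log_2 4})$, and Case 2 gives $T(n) = \Theta(n^{\log_2 4} \lg n) = \Theta(n^2 \lg n)$.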