[Math] Master theorem with $f(n) = n\log(\log n)$

algorithms, computational-complexity, recurrence-relations

I have a question related to algorithm time complexity and master theorem.

How to solve this master theorem $T(n) = 2T(n/2) + n\cdot \log(\log(n))$?

We have 3 cases (here $a = 2$, $b = 2$, so $n^{\log_b a} = n$):

  • Case 1: $f(n) = O(n^{1-\epsilon})$ for some $\epsilon > 0$, giving $T(n) = \Theta(n)$.
  • Case 2: $f(n) = \Theta(n\log^k n)$ for some $k \ge 0$, giving $T(n) = \Theta(n\log^{k+1} n)$.
  • Case 3: $f(n) = \Omega(n^{1+\epsilon})$ for some $\epsilon > 0$ (plus the regularity condition), giving $T(n) = \Theta(f(n))$.

I don't know which one to use and why.

Best Answer

This recurrence cannot be solved with the Master Theorem, because $f(n)=n\log \log n$ does not fall into any of the three cases.

  • Is $f(n)=O(n^{1-\epsilon})$? No: $n\log \log n$ grows faster than $n$, and hence faster than $n^{1-\epsilon}$ for every $\epsilon>0$.
  • Is $f(n)=\Theta(n\cdot\log^k n)$? No: for $k=0$ this would require $f(n)=\Theta(n)$, but $n\log\log n$ grows faster than $n$; for $k\ge 1$, $n\log\log n$ grows slower than $n\log^k n$. No value of $k$ works.
  • Is $f(n)=\Omega(n^{1+\epsilon})$? No: $f(n)=o(n\log n)$, and $n\log n=o(n^{1+\epsilon})$ for every $\epsilon>0$.
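A quick numerical check (an illustration, not a proof) of these comparisons: the first ratio below, $f(n)/n = \log\log n$, keeps growing, while the second, $f(n)/(n\log n) = \log\log n/\log n$, shrinks toward $0$, so $f$ sits strictly between $n$ and $n\log n$.

```python
import math

# f(n) = n * log(log n) compared against n and n * log n.
# f(n)/n grows without bound; f(n)/(n log n) tends to 0.
for n in [2**10, 2**20, 2**40, 2**80]:
    f = n * math.log2(math.log2(n))
    print(f"n=2^{n.bit_length()-1}:  f/n = {f/n:.3f},  f/(n log n) = {f/(n*math.log2(n)):.4f}")
```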

This isn't a fully rigorous proof, but it is the logic behind why the Master Theorem doesn't apply here. In general, the Master Theorem only covers driving functions that are polynomial, or polynomial times polylogarithmic, and $n\cdot\log\log n$ is neither of those.
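One can also evaluate the recurrence directly for powers of two (a sketch, assuming the base case $T(1)=0$, which the question leaves unspecified). If case 2 with $k=0$ applied, $T(n)/(n\log n)$ would approach a constant; instead the ratio keeps growing, consistent with the recursion-tree bound $T(n)=\Theta(n\log n\log\log n)$.

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # Exact evaluation of T(n) = 2*T(n/2) + n*log2(log2(n)) for n a power of 2,
    # with the assumed base case T(1) = 0.
    if n <= 1:
        return 0.0
    return 2 * T(n // 2) + n * math.log2(math.log2(n))

for k in (10, 15, 20):
    n = 2**k
    # If the Master Theorem's case 2 (k=0) held, this ratio would level off.
    print(f"n=2^{k}:  T(n)/(n log n) = {T(n) / (n * k):.4f}")
```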
