[Math] Taking the log of both sides to determine big Theta/Omega/O

asymptotics, computer-science, logarithms

I've managed to confuse myself over this detail:

Obviously: $n^2 \notin \Theta(n)$

Now, $n \in \Theta(n^2)$ would require $n^2 \leq cn$ for some constant $c$ and all large $n$. If we take the $\log$ of both sides, we get:

$$\log(n^2) \leq \log(cn)$$

$$2\log(n) \leq \log(c) + \log(n)$$

Suddenly this inequality looks satisfiable: it seems you could find a $c$ that satisfies the $\Theta$ definition. Obviously, I know this is not the case. My question is: when can you take the log of both sides to prove an $O$/$\Omega$/$\Theta$ bound? What about for $2^n \in \Theta(n!)$?
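A quick numerical sketch of why the log inequality misleads: the ratio of the logs stays bounded even though the ratio of the original functions does not.

```python
import math

# The log ratio log(n^2)/log(n) is bounded (it is exactly 2),
# yet the original ratio n^2/n = n is unbounded: taking logs
# collapses polynomial gaps, so a bounded log-ratio says nothing
# about the original functions.
for n in [10, 100, 1000, 10**6]:
    log_ratio = math.log(n**2) / math.log(n)  # always 2
    raw_ratio = n**2 / n                      # equals n, unbounded
    print(n, log_ratio, raw_ratio)
```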

Best Answer

As gmath shows, taking the logarithm of both sides never helps in proving asymptotic estimates. More explicitly, you cannot deduce (say) $f = O(g)$ from $\log f = O(\log g)$. Stated differently, $f = O(g)$ does not imply $e^f = O(e^g)$. Even more simply, it is not the case that $e^{O(x)} = O(e^x)$: for instance, $2x = O(x)$, yet $e^{2x} = (e^x)^2 \neq O(e^x)$.
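The failure of $e^{O(x)} = O(e^x)$ is easy to check numerically; a minimal sketch using the witness $f(x) = 2x$:

```python
import math

# f(x) = 2x is O(x) (witness c = 2), but exp(2x)/exp(x) = e^x is
# unbounded, so exp(2x) is NOT O(exp(x)): exponentiation does not
# preserve O-relationships.
for x in [1, 10, 50]:
    bounded = (2 * x) / x                      # always 2
    unbounded = math.exp(2 * x) / math.exp(x)  # = e^x, blows up
    print(x, bounded, unbounded)
```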

Which transformations are we allowed to use? If $\phi(O(x)) = O(\phi(x))$ then we can deduce $f = O(g)$ from $\phi^{-1}(f) = O(\phi^{-1}(g))$, since $$f = \phi(\phi^{-1}(f)) = \phi(O(\phi^{-1}(g))) = O(\phi(\phi^{-1}(g))) = O(g).$$ As an example, $\log(O(x)) = O(\log x)$ (here $\phi = \log$ and $\phi^{-1} = \exp$), and so we can deduce $f = O(g)$ from $e^f = O(e^g)$.
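The safe direction can also be checked numerically. In this sketch I pick, as an illustrative assumption, $e^f = 5 \cdot 2^n$ and $e^g = 2^n$: then $e^f = O(e^g)$ with witness $c = 5$, and taking logs preserves the bound.

```python
import math

# exp(f) = 5 * 2**n is O(exp(g)) for exp(g) = 2**n (witness c = 5).
# Taking logs: f/g = (log 5 + n log 2) / (n log 2) -> 1,
# so f = O(g), as the phi = log argument above predicts.
for n in [10, 100, 1000]:
    f = math.log(5 * 2**n)  # = log 5 + n log 2
    g = math.log(2**n)      # = n log 2
    print(n, f / g)         # bounded, tends to 1
```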