Prove that $n$ is $O(2^n)$ and show that $\log n$ is $O(n)$.

algorithms, asymptotics, logarithms

$n < 2^n$ whenever $n$ is a positive integer. Show that this inequality implies that $n$ is $O(2^n)$, and use it to show that $\log n$ is $O(n)$.
There is a book solution, but I am having a hard time understanding it in layman's terms.

Solution: Using the inequality $n < 2^n$, we can quickly conclude that $n$ is $O(2^n)$ by taking $k = C = 1$ as witnesses. Note that because the logarithm function is increasing, taking logarithms (base 2) of both sides of this inequality shows that
$$\log n < n.$$
It follows that $\log n$ is $O(n)$. (Again we take $C = k = 1$ as witnesses.)
If we have logarithms to a base $b$, where $b$ is different from 2, we still have $\log_b n$ is $O(n)$ because

$$\log_b n = \frac{\log n}{\log b} < \frac{n}{\log b}$$

whenever $n$ is a positive integer. We take $C = 1/\log b$ and $k = 1$ as witnesses.

So, how do we conclude that $n$ is $O(2^n)$? The solution uses log rules, but how do we know the logarithm is increasing? This is where I start to get lost, and it makes it hard to piece together the rest of the solution. Please help.

Best Answer

Recall the definition of $f(n) \in O(g(n))$: there exist constants $k$ and $C > 0$ such that, for all $n$,

$$n \ge k \implies |f(n)| \le C|g(n)|.$$

We have $n < 2^n$, as in the question, for $n \ge 1$. Therefore, taking $k = 1$ and $C = 1$, we have $$n \ge 1 \implies |n| = n \le 2^n = |2^n|$$ so $n \in O(2^n)$.
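
If it helps to see the definition in action, here is a minimal numerical sanity check (it is only a finite check over a range of $n$, not a proof, and the helper name `holds_for_witnesses` is just something I made up for illustration) that the witnesses $C = k = 1$ work for both pairs of functions discussed in the solution:

```python
import math

# Finite sanity check (not a proof): verify |f(n)| <= C * |g(n)|
# for every integer n with k <= n <= n_max.
def holds_for_witnesses(f, g, C, k, n_max=60):
    return all(abs(f(n)) <= C * abs(g(n)) for n in range(k, n_max + 1))

# f(n) = n versus g(n) = 2^n, with witnesses C = 1, k = 1
print(holds_for_witnesses(lambda n: n, lambda n: 2 ** n, C=1, k=1))        # True

# f(n) = log2(n) versus g(n) = n, again with C = 1, k = 1
print(holds_for_witnesses(lambda n: math.log2(n), lambda n: n, C=1, k=1))  # True
```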

But you're right that $n \in O(n)$ too. These are not mutually exclusive: we have $$n \ge 1 \implies |n| \le |n|,$$ which shows $n \in O(n)$. In fact, if $f(n) \in O(g(n))$ and $g(n) \in O(h(n))$, then it's a good exercise to show that $f(n) \in O(h(n))$.
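
A sketch of that exercise, in case it's useful (my own argument, not the book's): if $|f(n)| \le C_1 |g(n)|$ for $n \ge k_1$ and $|g(n)| \le C_2 |h(n)|$ for $n \ge k_2$, then for all $n \ge \max(k_1, k_2)$ we have $$|f(n)| \le C_1 |g(n)| \le C_1 C_2 |h(n)|,$$ so $C = C_1 C_2$ and $k = \max(k_1, k_2)$ are witnesses for $f(n) \in O(h(n))$.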

You said, in the comments, that your understanding of $n < 2^n$ is that at some point $k$, $2^n$ will grow faster than $n$. It's not entirely clear what you mean here, but I wouldn't sweat it too much. The point is, $2^n$ starts bigger than $n$ and stays that way forever afterwards, which fulfils the requirement for $n \in O(2^n)$.


Now, consider the case where $b \neq 2$. This is one situation where I think the book needs a little more detail. In particular, I think we should also assume $b > 1$. The case $0 < b < 1$ should be handled separately (and for $b \le 0$ or $b = 1$ the logarithm isn't defined at all). I will assume $b > 1$.

Since $\log_2 n < n$ for all $n \ge 1$, we can use the change of base formula. Recall that $$\log_b n = \frac{\log_2 n}{\log_2 b}.$$ (If you don't accept this formula, try looking it up online.) But, since $b > 1$, it follows that $\frac1{\log_2 b} > 0$, so $$\log_2 n < n \implies \log_b n = \frac{\log_2 n}{\log_2 b} < \frac{1}{\log_2 b} n.$$ This proves that $\log_b n \in O(n)$, using $C = \frac{1}{\log_2 b}$ and $k = 1$ as witnesses.
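
For a concrete instance (my own example, not from the book), take $b = 10$. Then $\log_2 10 \approx 3.32$, so $C = \frac{1}{\log_2 10} \approx 0.30$, and the inequality says $\log_{10} n < 0.30\,n$ (approximately) for all $n \ge 1$; at $n = 100$, for example, $\log_{10} 100 = 2$ while $0.30 \cdot 100 = 30$.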
