Solving the recurrence relation T(n) = T(n/2) + n with the substitution method

calculus, computational-complexity, recurrence-relations

I am studying the substitution method for finding the asymptotic behavior of a recurrence relation, and I was trying to prove that:
$$
T(n) = \begin{cases}
1, & n = 1 \\
T(\lfloor\frac{n}{2}\rfloor) + n, & n > 1
\end{cases}
$$

is $O(\log(n))$.

I know that the proof fails, but I do not understand why it fails or what I am misreading.

Here is my attempt:
$$
T(n) = T(\frac{n}{2}) + n \le c\log(\frac{n}{2}) + n = c\log(n) - c\log(2) + n = c\log(n) - c + n
$$

Since I'm trying to prove that $c\log(n) - c + n = O(\log(n))$, I have to show that:
$$
c\log(n) - c + n \le c\log(n)
\\
c - n \ge 0
\\
c \ge n
$$

If I am not wrong and the proof succeeded, this result would mean that $T(n) = O(\log(n))$ for all $c \ge n$, so $T(n)$ would have logarithmic complexity, but only for constants $c$ greater than or equal to $n$.
But the whole proof is supposed to fail, and I don't understand why. Where am I wrong?

Best Answer

$T(n)=O(\log(n))$ is not a statement about any particular $n$, but a statement about how the function $T(n)$ grows in the long run.

The condition $c\ge n$ forces $c$ to grow with $n$, so $c$ goes to infinity as $n$ gets large; hence $c$ cannot be a fixed constant. But $T(n)\le c\log(n)$ implies $T(n)=O(\log(n))$ only when $c$ is a constant.

Indeed, from $T(n) = T(\lfloor\frac{n}{2}\rfloor) + n \ge n$, you can easily see that $T(n)$ cannot be bounded by $c\log(n)$ for any fixed $c$ (possibly large, but not growing with $n$).
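
To see this concretely, here is a quick numerical check (a Python sketch added for illustration; the helper `T` simply evaluates the recurrence directly): the ratio $T(n)/\log_2(n)$ keeps growing without bound, so no fixed $c$ can make $T(n) \le c\log(n)$ hold for all $n$.

```python
import math

def T(n: int) -> int:
    """Directly evaluate T(1) = 1, T(n) = T(n // 2) + n for n > 1."""
    if n == 1:
        return 1
    return T(n // 2) + n

# T(n)/log2(n) diverges, so T(n) <= c*log(n) fails for every fixed c.
for k in range(1, 21):
    n = 2 ** k
    print(f"n = {n:>8}  T(n) = {T(n):>8}  T(n)/log2(n) = {T(n) / math.log2(n):>10.1f}")
```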

One way to solve this is to see that the total work is equal to $n + \frac{n}{2} + \frac{n}{4} + \cdots \le n \sum_{i=0}^\infty \frac{1}{2^i} = 2n$. Hence $T(n)\le 2n$ which can also easily be proved by induction (here $2$ is a constant that's independent of $n$.)
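
For completeness, the induction step could go like this (a sketch): assuming $T(\lfloor\frac{n}{2}\rfloor) \le 2\lfloor\frac{n}{2}\rfloor$ as the induction hypothesis,
$$
T(n) = T(\lfloor\tfrac{n}{2}\rfloor) + n \le 2\lfloor\tfrac{n}{2}\rfloor + n \le n + n = 2n,
$$
and the base case $T(1) = 1 \le 2$ holds.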

Now since $T(n)\ge n$ and $T(n)\le 2n$, we actually have $T(n)=\Theta(n)$.
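
As a final sanity check (again a sketch that evaluates the recurrence directly; the bound $10^4$ is an arbitrary test cutoff), the ratio $T(n)/n$ stays trapped in $[1, 2]$, exactly as $T(n)=\Theta(n)$ predicts.

```python
def T(n: int) -> int:
    """Directly evaluate T(1) = 1, T(n) = T(n // 2) + n for n > 1."""
    if n == 1:
        return 1
    return T(n // 2) + n

# n <= T(n) <= 2n for every n checked, so T(n)/n stays in [1, 2].
for n in range(1, 10_001):
    assert n <= T(n) <= 2 * n

# At powers of two, T(2^k) = 2^(k+1) - 1, so the ratio approaches 2.
for k in (1, 5, 10, 20):
    n = 2 ** k
    print(f"n = {n:>8}  T(n)/n = {T(n) / n:.4f}")
```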