[Math] What do I get by adding $\Theta(n)$ and $O(n)$

algorithms, asymptotics

I know that $f(n) = O(n)$ means that $n$ is an asymptotic upper bound of $f(n)$, and that $f(n) = \Theta(n)$ means that $n$ is an asymptotically tight bound of $f(n)$. Still, I'm wondering whether I am allowed to say $\Theta(n) + O(n) = \Theta(n) = O(n)$?

The $O$- and $\Theta$-notations are defined as:

$\Theta(g(n))$ = { $f(n)$: there exist positive constants $c_1$, $c_2$ and $n_0$ such that $0 \leq c_1 g(n) \leq f(n) \leq c_2 g(n)$ for all $n \geq n_0$ }

$O(g(n))$ = { $f(n)$: there exist positive constants $c$ and $n_0$ such that $0 \leq f(n) \leq c g(n)$ for all $n \geq n_0$ }

(this is in the context of analyzing the running time of an algorithm)
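These definitions can be checked numerically for a concrete function. The sketch below is only a sanity check over a finite range, not a proof; the function $f$ and the witnesses $c_1 = 3$, $c_2 = 4$, $n_0 = 10$ are chosen here purely for illustration.

```python
# Sanity-check the Theta definition for a concrete example:
# claim f(n) = 3n + 10 is in Theta(n), witnessed by c1 = 3, c2 = 4, n0 = 10.

def f(n):
    return 3 * n + 10

c1, c2, n0 = 3, 4, 10

# Verify 0 <= c1*n <= f(n) <= c2*n for a range of n >= n0.
# (3n <= 3n + 10 always holds; 3n + 10 <= 4n holds exactly when n >= 10.)
for n in range(n0, 10_000):
    assert 0 <= c1 * n <= f(n) <= c2 * n
```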

Best Answer

As long as you assume that everything is positive (which will usually be the case in context of algorithm analysis) then clearly from $f(n) \in O(n)$ and $g(n) \in \Theta(n)$ it follows that $f(n) + g(n) = \Theta(n)$.

That the class $O(n)$ is closed under addition, and that it contains the class $\Theta(n) = O(n) \cap \Omega(n)$, is hopefully quite clear. In other words, since $g(n) = O(n)$, you know that $f(n) + g(n) = O(n)$.

On the other hand, if $f(n) \geq 0$ then $f(n) + g(n) \geq g(n)$, so from $g(n) = \Omega(n)$ you get $f(n) + g(n) = \Omega(n)$. Since now you have both the upper bound and the lower bound, you can conclude that $f(n) + g(n) = \Theta(n)$. If you don't assume $f(n) \geq 0$, then you only get the upper bound $f(n) + g(n) = O(n)$.
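The sandwich argument above can be illustrated numerically. In this sketch (the particular $f$, $g$, and constants are chosen only for illustration), $g(n) = 2n + 5$ is $\Theta(n)$, while $f(n) = \tfrac{n}{2}(1 + \sin n)$ is nonnegative and $O(n)$ but not $\Theta(n)$, since it dips arbitrarily close to $0$; their sum nevertheless stays sandwiched between constant multiples of $n$.

```python
import math

def g(n):
    # Theta(n): 2n <= g(n) <= 2.5n for n >= 10
    return 2 * n + 5

def f(n):
    # O(n) and nonnegative, but not Theta(n): dips arbitrarily close to 0
    return (n / 2) * (1 + math.sin(n))

# Sandwich (f + g)(n) / n between constants c1 = 2 and c2 = 3.5 for n >= n0 = 10:
# lower bound from f(n) >= 0 and g(n) >= 2n; upper from f(n) <= n, g(n) <= 2.5n.
ratios = [(f(n) + g(n)) / n for n in range(10, 10_000)]
assert all(2 <= r <= 3.5 for r in ratios)
```

The lower bound would fail without the nonnegativity assumption on $f$: a term like $f(n) = -2n$ is still $O(n)$ in the absolute-value sense, yet it cancels $g$ and destroys the $\Omega(n)$ bound on the sum.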

Be careful with notations like $O(n) + \Theta(n) = \Theta(n)$: the equals sign here is a one-way "is contained in", read from left to right, not a symmetric equality between sets.
