Proof: Time complexity of deterministic vs nondeterministic Turing machines

asymptotics, computational-complexity, inequality, turing-machines

I am stuck trying to understand a proof in my book of why, given a nondeterministic single-tape Turing machine $N$ that runs in time $t(n) \geq n$, there is a deterministic single-tape Turing machine $D$ simulating it that runs in time $2^{O(t(n))}$.

I do understand the idea of the proof: $D$ simulates $N$ by doing a breadth-first search of $N$'s computation tree, starting from the root node. Every node of the tree has at most $b$ children, where $b$ is the maximum number of choices allowed by $N$'s transition function, and every branch of the tree has length at most $t(n)$. So the running time of $D$ is $O(t(n)\,b^{t(n)})$, because it has to explore at most $b^{t(n)}$ branches of length at most $t(n)$ each.
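
To check my understanding of the search itself, here is a minimal Python sketch of that breadth-first search over an abstract computation tree. The integer configurations, the branching function `delta`, and the acceptance test are hypothetical stand-ins; a real simulator would encode $N$'s tape contents, state, and head position in each configuration.

```python
from collections import deque

def bfs_simulate(start, delta, accepting, t):
    """Explore the computation tree down to depth t, breadth first.

    Visits O(b^t) nodes; together with the O(t) cost of handling each
    branch this gives the O(t * b^t) = 2^(O(t)) bound from the proof.
    """
    visited = 0
    queue = deque([(start, 0)])          # (configuration, depth) pairs
    while queue:
        config, depth = queue.popleft()
        visited += 1
        if accepting(config):            # some branch accepts, so N accepts
            return True, visited
        if depth < t:                    # no branch is longer than t
            queue.extend((nxt, depth + 1) for nxt in delta(config))
    return False, visited                # every branch rejected or timed out

# Toy run: branching factor b = 2, nothing accepts, depth bound t = 10.
ok, n = bfs_simulate(0, lambda c: (2*c + 1, 2*c + 2), lambda c: False, 10)
print(ok, n)   # False 2047, i.e. 2^11 - 1 nodes visited: exponential in t
```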

So far so good. Then the book just says $O(t(n)b^{t(n)}) = 2^{O(t(n))}$ with no further explanation, and this concludes the proof. Where does this equation come from? Where does the $2$ come from? Thank you!

Best Answer

For any fixed $b$, we have $b \le 2^k$ for some integer $k$, and $t(n) \le 2^{t(n)}$ for every $n$ (since $t < 2^t$ for all $t \ge 0$). So for such a $k$ we have $$ t(n)\, b^{t(n)} \le 2^{t(n)} \left(2^k\right)^{t(n)} = 2^{t(n) + k\,t(n)} = 2^{(k+1)\,t(n)} = 2^{O(t(n))}, $$ where the last step uses that $(k+1)\,t(n) = O(t(n))$ because $k$ is a constant that does not depend on $n$.
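
To make the constant concrete (an illustrative instance with numbers of my own choosing, not from the book): with $b = 3$, take $k = 2$, since $3 \le 4 = 2^2$; then $$ t(n)\,3^{t(n)} \le 2^{t(n)} \cdot 4^{t(n)} = 2^{3\,t(n)}, $$ so the constant hidden in the exponent is just $k + 1 = 3$. This is also where the $2$ comes from: there is nothing special about it, but any fixed base $b > 1$ can be traded for base $2$ at the cost of a constant factor in the exponent, and the $O(\cdot)$ in the exponent absorbs exactly that factor.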
