[Math] Fewest operations on an algorithm

algorithms, computer science, discrete mathematics, logic

I'm stumped on this problem and was wondering if anyone could lend me a hand.

Suppose that you have two different algorithms for solving a problem. To solve a problem of size $n$, the first algorithm uses exactly $(n^2)(2^n)$ operations and the second algorithm uses exactly $n!$ operations. As $n$ grows, which algorithm uses fewer operations?

Best Answer

For $n\ge 8$ we have $n! \ge n^2 2^n$. (The base case checks out: $8! = 40320 \ge 8^2\cdot 2^8 = 16384$; smaller values fail, e.g. $3! = 6 < 72 = 3^2\cdot 2^3$.) So for large $n$ the first algorithm, using $n^2 2^n$ operations, uses fewer operations.

Indeed, let's try a proof by induction: $$ (n+1)! = (n+1)n! \ge (n+1) n^2 2^n $$ When do we have $(n+1) n^2 2^n \ge (n+1)^2 2^{n+1}$ so that we can complete the induction step?

This simplifies to $n^2 \ge 2(n+1)$, which holds iff $n\ge 3$, so the induction step goes through for every $n \ge 3$; only the base case limits where the inequality starts to hold.
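A quick numerical check (a sketch, not part of the original answer; the function names are my own) confirms where $n!$ first overtakes $n^2 2^n$:

```python
import math

def first_alg_ops(n):
    # operations used by the first algorithm: n^2 * 2^n
    return n * n * 2 ** n

def second_alg_ops(n):
    # operations used by the second algorithm: n!
    return math.factorial(n)

# find the smallest n at which n! >= n^2 * 2^n
for n in range(1, 20):
    if second_alg_ops(n) >= first_alg_ops(n):
        print(n)  # prints 8
        break
```

At $n = 7$ we still have $5040 < 6272$, but at $n = 8$ we get $40320 \ge 16384$, matching the base case above.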
