[Math] Time efficiency of brute force algorithm as a function of number of bits

algorithms, asymptotics, binary

This is homework, so advice on how to solve such a problem is appreciated. The question reads as follows:

What is the time efficiency of the brute-force algorithm for computing $a^n$ as a function of $n$? What about as a function of the number of bits in the binary representation of $n$?

I figured out the first part: the basic operation being performed is simply the multiplication of $a$'s together, and it is done $n-1$ times, so the efficiency is
$n-1\in\Theta(n)$.
But can someone shed some light on the second portion?
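
For concreteness, here is a rough Python sketch of the brute-force algorithm I mean (the function name and the multiplication counter are my own illustration):

```python
def brute_force_power(a, n):
    """Compute a**n (n >= 1) by repeated multiplication, counting multiplications."""
    result = a
    multiplications = 0
    for _ in range(n - 1):          # one multiplication per remaining factor of a
        result *= a
        multiplications += 1
    return result, multiplications

print(brute_force_power(2, 10))     # (1024, 9): n - 1 multiplications, i.e. Theta(n)
```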

Best Answer

Multiplication at the bit level can be as easy as shifting bits, which is $O(m)$; this is the case when $a=2^k$. (Recall from bit-level arithmetic that a left shift is equivalent to multiplying by $2$; $m$ here is the number of bits.) Otherwise, plain shift-and-add multiplication is $O(m^2)$.
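
To make the two cases concrete, here is a rough Python sketch of schoolbook shift-and-add multiplication (illustrative only, not a production big-integer routine):

```python
def shift_and_add_multiply(x, y):
    """Multiply non-negative integers by adding shifted copies of x, one per set bit of y."""
    product = 0
    shift = 0
    while y:
        if y & 1:
            product += x << shift   # add x * 2**shift
        y >>= 1
        shift += 1
    return product

print(shift_and_add_multiply(13, 11))  # 143: up to m additions of m-bit numbers -> O(m^2)
print(5 << 3)                          # 40: multiplying by 2^3 is a single shift -> O(m)
```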

If you compute $a^n$ as plain repeated multiplication $a^i = a^{i-1} \times a$ over the $n-1$ steps it takes, then the complexity becomes $O(m^3)$ in the worst case and $O(m^2)$ in the best case, namely when $a=2^k$.
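
Putting the pieces together, a self-contained sketch (again my own illustration) of the brute-force power computation in which every one of the $n-1$ steps is a bit-level shift-and-add multiplication:

```python
def power_by_shift_and_add(a, n):
    """Compute a**n (n >= 1) with n - 1 shift-and-add multiplications."""
    def multiply(x, y):
        # Schoolbook shift-and-add; a single shift suffices when y is a power of two.
        product, shift = 0, 0
        while y:
            if y & 1:
                product += x << shift
            y >>= 1
            shift += 1
        return product

    result = a
    for _ in range(n - 1):   # the n - 1 repeated-multiplication steps
        result = multiply(result, a)
    return result

print(power_by_shift_and_add(3, 5))  # 243
```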
