[Math] Running time (Big O) of counting in binary

algorithms, asymptotics

What is the total running time of counting from 1 to $n$ in binary if the time needed to add 1 to the current number $i$ is proportional to the number of bits in the binary expansion of $i$ that must change in going from $i$ to $i+1$?

I can't figure out how to do this. If the number $n$ has $m$ bits, then $n \le 2^m - 1$, so the number of bits needed to represent $n$ is $m = \lceil \log_2(n+1) \rceil$. The largest number of bits that can flip on a single increment is all $m$ of them. That's all I got. Where do I go from here?
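(Not a solution, but a way to experiment with the cost model: the bits that change in going from $i$ to $i+1$ are exactly the set bits of $i \oplus (i+1)$, so the per-increment cost and the total cost can be computed directly. A minimal Python sketch; the function names are my own.)

```python
def flips(i: int) -> int:
    # The bits that differ between i and i + 1 are exactly the set bits
    # of i ^ (i + 1): the trailing run of 1s turns into 0s and the next
    # 0 becomes a 1.
    return bin(i ^ (i + 1)).count("1")


def total_cost(n: int) -> int:
    # Total work to count up to n, charging flips(i) for the increment
    # i -> i + 1 (n increments starting from 0).
    return sum(flips(i) for i in range(n))
```

For example, `total_cost(8)` returns 15, just under $2n = 16$.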

Best Answer

The case where $n=2^a$ for some $a$ is a bit easier to analyze. Let's look at the list of binary numbers for $a=3$:

0001
0010
0011
0100
0101
0110
0111
1000

We're interested in the total number of bit flips. The trick is to count not one number at a time but one bit position at a time:

  • The ones bit changes for every number, so it flips $n$ times.
  • The twos bit changes for every second number, so it flips $n/2$ times.
  • The fours bit changes for every fourth number, so it flips $n/4$ times.

And so forth: the total number of flips is $$\sum_{i=0}^a \frac{n}{2^i} = n \sum_{i=0}^a 2^{-i} < 2n,$$ where the last inequality holds because $\sum_{i=0}^a 2^{-i}$ is a partial sum of the infinite geometric series whose sum is $2$.
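To see the per-position tallies concretely, here is a quick Python check for $a = 3$, i.e. $n = 8$ (the variable names are mine):

```python
# Tally flips per bit position for n = 2**a, counting the n increments
# 0 -> 1 -> 2 -> ... -> n.  The bits that change at each step are the
# set bits of i ^ (i + 1).
a = 3
n = 2 ** a

per_bit = [0] * (a + 1)
for i in range(n):
    changed = i ^ (i + 1)
    for j in range(a + 1):
        if (changed >> j) & 1:
            per_bit[j] += 1

print(per_bit)       # [8, 4, 2, 1] -- that is, n, n/2, n/4, ..., 1
print(sum(per_bit))  # 15, which is indeed less than 2n = 16
```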

Now, can you prove there are also $\le 2n$ flips when $n$ is not a power of $2$?
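(While working out the proof, it's easy to check the claim empirically; this is a sanity check, not a proof.)

```python
# Empirical sanity check (not a proof) that the total number of flips
# stays below 2n even when n is not a power of two.
def total_flips(n: int) -> int:
    return sum(bin(i ^ (i + 1)).count("1") for i in range(n))

assert all(total_flips(n) <= 2 * n for n in range(1, 1000))
```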
