Number Theory – Proof That Powers of Two Cannot Be Expressed as Sum of Multiple Consecutive Positive Integers Using Binary Representations

arithmetic, binary, number-theory

In this earlier question, the OP asks for a proof of the statement

Every natural number not of the form $2^k$ for some natural number k can be written as the sum of two or more consecutive positive integers.
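As a quick sanity check, the statement can be verified by brute force for small numbers (a Python sketch I wrote for illustration; it is not part of the linked question):

```python
def is_sum_of_consecutive(n):
    """Return True if n is a sum of two or more consecutive positive integers."""
    for start in range(1, n):
        total = 0
        for k in range(start, n + 1):
            total += k
            if total == n and k > start:  # at least two terms
                return True
            if total > n:
                break
    return False

# The expressible numbers up to 100 should be exactly the non-powers of two.
for n in range(1, 101):
    is_power_of_two = n & (n - 1) == 0  # n > 0 is a power of 2 iff this holds
    assert is_sum_of_consecutive(n) == (not is_power_of_two)
```

The final loop passes for every $n$ up to 100, matching the claimed characterization.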

The proofs in the answers there are all interesting and valid. However, I was curious about why powers of two cannot be formed this way. Since powers of two and addition are both involved, it seems there should be a proof that works by exploring properties of binary addition. That is, we could prove the result by showing that binary addition of multiple consecutive integers, as an algorithm, is incapable of producing a power of two.

Does anyone know of such a proof or how to get started with one? I honestly can't figure out how such a proof would work, but I feel that if one exists it's likely to be extremely interesting.

Thanks!

Best Answer

Well, the sum of $k>1$ consecutive positive integers $(n+1)+(n+2)+\cdots+(n+k)$ (with $n\geq 0$) is $\frac{(n+k)^2+(n+k)}{2}-\frac{n^2+n}{2}=\frac{2kn+k^2+k}{2} = \frac{k}{2}(2n+k+1)$. If we want to get $2^m$ we must have $2^{m+1}=k(2n+k+1)$; in particular $k$ divides a power of two, so $k=2^j$ for some integer $j\geq 1$ (since $k>1$). Substituting gives $2^m = 2^{j-1}(2n+2^j+1) = 2^jn+2^{2j-1}+2^{j-1}$. Now look at the binary expansions: the last $j$ digits of $2^jn$ are $0$, and the last $2j-1\geq j$ digits of $2^{2j-1}$ are $0$, while $2^{j-1}$ has a $1$ in position $j-1$. So the sum has a $1$ in position $j-1$, and if it were a power of two it would have to equal $2^{j-1}$. But the sum is at least $2^{2j-1}\geq 2^j > 2^{j-1}$, a contradiction. Hence $2^m$ cannot be written as the sum of $k>1$ consecutive positive integers.
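The bit pattern in the last step can be made concrete (a small illustration I added, under the same setup: $k=2^j$ terms starting at $n+1$):

```python
def consecutive_sum(n, k):
    # Sum of the k consecutive integers n+1, n+2, ..., n+k.
    return sum(range(n + 1, n + k + 1))

# For k = 2^j the sum equals 2^j*n + 2^(2j-1) + 2^(j-1),
# so bit j-1 is always set, yet the sum exceeds 2^(j-1):
# it can never be a single power of two.
for j in range(1, 5):
    k = 1 << j
    for n in range(0, 8):
        s = consecutive_sum(n, k)
        assert s == (n << j) + (1 << (2 * j - 1)) + (1 << (j - 1))
        assert (s >> (j - 1)) & 1 == 1  # bit j-1 is set
        assert s & (s - 1) != 0         # s is not a power of two
```

For example, with $j=1$ (so $k=2$) and $n=0$ the sum is $1+2=3=2^1+2^0$, whose low bit is set even though the sum is larger than $2^0$.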
