[Math] How is the phrase “$n$-bit number” related to the big $O$ notation

algorithms, asymptotics

When it comes to algorithms, you frequently have to evaluate problems like this:

Let $x$ be an $n$-bit integer. For each of the following questions, give your answer as a function of $n$.

Or a question like this:

[given algorithm] Assume that the subtraction takes $O(n)$ time on an $n$-bit number.

What does "let $x$ be an $n$-bit integer" really mean? Is it just the number of bits reserved for an arbitrary int variable $x$?
How does an $n$-bit number relate to the big-$O(n)$ notation?

Best Answer

We usually represent numbers as finite sequences of digits. In base $2$ each digit is called a bit and has the value $0$ or $1$. So we write each number $a$ as $\sum_{i=0}^{n-1} a_i2^i$, where the $a_i$ are digits; assuming $a_{n-1}\neq0$, the number $a$ is called an $n$-bit number. Notice that if $a$ is an $n$-bit number then $2^{n-1}\leq a<2^n$. So the number of bits of a number $a$ is $O(\log a)$.
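A quick numerical check of this relationship, using Python's built-in `int.bit_length()` (which returns exactly the $n$ defined above):

```python
import math

# For each value a, its bit length n satisfies 2^(n-1) <= a < 2^n,
# so n = floor(log2(a)) + 1 = O(log a).
for a in [1, 5, 16, 255, 1000]:
    n = a.bit_length()
    assert 2 ** (n - 1) <= a < 2 ** n
    assert n == math.floor(math.log2(a)) + 1
    print(f"a = {a}: {n}-bit number")
```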

If an algorithm works on individual digits, then its complexity depends on the length of the given number (by length I mean the number of bits). For instance, how does the addition algorithm work? To add $a$ and $b$, first you add $a_0$ and $b_0$, write down the result bit, carry $1$ if necessary, continue with the next bit, and so on. As a result you do $n$ bit additions, so adding two $n$-bit numbers has complexity $O(n)$.
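The carry procedure above can be sketched directly. This is a minimal illustration (the helper names `add_bits`, `to_bits`, `from_bits` are mine, not standard library functions); numbers are stored as lists of bits, least significant first, and the loop performs exactly one single-bit addition per position:

```python
def add_bits(a_bits, b_bits):
    """Add two numbers given as bit lists (least significant bit first).
    Performs O(n) single-bit additions, where n is the longer length."""
    n = max(len(a_bits), len(b_bits))
    a_bits = a_bits + [0] * (n - len(a_bits))  # pad to equal length
    b_bits = b_bits + [0] * (n - len(b_bits))
    result, carry = [], 0
    for i in range(n):
        s = a_bits[i] + b_bits[i] + carry  # one bit addition per position
        result.append(s % 2)               # write down the result bit
        carry = s // 2                     # carry 1 if necessary
    if carry:
        result.append(carry)
    return result

def to_bits(x):
    """Value -> bit list, least significant bit first."""
    return [int(b) for b in bin(x)[:1:-1]] if x else [0]

def from_bits(bits):
    """Bit list (least significant first) -> value."""
    return sum(b << i for i, b in enumerate(bits))

assert from_bits(add_bits(to_bits(13), to_bits(19))) == 32
```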

So basically, saying that an algorithm takes time $O(n)$ for an $n$-bit number means that it is linear with respect to the length of the given number, and also that it takes time $O(\log N)$ with respect to the value $N$ of the number itself.
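To see the same point numerically: the work for bit-by-bit addition grows with the length $n$ (about $\log_2 N$), not with the value $N$ itself, so a millionfold increase in the value adds only about $20$ bits of work:

```python
# Bit additions needed grow like log2(N), not like N.
for N in [10, 10**3, 10**6, 10**12]:
    print(f"value {N}: roughly {N.bit_length()} bit additions")
```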
