Computing the minimum number of redundancy bits for an error-correcting binary linear code, given the maximum number of errors

abstract-algebra, coding-theory, discrete-mathematics

I understand that the minimum number of redundancy bits $r$ needs to satisfy $2^{r} \geq m + r + 1$, with $m$ being the message length. However, how do I find the minimum number of redundancy bits given that there are at most $x$ errors in the code?
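As a quick sanity check of that inequality (a worked example, not part of the original question): with $m = 4$ message bits, the smallest $r$ satisfying $2^{r} \geq 4 + r + 1$ is $r = 3$, since $2^{3} = 8 \geq 8$; this is exactly the classical Hamming $(7,4)$ code.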

I am trying to solve the following question:

A satellite transmits a message of length $n = 100$, which may contain at most $2$ errors. Prove that at least 13 bits should be redundant.

Obviously, if the message size is at least $n = 100$, that means that:

$$
2^{r} \geq 100 + 2 + 1\\
r \geq \log_2(103)\\
r \geq 7
$$

However, the problem states that at least $13$ bits must be redundant, and my feeling is that this has to do with the fact that the message may contain at most $2$ errors. So I'm wondering: how do I compute the minimum number of redundancy bits given not only the message length, but also the maximum number of errors in the code?

Best Answer

Let $m$ be the number of message bits. Any message can be received correctly in exactly one way, and received with errors in $\binom{100}1+\binom{100}2$ ways. For each possible message $M$, let $S_M$ be the set of strings that could be received when you try to communicate that message; then $|S_M|=1+\binom{100}1+\binom{100}2$. Furthermore, if $M$ and $M'$ are distinct messages, then $S_{M}$ and $S_{M'}$ must be disjoint sets, else $M$ and $M'$ could be confused.
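For concreteness, spelling out the arithmetic that the answer leaves implicit:
$$
|S_M| = 1 + \binom{100}{1} + \binom{100}{2} = 1 + 100 + 4950 = 5051.
$$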

Since the set of all $2^{100}$ strings of length $100$ contains $2^m$ pairwise disjoint sets, each of size $1+\binom{100}1+\binom{100}2$, we conclude $$ 2^m\cdot \left[1+\binom {100}1+\binom{100}2\right]\le 2^{100}. $$ Since $5051 > 2^{12}$, this forces $2^m < 2^{100}/2^{12} = 2^{88}$, hence $m\le 87$, so there must be at least $100-87=13$ redundant bits.
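If it helps to check the general pattern, here is a minimal Python sketch (not part of the original answer; the function name `min_redundancy` is just illustrative) that evaluates the same sphere-packing count for an arbitrary block length $n$ and maximum number of errors $t$:

```python
from math import comb

def min_redundancy(n: int, t: int) -> int:
    """Smallest r such that 2^r is at least the size of a radius-t Hamming
    ball in {0,1}^n, i.e. the sphere-packing lower bound on redundancy."""
    ball = sum(comb(n, i) for i in range(t + 1))  # 1 + C(n,1) + ... + C(n,t)
    r = 0
    while 2 ** r < ball:  # 2^m * ball <= 2^n  is equivalent to  2^r >= ball, with r = n - m
        r += 1
    return r

print(min_redundancy(100, 2))  # prints 13, matching the bound derived above
```

For $n = 100$ and $t = 2$ the ball size is $5051$, and $2^{12} = 4096 < 5051 \le 8192 = 2^{13}$, so the sketch returns $13$, in agreement with the argument above.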
