Does Cyclic Redundancy Check result in more extra bits overhead than Hamming code

coding-theory

For a data word of $16$ bits, which is the most correct claim:

  1. The Hamming code's extra-bit overhead will be greater than the CRC's because the Hamming code corrects one error while the CRC can only detect errors.
  2. The CRC's extra-bit overhead is greater because the Hamming code uses the minimal number of bits for encoding.
  3. The Hamming code's extra-bit overhead is greater only if the degree of the generating polynomial is less than $5$.

First of all, I think point $3)$ is incorrect: when using a CRC you need to add as many check bits to the data as the degree of the polynomial you use, and you are not forced to choose a polynomial of high degree. For example, one could use $x^4+x+1$, which would require only $4$ check bits, while $5$ check bits would be needed to correct a single error using a Hamming code for a $16$-bit word.
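To make the CRC side concrete, here is a minimal sketch in Python (the data word value and the helper name `crc_remainder` are mine, chosen only for illustration). It divides the data word, shifted left by the generator's degree, by $x^4+x+1$ over GF(2); the remainder fits in $4$ bits, which is exactly the number of check bits appended.

```python
# Minimal sketch: CRC check bits as the remainder of XOR (GF(2)) division.
# The generator x^4 + x + 1 is the bit pattern 0b10011; its degree (4)
# equals the number of check bits appended to the data word.

def crc_remainder(data: int, data_bits: int, poly: int, degree: int) -> int:
    dividend = data << degree  # append `degree` zero bits
    for i in range(data_bits + degree - 1, degree - 1, -1):
        if dividend & (1 << i):
            dividend ^= poly << (i - degree)  # cancel the leading bit
    return dividend  # remainder occupies at most `degree` bits

word = 0b1011_0010_1110_0001           # arbitrary 16-bit data word
check = crc_remainder(word, 16, 0b10011, 4)
print(f"check bits: {check:04b}")      # 4 check bits for a degree-4 generator
```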

Thus I think point $1)$ is correct: the Hamming code's overhead is higher because it needs to store more information in order to correct errors. Am I missing something?

Best Answer

The correct answer should be point 2). A generator polynomial $P(x)$ of degree $n$ detects all single-bit errors and all burst errors of up to $n$ bits, so guaranteeing detection of a burst that could span the whole $16$-bit data word would require a generator of degree $16$, i.e. $16$ check bits. A Hamming code, on the other hand, needs only $r$ check bits with $2^r \ge n + r + 1$ for a data word of length $n$ (roughly $\log_2 n$), which is $5$ bits in the example, and it corrects a single error rather than merely detecting it.
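As a quick check on that count, here is a minimal sketch (in Python; the helper name `hamming_check_bits` is mine) that finds the smallest $r$ satisfying $2^r \ge m + r + 1$ for $m$ data bits:

```python
# Minimal sketch: smallest number of Hamming check bits r for m data bits,
# using the standard single-error-correction condition 2**r >= m + r + 1.

def hamming_check_bits(m: int) -> int:
    r = 0
    while 2 ** r < m + r + 1:
        r += 1
    return r

print(hamming_check_bits(16))  # -> 5 check bits for a 16-bit data word
```

So the Hamming overhead for a $16$-bit word is $5$ bits, far less than the $16$ check bits of a degree-$16$ CRC.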
