[Math] Cyclic Redundancy Check of an input sequence using MATLAB

coding-theory, MATLAB

I understand how to mathematically calculate the cyclic redundancy check (CRC) of a sequence of bits; for example, if the CRC has length 16,

$$\mathrm{CRC}(D) = \bigl(M(D) + I(D)\bigr)\,D^{16} \bmod G(D)$$

where $M(D)$ is the polynomial of the incoming bits, $I(D)$ is the polynomial of the initialization bits, and $G(D)$ is the generator polynomial, so the CRC is the remainder of the long-division operation. Here $+$ denotes the XOR operation.
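To make the formula concrete, the sketch below (my own illustration, not a reference implementation) appends 16 zeros to the message to multiply by $D^{16}$, adds $I(D)$ to the 16 leading message bits (my reading of the all-ones initialization), and then takes the remainder by mod-2 long division. The vector `msg` is a hypothetical example input and the generator used is $G(D)=D^{16}+D^{12}+D^{5}+1$:

 % Sketch only: direct translation of CRC(D) = (M(D)+I(D))*D^16 mod G(D),
 % assuming MSB-first row vectors of 0/1 values.
 msg  = [1 0 1 1 0 0 1 0 1 1 1 0 0 1 0 1 1 0 1 0 0 1 1 0];  % hypothetical M(D)
 init = ones(1, 16);                                         % all-ones I(D)
 gen  = [1 0 0 0 1 0 0 0 0 0 0 1 0 0 0 0 1];                 % G(D) = D^16+D^12+D^5+1

 dividend = [msg zeros(1, 16)];                % multiply by D^16 (append 16 zeros)
 dividend(1:16) = xor(dividend(1:16), init);   % add I(D) to the leading bits

 % mod-2 long division: wherever the leading bit of the running remainder is 1,
 % subtract (XOR) a copy of G(D) shifted to that position
 for k = 1:length(msg)
     if dividend(k)
         dividend(k:k+16) = xor(dividend(k:k+16), gen);
     end
 end
 crc = dividend(end-15:end);                   % remainder = the 16 CRC bits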

I am trying to understand how to program this and compute the CRC of some bits in MATLAB. My generator polynomial is
$$G(D)= D^{16}+D^{12}+D^5+1$$ with an all-ones initialization sequence applied to the incoming bits. I found the following example online:

 CRCInit = [1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1];   % all-ones initialization

 [r, c] = size(bits);           % bits is a row vector of 0/1 message bits
 CRCbits = CRCInit;             % 16-bit shift register
 dataout = [];
 for k1 = 1:c
    C1  = CRCbits(1);           % bit about to be shifted out (highest-order bit)
    C5  = CRCbits(5);
    C12 = CRCbits(12);

    C16 = xor(C1, bits(k1));    % feedback bit = outgoing bit XOR incoming bit
    C4  = xor(C5, C16);         % feedback folded in at the D^12 tap
    C11 = xor(C12, C16);        % feedback folded in at the D^5 tap

    CRCbits = [CRCbits(2:end) C16];   % shift left; feedback enters at the end
    CRCbits(4)  = C4;                 % overwrite the tap positions
    CRCbits(11) = C11;                % (old positions 5 and 12 after the shift)
    dataout = [dataout bits(k1)];     % pass the message bit through unchanged
 end

What I don't understand is how this program carries out that operation. I am looking for a few words explaining why this program computes the CRC.

Thanks for your help

Best Answer

This is my understanding of how XOR is used in polynomial division:

Consider a sequence with polynomial $x^4+x^3+x^2+1$ that you want to divide by $x^2+1$, for example.

$$x^4+x^3+x^2+1=x^2(x^2+1)+x^3+1=x^2(x^2+1)+x(x^2+1)-x+1$$ The remainder is $-x+1\equiv x+1 \pmod{2}$

The division process is equivalent to the following operation on the binary representations:

 1 1 1 0 1  >>>>x^4+x^3+x^2+1
 1 0 1      >>x^2(x^2+1)
 ---------  XORing bits
 0 1 0 0 1  >>>x^3+1 >>remainder of first step
   1 0 1    >>x(x^2+1)
 ---------  XORing bits
 0 0 0 1 1  >>>x+1   >>remainder of 2nd step

First $x^2(x^2+1)$ is subtracted from $x^4+x^3+x^2+1$, then $x(x^2+1)$, and so on; XOR performs the subtraction mod 2.
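
As a cross-check, the same mod-2 long division can be written in a few lines of MATLAB (a sketch of my own, not the code from the question); running it on the example above reproduces the remainder $x+1$:

 % Sketch: mod-2 long division of x^4+x^3+x^2+1 by x^2+1 (bits MSB first)
 dividend = [1 1 1 0 1];       % x^4+x^3+x^2+1
 divisor  = [1 0 1];           % x^2+1
 n = length(divisor);
 remBits = dividend;
 for k = 1:length(dividend) - n + 1
     if remBits(k)             % leading bit is 1: XOR in a shifted copy of the divisor
         remBits(k:k+n-1) = xor(remBits(k:k+n-1), divisor);
     end
 end
 remBits(end-n+2:end)          % last n-1 bits are the remainder: [1 1], i.e. x+1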