[Math] Parity check matrix determines minimal distance of a linear code.

coding-theory, proof-verification

Let $C$ be a linear code with parity check matrix $H$. Then $d(C) = d$ if and only if every set of $d-1$ columns of $H$ is linearly independent and some set of $d$ columns of $H$ is linearly dependent.
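
A concrete example (the standard binary $[7,4]$ Hamming code, added here only as an illustration): its parity check matrix

$$H = \begin{pmatrix} 0&0&0&1&1&1&1\\ 0&1&1&0&0&1&1\\ 1&0&1&0&1&0&1 \end{pmatrix}$$

has as columns all non-zero vectors of $\mathbb{F}_2^3$. Any two columns are linearly independent (over $\mathbb{F}_2$ this just means no column is zero and no two columns are equal), while the first three columns sum to zero, so some set of $3$ columns is dependent. The theorem then gives $d(C) = 3$, the familiar minimum distance of the Hamming code.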

The proof in my script works as follows:

Let $v \in C$ and let $i_1, …, i_u$ be the positions of the non-zero components of $v$. Then the columns $i_1, …, i_u$ of the parity check matrix must be linearly independent. When we choose a vector of weight $d$ for $v$, we obtain $d$ linearly independent columns of $H$. On the other hand, the condition $u \ge d$ must hold if every set of $d-1$ columns is linearly independent.

I really don't see how this proof works. I understand why the columns $i_1, …, i_u$ of the parity check matrix must be linearly independent when $i_1, …, i_u$ are the positions of the non-zero components of $v$, but the rest of the proof doesn't tell me anything.

Best Answer

The proof seems wrong. It should be "Then the columns $i_1,...,i_u$ of the parity check matrix must be linearly dependent." Also, one should treat the null codeword separately. Corrected:

Let $v \in C$, $v \ne 0$, and let $i_1, ..., i_u$ be the positions of the non-zero components of $v$. Then the columns $i_1, ..., i_u$ of the parity check matrix must be linearly dependent. When we choose a vector of weight $d$ for $v$, we obtain $d$ linearly dependent columns of $H$. On the other hand, the condition $u \ge d$ must hold if every set of $d-1$ columns is linearly independent.

I agree that it can be confusing. I would rather prove it thus:

First assume $d(C)=d$. Then there is some codeword $c_0$ with weight $d$. Because $c_0 H^t = 0$, the $d$ columns of $H$ corresponding to the non-zero elements of $c_0$ are LD (linearly dependent). Furthermore, there cannot be a set of $m$ LD columns with $0<m<d$: in that case we could write $x H^t = 0$ for some non-zero vector $x$ with at most $m<d$ non-zero elements - but the vectors that satisfy that equation are precisely the codewords, and there is no (non-null) codeword with weight less than $d$.
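
Spelled out, with $h_i$ denoting the $i$-th column of $H$ (notation not used above): if the non-zero entries of $c_0$ sit in positions $i_1, \dots, i_d$, then

$$c_0 H^t = 0 \iff H c_0^t = 0 \iff (c_0)_{i_1} h_{i_1} + \dots + (c_0)_{i_d} h_{i_d} = 0,$$

and since every coefficient $(c_0)_{i_j}$ is non-zero, this is exactly a non-trivial linear dependence among those $d$ columns.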

The other direction goes similarly. If there is some set of $d$ LD columns, but no such set of $m$ columns with $0<m<d$, then the (non-null) solutions of $x H^t = 0$ have minimum weight $d$ - but these are precisely the codewords, hence the minimum distance is $d$.
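
If a numerical sanity check helps, here is a minimal brute-force sketch of the equivalence (the helper names and the NumPy/itertools approach are mine; it works over $\mathbb{F}_2$ only and is feasible only for tiny codes such as the $[7,4]$ Hamming code used above):

```python
import itertools

import numpy as np

# Parity check matrix of the [7,4] Hamming code: the columns are
# all non-zero binary vectors of length 3.
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]])

def min_weight_codeword(H):
    """Smallest weight of a non-zero x with H x^t = 0 over GF(2),
    i.e. the minimum distance d(C) by definition."""
    n = H.shape[1]
    best = None
    for bits in itertools.product([0, 1], repeat=n):
        x = np.array(bits)
        if x.any() and not ((H @ x) % 2).any():   # non-zero codeword
            w = int(x.sum())
            best = w if best is None else min(best, w)
    return best

def min_dependent_columns(H):
    """Smallest d such that some d columns of H are linearly dependent
    over GF(2); a minimal dependent set is exactly a set of columns
    summing to zero mod 2."""
    n = H.shape[1]
    for d in range(1, n + 1):
        for cols in itertools.combinations(range(n), d):
            if not (H[:, list(cols)].sum(axis=1) % 2).any():
                return d
    return None

print(min_weight_codeword(H))     # 3
print(min_dependent_columns(H))   # 3
```

Both computations agree, as the theorem predicts.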