Minimum distance of a linear code and its relation to a control matrix of the code

coding-theory, discrete-mathematics

I am trying to prove that the minimum distance of a code $\mathcal{C}$ is $d$ if and only if $d$ is the largest integer such that any $d-1$ columns of $H$ are linearly independent (where $\mathcal{C}$ is an $[n,k,d]$ linear code and $H$ is a control matrix of $\mathcal{C}$). I have read other answers related to this result, but I am not getting the point at all… if someone could check my attempts I would be very thankful.
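
To fix ideas, here is a minimal computational sketch of the statement, taking as a toy example (an assumption, not part of the problem) the standard parity-check matrix of the binary $[7,4,3]$ Hamming code, whose columns are all nonzero vectors of $\mathbb{F}_2^3$. Over $\mathbb{F}_2$, a set of columns is linearly dependent exactly when some nonempty subset of them sums to zero, which is what the brute-force check exploits.

```python
# A minimal numerical sketch: take the standard parity-check matrix H of the
# binary [7,4,3] Hamming code and compute the largest d such that ANY d-1
# columns of H are linearly independent. Over GF(2), a set of columns is
# linearly dependent iff some nonempty subset of it sums to zero mod 2.
import itertools
import numpy as np

# Columns of H are all nonzero vectors of F_2^3 (column j = binary digits of j).
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def columns_dependent(H, cols):
    """True iff the chosen columns of H are linearly dependent over GF(2)."""
    for r in range(1, len(cols) + 1):
        for sub in itertools.combinations(cols, r):
            if not (H[:, list(sub)].sum(axis=1) % 2).any():
                return True
    return False

n = H.shape[1]
d = 1
# Increase d while every set of d columns is still linearly independent; the
# loop stops at the smallest size admitting a dependent set, which is exactly
# the largest d with "any d-1 columns linearly independent".
while d < n and not any(columns_dependent(H, c)
                        for c in itertools.combinations(range(n), d)):
    d += 1
print(d)  # prints 3, the minimum distance of the Hamming code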

I started by supposing that $d$ is the largest integer such that any $d-1$ columns of $H$ are linearly independent, and from this hypothesis I want to deduce that $d$ is the minimum distance of the code. To see that $d=d(\mathcal{C})$, I will show:

  • $\boxed{d(\mathcal{C})\geq d}$ For this inequality, I take $c=(\alpha_{1},…,\alpha_{n})\in\mathcal{C}\setminus\{\mathbf{0}\}$ and name the columns of $H$ as $h_{1},…,h_{n}$. Since $H$ is a control matrix, $Hc^{T}=(0,…,0)^{T}$, which directly implies that $\sum_{i=1}^{n} \alpha_{i}h_{i}=0$. Proceeding by contradiction, suppose that $\omega(c)\leq d-1$ (where $\omega(c)$ is the weight of $c$, i.e., the number of components of $c$ different from $0$). Then there are indices $i_{1},…,i_{r}\in\{1,…,n\}$ with $r\leq d-1$ such that $\sum_{l=1}^{r}\alpha_{i_{l}}h_{i_{l}}=0$ with $\alpha_{i_{1}},\alpha_{i_{2}},…,\alpha_{i_{r}}\neq 0$. This gives a set of at most $d-1$ columns of $H$ that is linearly dependent, contradicting the hypothesis that every set of $d-1$ columns of $H$ is linearly independent; so we can conclude that $\omega(c)\geq d$. As $c$ was an arbitrary non-zero word of the code, and knowing that the minimum distance of a linear code is the minimum weight attained by its non-zero words, we conclude that $d(\mathcal{C})=\displaystyle\min_{c\in\mathcal{C}\setminus\{\mathbf{0}\}}\omega(c)\geq d$.
  • $\boxed{d\geq d(\mathcal{C})}$ If I now take all the columns of $H$, any combination of $d-1$ of them forms a linearly independent family, and any combination of $\geq d$ of them forms a linearly dependent family (this is by hypothesis); it is easy to show that if $\beta_{1}h_{1}+\beta_{2}h_{2}+…+\beta_{n}h_{n}=\mathbf{0}$, then $\beta_{i}=0$ for $n-d$ indices while the remaining $\beta_{i}$ need not be zero. Thus I would conclude as follows:
    • If I take $\tilde{c}=(\beta_{1},…,\beta_{n})$, then $H\tilde{c}^{T}=\mathbf{0}$, so, as $H$ is a control matrix, $\tilde{c}\in\mathcal{C}\setminus \{\mathbf{0}\}$.
    • Obviously $\omega(\tilde{c})\leq d$ (as at most $d$ of the $\beta_{i}$ are non-zero), and as $d(\mathcal{C})=\displaystyle\min_{c\in\mathcal{C}\setminus\{\mathbf{0}\}}\omega(c)\leq \omega(\tilde{c})$, we can conclude that $d(\mathcal{C})\leq d$ (a numerical cross-check of both inequalities is sketched right after this list).
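
To cross-check both inequalities numerically on the same toy Hamming-code example, one can also compute $d(\mathcal{C})$ directly as the minimum weight of the nonzero codewords, using an (assumed) generator matrix $G$ compatible with the parity-check matrix $H$ from the sketch above, i.e. $GH^{T}=0$ over $\mathbb{F}_2$:

```python
# Cross-check on the [7,4] Hamming-code toy example: compute d(C) directly as
# the minimum weight over all nonzero codewords and compare it with the value
# d = 3 obtained from the columns of H.
import itertools
import numpy as np

# A generator matrix G compatible with the parity-check matrix H used above
# (each row satisfies H @ row = 0 over GF(2)).
G = np.array([[1, 1, 1, 0, 0, 0, 0],
              [1, 0, 0, 1, 1, 0, 0],
              [0, 1, 0, 1, 0, 1, 0],
              [1, 1, 0, 1, 0, 0, 1]])

# Enumerate the 2^4 - 1 nonzero messages, encode them, and take the minimum weight.
min_weight = min(
    int(((np.array(msg) @ G) % 2).sum())
    for msg in itertools.product([0, 1], repeat=4)
    if any(msg)
)
print(min_weight)  # prints 3, so d(C) = 3 = d, matching both inequalities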

(*) Then, I will suppose that $d=d(\mathcal{C})$; it seems to me that I must proceed by contradiction, so, writing $p:$ ''$d$ is the biggest integer that satisfies that any $d-1$ columns of $H$ are linearly independent'', I assume that I would have to suppose $\lnot p:$ ''$d$ is not the biggest integer that satisfies that any $d-1$ columns of $H$ are linearly independent''… But I am quite stuck, and I don't know how to continue…

SUMMING UP THE PROBLEMS I HAVE FOUND:

  1. (*) I don't really know how to prove this direction of the ''if and only if''

Thanks in advance!

Best Answer

Denote by $d(H)$ the largest integer such that any $d(H)-1$ columns of $H$ are linearly independent. You have shown that the two integers $d(\mathcal{C})$ and $d(H)$ are identical, i.e., $d(H) = d(\mathcal{C})$. So you are done, as this is the same as saying that $d(H)=d\Longleftrightarrow d(\mathcal{C}) = d$ for any integer $d$.
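
Spelling this out: for every integer $d$,
$$d(H)=d(\mathcal{C}) \;\Longrightarrow\; \bigl(d(H)=d \iff d(\mathcal{C})=d\bigr),$$
since either side of the inner equivalence just asserts that the common value of $d(H)$ and $d(\mathcal{C})$ equals $d$; and this inner equivalence is exactly the ''if and only if'' asked about in the question.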

There is however one small technical flaw in your argument (which luckily does not affect the proof). You write, "any combination of $\geq d$ of them forms a linearly dependent family (this is by hypothesis)". This is however not true: there might be $d$ columns which are linearly independent, since $d$ is defined as the largest integer such that any $d-1$ columns are linearly independent.
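
For a concrete illustration (taking, say, the binary $[7,4,3]$ Hamming code with the parity-check matrix whose columns are all nonzero vectors of $\mathbb{F}_2^3$): the columns in positions $1$, $2$ and $4$ are $(1,0,0)^{T}$, $(0,1,0)^{T}$ and $(0,0,1)^{T}$, which are linearly independent even though $d=3$. The maximality of $d$ only guarantees that some set of $d$ columns is linearly dependent.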
