Code with only one codeword is perfect.

coding-theory

I am reading A First Course in Coding Theory by Raymond Hill.

A $q$-ary $(n,M,2t+1)$-code such that $$M\left\{\binom{n}{0}+\binom{n}{1}(q-1)+\dots+\binom{n}{t}(q-1)^t\right\}=q^n$$ is called a perfect code.

Consider a code with only a single codeword, that is, a $q$-ary $(n,1,d)$-code.
How is the minimum distance of this code defined, given that there is only one codeword in the code?

From the question Codes consisting of one codeword are perfect, it seems that $t=n$, so $d=2n+1$. But I cannot see why that is true.

Best Answer

Well, the minimum distance is defined as $\min\{d(u,v): u, v \in C, u \neq v\}$, and for a one-codeword code the set over which we take the minimum is empty, and $\min \emptyset$ is not defined.

The decoding algorithm is trivial: whatever you receive, output the one codeword. The code is perfectly $n$-error-correcting, because you can change any number of the $n$ components of the received vector and the decoding result is still correct.
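
For illustration, here is a minimal sketch (not from the book) of that trivial decoder, with an arbitrary choice of $q$, $n$ and the single codeword; it also checks that every possible received word lies within distance $n$ of the codeword:

```python
# Sketch of the trivial decoder for a q-ary (n, 1, d)-code.
# The parameters q, n and the choice of codeword below are illustrative.
from itertools import product

q, n = 3, 4                       # illustrative alphabet size and length
codeword = (0,) * n               # the unique codeword

def decode(received):
    """Whatever is received, output the one codeword."""
    return codeword

def hamming_distance(u, v):
    return sum(a != b for a, b in zip(u, v))

# Every word in F_q^n lies within Hamming distance n of the codeword,
# so the single sphere of radius n covers the whole space and decoding
# always returns the transmitted codeword.
assert all(hamming_distance(w, codeword) <= n
           for w in product(range(q), repeat=n))
```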

For less trivial perfect codes, the minimum distance is $2t+1$ precisely when we can always uniquely correct up to $t$ errors in any transmitted codeword. That is the real definition of what it means for a code to be $t$-perfect, and the stated formula is a corollary of it. So, since this trivial code corrects $n$ errors, it makes sense to define its minimum distance to be $2n+1$ as well, so that all perfect codes obey the same rule.
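
For a concrete non-trivial example (not in the original question): the binary repetition code $C=\{000,111\}$ is a $(3,2,3)$-code with $t=1$, and it satisfies the same sphere-packing equality,
$$2\left\{\binom{3}{0}+\binom{3}{1}(2-1)\right\}=2(1+3)=8=2^3.$$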

Note that indeed $$\sum_{i=0}^n \binom{n}{i}(q-1)^i=\bigl((q-1)+1\bigr)^n=q^n$$ by the binomial theorem, so $t=n$ is consistent with your first formula as well when $M=1$.
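
As a quick sanity check (my own sketch, with a few arbitrarily chosen pairs $(q,n)$), the sphere-packing equality with $M=1$ and $t=n$ can be verified numerically:

```python
# Verify M * sum_{i=0}^{t} C(n, i) * (q-1)^i = q^n  for M = 1, t = n,
# i.e. the binomial theorem, for a few illustrative (q, n) pairs.
from math import comb

for q, n in [(2, 3), (3, 4), (5, 2)]:
    sphere_size = sum(comb(n, i) * (q - 1) ** i for i in range(n + 1))
    assert sphere_size == q ** n      # holds by the binomial theorem
```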
