Information Theory – Has Error Correction Been Solved?

coding-theory · information-theory · soft-question

I recently came across Dan Piponi's blog post An End to Coding Theory and it left me very confused. The relevant portion is:

But in the sixties Robert Gallager looked at generating random sparse syndrome matrices and found that the resulting codes, called Low Density Parity Check (LDPC) codes, were good in the sense that they allowed messages to be transmitted at rates near the optimum rate found by Shannon – the so-called Shannon limit. Unfortunately the computers of the day weren't up to the task of finding the most likely element of M from a given element of C. But now they are. We now have near-optimal error correcting codes and the design of these codes is ridiculously simple. There was no need to use exotic mathematics, random matrices are as good as almost anything else. The past forty years of coding theory has been, more or less, a useless excursion. Any further research in the area can only yield tiny improvements.

In summary, he states that LDPC codes achieve rates so close to channel capacity that further improvements would not be worthwhile.

So, my question is: what does modern research in error-correcting codes entail? I noticed that Dan did not say which channel's capacity LDPC codes approach, so perhaps there are channels on which LDPC codes do not work well? What other directions does modern research in the field explore?

Best Answer

This is largely true in the sense that long LDPC codes come within a fraction of a dB of the Shannon limit on the AWGN channel. Their design (by a process called density evolution - don't look at me, I don't know the math) also seems to be relatively simple. It relies on stochastic methods as opposed to the algebra and combinatorics of finite fields/rings.
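
To make the density evolution remark slightly more concrete, here is a toy sketch of my own (not how practical AWGN designs are actually produced): for a (dv, dc)-regular ensemble on the binary erasure channel the whole analysis reduces to a one-line recursion on the erasure probability of the messages, and the design threshold is the largest channel erasure probability for which that recursion is driven to zero.

```python
# Minimal density-evolution sketch for a (dv, dc)-regular LDPC ensemble on
# the binary erasure channel (BEC). The AWGN analysis used for practical
# designs is more involved; this is only the textbook toy case.

def bec_de_converges(eps, dv=3, dc=6, iters=10000, tol=1e-12):
    """True if the message erasure probability is driven to zero at channel erasure rate eps."""
    x = eps  # erasure probability of variable-to-check messages
    for _ in range(iters):
        # check-node update followed by variable-node update
        x = eps * (1.0 - (1.0 - x) ** (dc - 1)) ** (dv - 1)
        if x < tol:
            return True
    return False

def bec_threshold(dv=3, dc=6):
    """Bisect for the largest channel erasure probability the ensemble tolerates."""
    lo, hi = 0.0, 1.0
    for _ in range(40):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if bec_de_converges(mid, dv, dc) else (lo, mid)
    return lo

rate = 1 - 3 / 6                                         # design rate of the (3,6)-regular ensemble
print("Shannon limit (BEC, rate 1/2):", 1 - rate)        # capacity condition: eps < 1 - R
print("DE threshold of (3,6) codes  :", round(bec_threshold(), 4))  # about 0.429
```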

My limited experience only allows me to make the following remarks:

  1. The block length of an LDPC code needs to be relatively large. This is one of the reasons they were not chosen for the LTE standard (the European/Asian 4G cellular standard). My understanding of the 'best' code in a radio channel, as a function of the block length, is: less than 100 bits => use an algebraic block code, 40-400 bits => use a convolutional code, 300-20000 bits => use a turbo code, 5000+ bits => use an LDPC code (the intervals overlap, as I am not willing to bet the family fortune on any one of them). IIRC this is how it is done in LTE (release 9), except that in that cellular standard the longest data block has 6144 bits (so it is handled with a turbo code).
  2. The design of an LDPC code needs to take into account that it is usually not practical to base it on a random Tanner graph with the prescribed weight distribution (see the first sketch after this list), in particular when the decoder is supposed to run on a handheld device: the routing problem of the belief propagation algorithm becomes prohibitive otherwise. So the LDPC codes specified for radio communication standards a few years back (DVB-S2/T2 in Europe, MediaFLO in the US) were designed around a particular decoding circuitry, or rather the Tanner graphs had to come from a certain kind of family. I have never been truly up to speed with these aspects, so some progress may have been made.
  3. When higher-order modulation is used, there are further problems and considerations to take into account. It is not entirely obvious that we should stick to LDPC codes based on bits alone in that case, because the channel creates dependencies among the log-likelihood ratios (LLRs) of the individual bits. But the BP algorithm becomes very complicated if we want to feed it something other than LLRs of individual bits (see the second sketch after this list).
  4. Getting rid of error floors is still a challenge. Alas, I'm not up to speed with that either. A possible workaround is to use a concatenated code with a lightweight algebraic outer code that mops up the residual errors.
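
First sketch: to show what a "random Tanner graph with the prescribed weight distribution" (item 2) might look like, here is a sketch of mine that draws a (dv, dc)-regular parity-check matrix by randomly matching edge sockets. It is purely illustrative and ignores everything that makes standardized codes implementable, such as quasi-cyclic structure and the removal of short cycles.

```python
# A minimal sketch of drawing a random (dv, dc)-regular Tanner graph by
# randomly matching edge "sockets" (a configuration-model construction).
# Function name and parameters are illustrative; real standardized codes
# use structured (e.g. quasi-cyclic) graphs for the routing/hardware
# reasons above and also avoid short cycles, none of which is done here.

import random

def random_regular_parity_check(n, dv=3, dc=6, seed=0):
    """Return an m x n 0/1 parity-check matrix with column weight dv and row weight dc."""
    assert (n * dv) % dc == 0, "n*dv must be divisible by dc"
    m = n * dv // dc
    rng = random.Random(seed)

    # one socket per edge end on each side, then a random matching between the sides
    var_sockets = [j for j in range(n) for _ in range(dv)]
    chk_sockets = [i for i in range(m) for _ in range(dc)]
    rng.shuffle(chk_sockets)

    H = [[0] * n for _ in range(m)]
    for j, i in zip(var_sockets, chk_sockets):
        H[i][j] = 1   # parallel edges simply collapse; a real design would resample
    return H

H = random_regular_parity_check(n=24)
print(len(H), "checks x", len(H[0]), "variables")
```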
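
Second sketch: to show the per-bit LLRs of item 3 concretely, here is another sketch of mine (with an illustrative Gray mapping) that computes exact bit LLRs for a single received 16-QAM symbol on an AWGN channel; these are the values a binary BP decoder would be fed. The four LLRs of one symbol are handed over as if they were independent, which is exactly the simplification the item questions.

```python
# Exact per-bit LLRs for one noisy Gray-mapped 16-QAM symbol over AWGN.
# The mapping, noise model and names are illustrative only.

import math

GRAY_4PAM = {(0, 0): -3.0, (0, 1): -1.0, (1, 1): +1.0, (1, 0): +3.0}

# 16-QAM: bits (b0, b1) select the I amplitude, bits (b2, b3) the Q amplitude
CONSTELLATION = {
    (b0, b1, b2, b3): complex(GRAY_4PAM[(b0, b1)], GRAY_4PAM[(b2, b3)])
    for b0 in (0, 1) for b1 in (0, 1) for b2 in (0, 1) for b3 in (0, 1)
}

def bit_llrs(y, noise_var):
    """LLRs  ln P(b_k = 0 | y) / P(b_k = 1 | y)  for one received complex sample y."""
    llrs = []
    for k in range(4):
        num = sum(math.exp(-abs(y - s) ** 2 / noise_var)
                  for bits, s in CONSTELLATION.items() if bits[k] == 0)
        den = sum(math.exp(-abs(y - s) ** 2 / noise_var)
                  for bits, s in CONSTELLATION.items() if bits[k] == 1)
        llrs.append(math.log(num / den))
    return llrs

# a sample received near the constellation point 3 - 1j
print([round(l, 2) for l in bit_llrs(complex(2.7, -0.9), noise_var=1.0)])
```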

Caveat: I have only studied this cursorily (enough to implement an LDPC decoder and simulate/analyze some of the codes). These are recollections and impressions rather than hard facts. You can get the minutes of the relevant meetings of the working group in charge of this within LTE from the web pages of 3GPP (the Third Generation Partnership Project).

Other types of codes will remain useful for specific purposes. There is no substitute for RS (Reed–Solomon) codes, or AG (algebraic-geometry) codes if you need longer blocks, when you only have hard byte errors. LDPC (and turbo) codes are designed with soft bit errors (= reliability estimates of the received bits) in mind, which is a valid assumption in all wireless communication. Network coding also appears to be a hot topic currently; there the code operates at the level of data packets that may or may not arrive at all. Also look up Raptor codes if this type of problem interests you.

Algebraic coding theory is not dead yet, but it now concentrates on problems other than designing ECCs for bulk data transmission. Algebraic machinery yielding LDPC code designs guaranteed to have very low error floors (= high minimum Hamming distance) may be out there, but I don't know what it would be.
