I recently came across Dan Piponi's blog post An End to Coding Theory and it left me very confused. The relevant portion is:
But in the sixties Robert Gallager looked at generating random sparse syndrome matrices and found that the resulting codes, called Low Density Parity Check (LDPC) codes, were good in the sense that they allowed messages to be transmitted at rates near the optimum rate found by Shannon – the so-called Shannon limit. Unfortunately the computers of the day weren't up to the task of finding the most likely element of M from a given element of C. But now they are. We now have near-optimal error correcting codes and the design of these codes is ridiculously simple. There was no need to use exotic mathematics, random matrices are as good as almost anything else. The past forty years of coding theory has been, more or less, a useless excursion. Any further research in the area can only yield tiny improvements.
In summary, he states that the rate of LDPC codes is very near channel capacity—so near that further improvements would not be worthwhile.
So, my question is: What does modern research in error-correcting codes entail? I noticed that Dan did not mention what channel the rate of LDPC codes approach the capacity of, so maybe there exist channels that LDPC codes don't work well on? What other directions does modern research in the field explore?
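For concreteness, the random sparse parity-check construction the quoted passage describes can be sketched as follows. This is a toy illustration only, not a practical LDPC design (real constructions control degree distributions and cycle structure); the function names and the fixed column weight of 3 are my own choices:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_sparse_parity_check(n, m, col_weight=3):
    """Toy LDPC-style construction: an m x n binary matrix where each
    of the n columns gets `col_weight` ones in randomly chosen rows."""
    H = np.zeros((m, n), dtype=np.uint8)
    for col in range(n):
        rows = rng.choice(m, size=col_weight, replace=False)
        H[rows, col] = 1
    return H

def syndrome(H, word):
    """Syndrome over GF(2); an all-zero syndrome means `word` satisfies
    every parity check, i.e. it is a codeword."""
    return (H @ word) % 2

n, m = 20, 10
H = random_sparse_parity_check(n, m)
zero = np.zeros(n, dtype=np.uint8)
print(syndrome(H, zero))  # the all-zero word is always a codeword
```

The hard part Piponi alludes to is not building H but decoding: finding the most likely codeword given a noisy received word, which is what iterative message-passing decoders made feasible.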
Best Answer
This is largely true in the sense that long LDPC codes come within a fraction of a dB of the Shannon limit on the AWGN channel. Their design (by a process called density evolution - don't look at me, I don't know the math) also seems to be relatively simple. It relies on probabilistic methods rather than the algebra and combinatorics of finite fields and rings.
My limited experience only allows me to make the following remarks:
Caveat: I have only studied this cursorily (enough to implement an LDPC decoder and simulate/analyze some of the codes). These are recollections and impressions rather than hard facts. You can get the minutes of the relevant meetings of workgroup in charge of this within LTE from the web pages of 3GPP (=third generation partnership project).
Other types of codes will remain useful for specific purposes. There is no substitute for RS codes (or AG codes, if you need longer blocks) when you only have hard byte errors. LDPC (and turbo) codes are designed with soft bit errors (= reliability estimates of received bits) in mind, which is a valid assumption in all wireless communication. Network coding also appears to be a hot topic currently; there the code operates at the level of data packets that may or may not arrive at all. Also look up Raptor codes if this type of problem is of interest to you.
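The "soft bit errors (= reliability estimates)" mentioned above are usually expressed as log-likelihood ratios. For BPSK over AWGN the standard formula is LLR = 2y/σ²: the sign gives the hard decision, the magnitude the reliability. A small sketch (my own illustration; the mapping bit 0 → +1, bit 1 → −1 is an assumed convention):

```python
def bpsk_llr(y, noise_var):
    """Log-likelihood ratio of a received sample y for BPSK over AWGN
    with noise variance noise_var, assuming bit 0 -> +1, bit 1 -> -1.
    Positive favors bit 0, negative favors bit 1; larger magnitude
    means a more reliable observation."""
    return 2.0 * y / noise_var

# A sample near 0 is unreliable; one far from 0 is confident.
print(bpsk_llr(0.1, 1.0))   # weakly favors bit 0
print(bpsk_llr(-1.5, 1.0))  # strongly favors bit 1
```

A hard-decision decoder (as used with RS codes) would keep only the sign of each sample; soft-decision decoders like LDPC belief propagation exploit the magnitudes too, which is where much of their gain comes from.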
Algebraic coding theory is not dead yet, but it now concentrates on problems other than designing ECCs for bulk data transmission. Algebraic machinery yielding LDPC code designs guaranteed to have very low error floors (= high minimum Hamming distance) may be out there, but I don't know what it would be.
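The minimum Hamming distance mentioned here can be brute-forced for tiny codes: for a linear code it equals the minimum weight of a nonzero codeword. A toy sketch using the classical [7,4] Hamming code (my own illustration; infeasible for real LDPC block lengths, which is exactly why guarantees from algebraic structure would be valuable):

```python
import itertools

import numpy as np

# Systematic generator matrix of the [7,4] Hamming code.
G = np.array([
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
], dtype=np.uint8)

def min_distance(G):
    """Brute-force minimum distance of a binary linear code: enumerate
    all 2^k codewords and take the minimum nonzero Hamming weight."""
    k = G.shape[0]
    best = None
    for msg in itertools.product([0, 1], repeat=k):
        cw = (np.array(msg, dtype=np.uint8) @ G) % 2
        w = int(cw.sum())
        if w > 0 and (best is None or w < best):
            best = w
    return best

print(min_distance(G))  # 3: the Hamming code corrects any single error
```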