Channel coding, also known as error control coding, is a foundational building block of almost all modern communication systems. Over the decades there has been a long list of champions and pretenders vying for the crown of supreme code du jour, or perhaps more accurately, code de la génération. As we approach our fifth generation of wireless, is there anything left for the information theory gang to do? Have we pushed this frontier to its limits?
I would suggest not. The shifting requirements of 5G point to a little renaissance period coming in channel coding. But first, a look at how we got here.
Channel coding history
Channel coding is one of the main reasons our wireless networks work the way we want them to: fast and error free. The general idea is simple. First, pad the information bits at the source node with some redundant bits before transmission over the communication medium. Then, at the receiving end, exploit that redundancy to overcome the side effects of the channel, e.g. randomness, noise and interference.
This is a simplification, but the whole challenge of the decades-long channel coding research effort has been to develop methods that create and exploit such redundancy as close to perfection as possible. That perfection was defined by Claude Shannon in 1948, whose classic work told us just how many error-free bits we could ever hope to send through a noisy, bandlimited channel.
One of the very first breakthroughs in channel codes, the so-called Golay codes, was introduced in 1949, and a practical implementation deployed on NASA's Voyager 1 enabled hundreds of color pictures of Jupiter and Saturn to be sent back to Earth. The following decades saw a quantum leap in the performance of wireless communications, driven primarily by the introduction of Convolutional codes by Elias in 1955. The key trick was a continuous encoding mechanism at the transmitter and Trellis-based decoding at the receiver, most famously via the well-known Viterbi algorithm.
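The "continuous encoding" idea can be sketched in a few lines: each input bit is shifted into a small memory window, and every window position emits parity bits computed against fixed generator patterns. The generators below are the classic (7, 5) octal textbook pair, chosen for illustration rather than taken from any particular standard:

```python
def conv_encode(bits, g1=0b111, g2=0b101):
    """Rate-1/2 convolutional encoder, constraint length 3.
    Each input bit produces two output bits: the parities of the
    current 3-bit window masked by each generator polynomial."""
    state = 0
    out = []
    for b in bits:
        state = ((state << 1) | b) & 0b111          # slide the new bit into the window
        out.append(bin(state & g1).count("1") % 2)  # parity against generator 1
        out.append(bin(state & g2).count("1") % 2)  # parity against generator 2
    return out

print(conv_encode([1, 0, 1, 1]))  # -> [1, 1, 1, 0, 0, 0, 0, 1]
```

Because each output depends on the last few inputs, the encoder traces a path through a small state machine, and the Viterbi decoder recovers the message by finding the most likely path through that trellis of states.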
This radical shift offered substantial performance gains, but at the cost of increased processing complexity and power consumption. Supported over time by the ever-increasing computational capability provided by Moore's law, along with more power-efficient circuitry, Convolutional codes ascended to become the de facto codes for 2G mobile, digital video and satellite communications.
Then came Turbo codes. Their introduction by Berrou in 1993 sent shockwaves through the telecommunications community, because for the very first time we had a channel code that performed close to Shannon's limit. Their relatively low complexity for the performance they offer put Turbo codes at the core of the digital and mobile revolution (3G/4G) that started in the early 2000s.
Everyone sighed and said we are all done here, but then a funny thing happened. Around 1999 there was an interesting rediscovery of low-density parity-check (LDPC) codes, which everyone had forgotten also worked really well. These codes were originally invented by Gallager in 1963, meaning that by 1999 the technology was largely available patent free, a nice differentiator compared to Turbo codes, which were licensed by France Telecom until patent expiry in 2013.
Today: Turbo codes vs. LDPC codes
This brings us to where we are today: an ongoing heavyweight tussle between Turbo codes and LDPC codes, each claiming victory over the other in various use cases and applications. Both codes perform so wonderfully that it is quite reasonable to ask: Are we done in the channel coding space?
I don’t believe so, and the reason is simple: It is all about the use cases. Remember, each technology generation is driven by new use cases and new technical requirements. 2G was about voice and very low data rates. 3G and 4G were increasingly about the mobile internet and video. Turbo codes and LDPC codes have served perfectly up to this point and will very likely do so for a good while longer, but the requirements coming down the pipe for 5G are a lot more than just voice and video. These requirements are all over the use case map. Turbo and LDPC codes are unproven, or already known to fall short, in many of these new applications, opening the door once again to another surprise.
Enter Polar codes
Luckily, and consistent with the previous timeline of channel coding surprises and breakthrough achievements, some exciting research has once again emerged. Invented by Arikan in 2009, Polar codes are the first class of codes explicitly proven (not merely demonstrated or simulated in particular cases) to achieve channel capacity within implementable complexity. In other words, LDPC and Turbo codes are demonstrated to perform close to channel capacity in some scenarios, particularly those of interest to today's systems and their requirements; Polar codes, by contrast, guarantee the highest performance in any region of interest, in any application.
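The heart of a Polar code is surprisingly compact: the encoder repeatedly combines pairs of bits with Arikan's 2x2 kernel, which "polarizes" the synthesized channels into very good and very bad ones. A minimal sketch of that transform (the choice of which positions carry data below is purely illustrative; real codes pick the good positions per channel):

```python
def polar_transform(u):
    """Arikan's polar transform over GF(2): recursively combine the two
    halves with the kernel [[1,0],[1,1]]. len(u) must be a power of 2."""
    if len(u) == 1:
        return u[:]
    half = len(u) // 2
    # XOR-combine the halves, then transform each half recursively.
    top = polar_transform([a ^ b for a, b in zip(u[:half], u[half:])])
    bottom = polar_transform(u[half:])
    return top + bottom

# Toy length-4 code: "freeze" positions 0 and 1 to zero, put data in 2 and 3.
data = [1, 0]
codeword = polar_transform([0, 0] + data)
print(codeword)  # -> [1, 0, 1, 0]
```

Data bits ride the good synthesized channels while the bad ones are frozen to known values; the decoder walks back through the same structure, which is also where the latency discussed below creeps in.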
If there were no fundamental trade-offs in coding and overall system design, the story would end here. However, that is once again not the case (fortunately or unfortunately, depending on your angle of interest in this space). The stellar throughput and bit-error-rate performance of today's most practical Polar codes come at the expense of somewhat higher latency at the receiving end, due to the inherent nature of the code construction. Moreover, the complexity of generating Polar codes at the transmitter and decoding them at the receiver still looks beyond implementation capacity for the nearer term, even though they provide the best performance under the same complexity requirements.
The excitement around Polar codes is still fresh for many reasons. First of all, Polar codes were invented fairly recently, and the first round of research has focused on establishing their theoretical foundations, which demonstrate significant potential. This includes a new code construction framework and tools that may allow further research to bring these codes into the frame as a true candidate for beyond-4G (maybe 5G) channel codes.
Moreover, the practical implementation phase of Polar codes is just beginning, and it will give us the final word on the realistic performance of these codes, as it did for Turbo codes and LDPC codes before them.
Only time (and lots of hard work) will tell whether Polar codes will establish themselves as the 5G code de la génération. Regardless, this innovation suggests we are at the cusp of a little renaissance period in channel coding, stimulated by how dramatically 5G is shifting the requirements goalposts. This opens up whole new possibilities for innovation, not just in channel coding but in many other areas, too. Innovation in the wireless industry has never been more alive.
This article is published as part of the IDG Contributor Network.