The invention of polar codes by Arıkan is a major breakthrough in coding theory. Implementing polar decoding algorithms efficiently in order to recover the transmitted information is, however, a major challenge, and several polar decoder architectures have been proposed in the literature. All of these architectures focus on reducing the computational hardware complexity and increasing the throughput of polar decoders. However, the memory requirements remain a limiting implementation factor that has not been fully addressed yet. This paper proposes a novel method to redesign existing decoder architectures so that they use less memory at the cost of some extra computational logic. The main idea is to replace the memory sections that store intermediate results with computational logic. The method, applied to an existing decoder $\mathcal{D}$, results in what is called a mixed decoder architecture based on $\mathcal{D}$, denoted $M(\mathcal{D})$. Since previous decoders are based on the semi-parallel decoder architecture, we first apply the memory reduction technique to a semi-parallel decoder. Analyses, together with logic synthesis results, show that the gains brought by the reduction in memory area are well worth the induced extra computational logic area. We show that the memory reduction technique can increase the speed/area ratio by 25% when implemented in standard cell technology (ST 65 nm). We also provide insights on the potential gain that this method would bring to state-of-the-art decoders implemented on FPGA devices. For example, it is shown that the proposed method can lower the decoder memory requirements by 50% while using less than 20% of the FPGA logic elements and incurring a latency penalty of less than 5%.
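To make the memory-for-logic trade-off concrete, below is a minimal software sketch, not the paper's hardware architecture: it contrasts keeping every intermediate LLR stage of a successive-cancellation-style recursion in memory with re-deriving a stage from the channel LLRs on demand. The min-sum `f` update, the stage indexing, and all function names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def f(a, b):
    """Min-sum approximation of the polar 'f' (check-node) LLR update."""
    return np.sign(a) * np.sign(b) * np.minimum(np.abs(a), np.abs(b))

def llr_stages_stored(channel_llrs):
    """Memory-heavy variant: keep every intermediate LLR stage.

    This mimics a decoder that dedicates memory to intermediate results
    so that later decoding steps can simply read them back.
    """
    stages = [np.asarray(channel_llrs, dtype=float)]
    while len(stages[-1]) > 1:
        cur = stages[-1]
        half = len(cur) // 2
        stages.append(f(cur[:half], cur[half:]))
    return stages

def llr_stage_recomputed(channel_llrs, stage):
    """Memory-light variant: re-derive one stage from the channel LLRs on demand.

    Nothing is cached, so each query costs extra computation: the
    memory-for-logic trade described in the abstract.
    """
    cur = np.asarray(channel_llrs, dtype=float)
    for _ in range(stage):
        half = len(cur) // 2
        cur = f(cur[:half], cur[half:])
    return cur

if __name__ == "__main__":
    # Toy channel LLRs for a block of length 8 (values are arbitrary).
    y = np.array([0.9, -1.2, 0.4, 2.1, -0.3, 1.7, -0.8, 0.6])
    stored = llr_stages_stored(y)
    # Recomputation reproduces the stored stage without holding it in memory.
    assert np.allclose(stored[2], llr_stage_recomputed(y, 2))
```

A hardware decoder faces the same choice in different terms: registers or RAM holding intermediate LLRs versus combinational logic that regenerates them, which is the trade-off the proposed mixed architecture $M(\mathcal{D})$ exploits.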