CRISP: Curriculum based Sequential Neural Decoders for Polar Code Family
Polar codes are widely used state-of-the-art codes for reliable communication
that have recently been included in the 5th generation wireless standards (5G).
However, there remains room for the design of polar decoders that are both
efficient and reliable in the short blocklength regime. Motivated by recent
successes of data-driven channel decoders, we introduce a novel
CuRrIculum based Sequential neural decoder for
Polar codes (CRISP). We design a principled curriculum, guided by
information-theoretic insights, to train CRISP and show that it outperforms the
successive-cancellation (SC) decoder and attains near-optimal reliability
performance on the Polar(32,16) and Polar(64,22) codes. The choice of the
proposed curriculum is critical in achieving the accuracy gains of CRISP, as we
show by comparing against other curricula. More notably, CRISP can be readily
extended to Polarization-Adjusted-Convolutional (PAC) codes, where existing SC
decoders are significantly less reliable. To the best of our knowledge, CRISP
constructs the first data-driven decoder for PAC codes and attains near-optimal
performance on the PAC(32,16) code. Comment: 23 pages, 23 figures. ICML 202
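The curriculum idea above can be sketched in miniature. The snippet below is an illustrative assumption, not the paper's actual schedule: it assumes a left-to-right bit curriculum in which the training loss initially covers only a prefix of the message bits and the prefix grows stage by stage; the functions `curriculum_stages` and `masked_bce` are hypothetical names introduced here.

```python
# Hypothetical sketch of curriculum-based training for a neural decoder.
# Assumption: a prefix curriculum over message bits (an illustration only;
# the paper derives its curriculum from information-theoretic insights).
import numpy as np

def curriculum_stages(k, num_stages):
    """Number of message bits included in the loss at each stage."""
    return [int(np.ceil(k * (s + 1) / num_stages)) for s in range(num_stages)]

def masked_bce(logits, bits, num_active):
    """Binary cross-entropy restricted to the first num_active bit positions."""
    p = 1.0 / (1.0 + np.exp(-logits[:num_active]))   # sigmoid over active bits
    b = bits[:num_active]
    return float(-np.mean(b * np.log(p + 1e-12) + (1 - b) * np.log(1 - p + 1e-12)))
```

For a Polar(32,16) code and four stages, such a schedule would train on 4, 8, 12, and finally all 16 information bits.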
List Autoencoder: Towards Deep Learning Based Reliable Transmission Over Noisy Channels
In this paper, we present list autoencoder (listAE) to mimic list decoding
used in classical coding theory. With listAE, the decoder network outputs a
list of decoded message word candidates. To train the listAE, a genie is
assumed to be available at the output of the decoder. A specific loss function
is proposed to optimize the performance of genie-aided (GA) list decoding.
The listAE is a general framework and can be used with any AE architecture. We
propose a specific architecture, referred to as incremental-redundancy AE
(IR-AE), which decodes the received word on a sequence of component codes with
non-increasing rates. Then, the listAE is trained and evaluated with both IR-AE
and Turbo-AE. Finally, we employ cyclic redundancy check (CRC) codes to replace
the genie at the decoder output and obtain a CRC aided (CA) list decoder. Our
simulation results show that IR-AE under CA list decoding achieves a
meaningful coding gain over Turbo-AE and polar codes at low block error
rates. Comment: 6 pages with references and 7 figures
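The CRC-aided selection step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes each candidate message word carries an appended CRC, and it uses CRC-32 from Python's standard library purely for concreteness (the paper's CRC length and polynomial are not given here); `crc_ok` and `ca_list_decode` are hypothetical names.

```python
# Minimal sketch of CRC-aided (CA) list selection, replacing the genie:
# among the decoder's candidate words, return the first whose appended
# CRC checks out; return None if none pass (a detected decoding failure).
# CRC-32 is an illustrative choice, not the paper's stated CRC.
import zlib

def crc_ok(candidate: bytes) -> bool:
    """Candidate = payload || 4-byte big-endian CRC-32 of the payload."""
    payload, tag = candidate[:-4], candidate[-4:]
    return zlib.crc32(payload).to_bytes(4, "big") == tag

def ca_list_decode(candidates):
    """Return the first CRC-passing candidate, or None on failure."""
    for c in candidates:
        if crc_ok(c):
            return c
    return None
```

In a genie-aided evaluation, the genie would instead directly flag the correct candidate; the CRC trades a few bits of rate for a practical stand-in for that oracle.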