Decoder-in-the-Loop: Genetic Optimization-based LDPC Code Design
LDPC code design tools typically rely on asymptotic code behavior and are
affected by an unavoidable performance degradation due to model imperfections
in the short length regime. We propose an LDPC code design scheme based on an
evolutionary algorithm, the Genetic Algorithm (GenAlg), implementing a
"decoder-in-the-loop" concept. It inherently takes into consideration the
channel, code length and the number of iterations while optimizing the
error-rate of the actual decoder hardware architecture. We construct short
length LDPC codes (i.e., the parity-check matrix) with error-rate performance
comparable to, or even outperforming that of well-designed standardized short
length LDPC codes over both AWGN and Rayleigh fading channels. Our proposed
algorithm can be used to design LDPC codes with special graph structures (e.g.,
accumulator-based codes) to facilitate the encoding step, or to satisfy any
other practical requirement. Moreover, GenAlg can be used to design LDPC codes
with the aim of reducing decoding latency and complexity, leading to coding
gains of up to 0.325 dB and 0.8 dB at a BLER of 10⁻⁵ for both AWGN and
Rayleigh fading channels, respectively, when compared to state-of-the-art short
LDPC codes. Also, we analyze what can be learned from the resulting codes and,
as such, the GenAlg particularly highlights design paradigms of short length
LDPC codes (e.g., codes with degree-1 variable nodes obtain very good results).
Comment: in IEEE Access, 201
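The decoder-in-the-loop idea described above (evaluate each candidate parity-check matrix with the actual decoder and let an evolutionary loop keep the best candidates) can be sketched as follows. This is a minimal illustration only: the code sizes, the single-edge mutation rule, and the fitness proxy (counting 4-cycles in the Tanner graph instead of running a full error-rate simulation on the decoder hardware) are assumptions chosen to keep the example runnable, not the paper's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

N, M = 16, 8           # toy code length and number of checks (illustrative sizes)
POP, GENS = 20, 30     # population size and generation count (assumptions)

def random_H():
    # Random binary parity-check matrix with column weight 2.
    H = np.zeros((M, N), dtype=int)
    for col in range(N):
        H[rng.choice(M, size=2, replace=False), col] = 1
    return H

def fitness(H):
    # Cheap stand-in for the paper's decoder-in-the-loop error-rate
    # evaluation: penalize 4-cycles, i.e. pairs of check nodes that
    # share two or more variable nodes.
    overlap = H @ H.T
    np.fill_diagonal(overlap, 0)
    return -int(np.sum(overlap >= 2))

def mutate(H):
    # Flip one edge of the Tanner graph.
    H = H.copy()
    H[rng.integers(M), rng.integers(N)] ^= 1
    return H

population = [random_H() for _ in range(POP)]
init_best = max(fitness(H) for H in population)

for _ in range(GENS):
    population.sort(key=fitness, reverse=True)
    elite = population[: POP // 2]                 # truncation selection
    children = [mutate(elite[rng.integers(len(elite))])
                for _ in range(POP - len(elite))]
    population = elite + children                  # elitism keeps the best

final_best = max(fitness(H) for H in population)
print("fitness improved from", init_best, "to", final_best)
```

Because the elite half of each generation survives unchanged, the best fitness never decreases; in the paper this loop would instead rank candidates by the error rate measured on the target decoder, channel, and iteration budget.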
Boosting Learning for LDPC Codes to Improve the Error-Floor Performance
Low-density parity-check (LDPC) codes have been successfully commercialized
in communication systems due to their strong error correction capabilities and
simple decoding process. However, the error-floor phenomenon of LDPC codes, in
which the error rate stops decreasing rapidly at a certain level, presents
challenges for achieving extremely low error rates and deploying LDPC codes in
scenarios demanding ultra-high reliability. In this work, we propose training
methods for neural min-sum (NMS) decoders to eliminate the error-floor effect.
First, by leveraging the boosting learning technique of ensemble networks, we
divide the decoding network into two neural decoders and train the post decoder
to be specialized for uncorrected words that the first decoder fails to
correct. Secondly, to address the vanishing gradient issue in training, we
introduce a block-wise training schedule that locally trains a block of weights
while retraining the preceding block. Lastly, we show that assigning different
weights to unsatisfied check nodes effectively lowers the error-floor with a
minimal number of weights. By applying these training methods to standard LDPC
codes, we achieve the best error-floor performance compared to other decoding
methods. The proposed NMS decoder, optimized solely through novel training
methods without additional modules, can be integrated into existing LDPC
decoders without incurring extra hardware costs. The source code is available
at https://github.com/ghy1228/LDPC_Error_Floor.
Comment: 17 pages, 10 figures
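As a sketch of the weighted min-sum rule underlying NMS decoders (the building block whose weights the training methods above optimize), here is a single check-node update in which a trainable scalar scales the outgoing messages. The function name and the single-scalar parameterization are illustrative assumptions; the paper trains many such weights, including separate ones for the post decoder and for unsatisfied check nodes.

```python
import numpy as np

def nms_check_update(msgs, weight):
    """Weighted min-sum update at one check node (illustrative sketch).

    msgs:   incoming variable-to-check LLR messages (1-D array)
    weight: trainable scaling factor; in an NMS decoder this is learned,
            here it is just a fixed scalar for demonstration
    """
    signs = np.sign(msgs)
    abs_m = np.abs(msgs)
    out = np.empty_like(msgs)
    for i in range(len(msgs)):
        # Each outgoing message excludes the recipient's own input.
        sign = np.prod(np.delete(signs, i))
        out[i] = weight * sign * np.delete(abs_m, i).min()
    return out

# Example: three incoming messages, scaling weight 0.8.
out = nms_check_update(np.array([2.0, -1.0, 0.5]), 0.8)
print(out)  # → [-0.4  0.4 -0.8]
```

Training then amounts to tuning such weights by gradient descent on decoding loss; the block-wise schedule described above trains the weights of a block of iterations at a time to avoid vanishing gradients.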
Cellular, Wide-Area, and Non-Terrestrial IoT: A Survey on 5G Advances and the Road Towards 6G
The next wave of wireless technologies is proliferating in connecting things
among themselves as well as to humans. In the era of the Internet of things
(IoT), billions of sensors, machines, vehicles, drones, and robots will be
connected, making the world around us smarter. The IoT will encompass devices
that must wirelessly communicate a diverse set of data gathered from the
environment for myriad new applications. The ultimate goal is to extract
insights from this data and develop solutions that improve quality of life and
generate new revenue. Providing large-scale, long-lasting, reliable, and near
real-time connectivity is the major challenge in enabling a smart connected
world. This paper provides a comprehensive survey on existing and emerging
communication solutions for serving IoT applications in the context of
cellular, wide-area, as well as non-terrestrial networks. Specifically,
wireless technology enhancements for providing IoT access in fifth-generation
(5G) and beyond cellular networks, and communication networks over the
unlicensed spectrum are presented. Aligned with the main key performance
indicators of 5G and beyond 5G networks, we investigate solutions and standards
that enable energy efficiency, reliability, low latency, and scalability
(connection density) of current and future IoT networks. The solutions include
grant-free access and channel coding for short-packet communications,
non-orthogonal multiple access, and on-device intelligence. Further, a vision
of new paradigm shifts in communication networks in the 2030s is provided, and
the integration of the associated new technologies like artificial
intelligence, non-terrestrial networks, and new spectra is elaborated. Finally,
future research directions toward beyond-5G IoT networks are pointed out.
Comment: Submitted for review to IEEE CS&