6 research outputs found

    Quantized Guessing Random Additive Noise Decoding

    We introduce a soft-detection variant of Guessing Random Additive Noise Decoding (GRAND) called Quantized GRAND (QGRAND) that can efficiently decode any moderate-redundancy block code of any length, using an algorithm suited to highly parallelized implementation in hardware. QGRAND can make use of any level of quantized soft information, is established to be almost capacity-achieving, and is shown to provide near-maximum-likelihood decoding performance when given five or more bits of soft information per received bit.
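
    The core GRAND mechanism this abstract builds on is a guessing loop: test putative noise patterns in decreasing order of likelihood until a codebook member is found. The Python sketch below illustrates that loop with quantized soft information; the function name, the nonnegative quantized reliabilities rel_q (smaller = less reliable), and the max_flips query budget are all illustrative assumptions, not taken from the paper.

        import itertools
        import numpy as np

        def qgrand_style_decode(y, rel_q, H, max_flips=3):
            """Hedged sketch of a GRAND-style loop with quantized soft
            information: flip the least reliable bits first, which
            approximates maximum-likelihood order, and stop at the first
            vector whose syndrome under parity-check matrix H is zero."""
            n = len(y)
            candidates = [()]  # the all-zero noise guess is tried first
            for w in range(1, max_flips + 1):
                candidates += list(itertools.combinations(range(n), w))
            # Rank guesses by the summed quantized reliability of flipped bits.
            candidates.sort(key=lambda s: sum(rel_q[i] for i in s))
            for support in candidates:
                guess = y.copy()
                for i in support:
                    guess[i] ^= 1              # apply the hypothesised noise
                if not (H @ guess % 2).any():  # zero syndrome => codeword
                    return guess
            return None  # abandonment: query budget exhausted

    Because each guess is checked independently, the loop parallelizes naturally, which is consistent with the hardware-oriented framing in the abstract.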

    Segmented GRAND: Combining Sub-patterns in Near-ML Order

    The recently introduced maximum-likelihood (ML) decoding scheme called guessing random additive noise decoding (GRAND) has demonstrated remarkably low time complexity in high signal-to-noise ratio (SNR) regimes. However, the complexity is not as low in low-SNR regimes and at low code rates. To mitigate this concern, we propose a scheme for a near-ML variant of GRAND called ordered reliability bits GRAND (ORBGRAND), which divides codewords into segments based on the properties of the underlying code, generates sub-patterns for each segment consistent with the syndrome (thus reducing the number of inconsistent error patterns generated), and combines them in near-ML order using two-level integer partitions of logistic weight. Numerical evaluation demonstrates that the proposed scheme, called segmented ORBGRAND, significantly reduces the average number of queries in every SNR regime. Moreover, segmented ORBGRAND with abandonment also improves the error-correction performance.
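
    The logistic weight mentioned here is the sum of the reliability ranks of the flipped bits, and the ORBGRAND query order can be generated from integer partitions into distinct parts. The Python sketch below shows this base (unsegmented) schedule; function names are illustrative, and the paper's two-level segmented combination is not reproduced.

        def distinct_partitions(total, max_part):
            """Partitions of `total` into distinct parts, each <= max_part.
            Each tuple is a set of reliability ranks (1 = least reliable)."""
            if total == 0:
                yield ()
                return
            for part in range(min(total, max_part), 0, -1):
                for rest in distinct_partitions(total - part, part - 1):
                    yield (part,) + rest

        def orbgrand_schedule(n, max_lw):
            """Emit error-pattern supports, expressed as reliability ranks,
            in increasing logistic weight (the sum of the flipped ranks).
            Full coverage needs max_lw up to n * (n + 1) // 2."""
            for lw in range(max_lw + 1):
                for ranks in distinct_partitions(lw, n):
                    yield ranks

    To turn ranks into bit positions, sort the received bits by ascending reliability and map rank r to the r-th least reliable position; the segmented variant additionally restricts sub-patterns to those consistent with each segment's syndrome before combining them.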

    Iterative Soft-Input Soft-Output Decoding with Ordered Reliability Bits GRAND

    Guessing Random Additive Noise Decoding (GRAND) is a universal decoding algorithm that can be used to perform maximum-likelihood decoding. It attempts to find the errors introduced by the channel by generating a sequence of possible error vectors in order of likelihood of occurrence and applying them to the received vector. Ordered reliability bits GRAND (ORBGRAND) integrates soft information received from the channel to refine the error-vector sequence. In this work, ORBGRAND is modified to produce a soft output, enabling its use as an iterative soft-input soft-output (SISO) decoder. Three techniques specific to iterative GRAND-based decoding are then proposed to improve the error-correction performance and decrease computational complexity and latency. Using the OFEC code as a case study, the proposed techniques are evaluated, yielding a substantial performance gain and a complexity reduction of 48% to 85% with respect to the baseline SISO ORBGRAND.
    Comment: Submitted to Globecom 202
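
    One generic way to obtain a soft output from a GRAND-type decoder is a max-log rule over the list of candidate codewords encountered during guessing. The sketch below is an assumed illustration of that principle, not the specific soft-output rule proposed in the paper; the names max_log_soft_output and sat are hypothetical.

        import numpy as np

        def max_log_soft_output(candidates, llr_in, sat=20.0):
            """Hypothetical max-log list soft output: the per-bit output LLR
            is formed from the best noise metric among candidates with
            bit = 0 versus bit = 1, saturating when the list is unanimous."""
            hard = (llr_in < 0).astype(int)   # hard decisions from channel LLRs
            mags = np.abs(llr_in)
            # Metric of a candidate: summed reliability of the bits it flips
            # relative to the hard decision (smaller = more likely).
            metrics = [mags[c != hard].sum() for c in candidates]
            out = np.empty(len(llr_in))
            for i in range(len(llr_in)):
                m0 = min((m for c, m in zip(candidates, metrics) if c[i] == 0),
                         default=None)
                m1 = min((m for c, m in zip(candidates, metrics) if c[i] == 1),
                         default=None)
                if m0 is None:        # list is unanimous that bit i is 1
                    out[i] = -sat
                elif m1 is None:      # list is unanimous that bit i is 0
                    out[i] = sat
                else:
                    out[i] = m1 - m0  # max-log LLR approximation
            return out

    Feeding such output LLRs back as the next iteration's input is what makes an iterative SISO arrangement like the one studied here possible.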

    Guessing random additive noise decoding with soft detection symbol reliability information - SGRAND

    We recently introduced a noise-centric algorithm, Guessing Random Additive Noise Decoding (GRAND), that identifies a Maximum Likelihood (ML) decoding for arbitrary codebooks. GRAND has the unusual property that its complexity decreases as the codebook rate increases. Here we provide an extension of GRAND, soft GRAND (SGRAND), that incorporates soft-detection symbol reliability information and identifies an ML decoding in that context. In particular, we assume symbols received from the channel are declared either to be error free or to have been potentially subject to additive noise. SGRAND inherits desirable properties of GRAND, including being capacity-achieving when used with random codebooks and having a complexity that decreases as the code rate increases.
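
    Under the symbol-level model in this abstract, where each received symbol is either declared error free or flagged as possibly noisy, the guesswork can be confined to the flagged positions. The sketch below is one simple reading of that idea for binary symbols, with the flags in a boolean mask named suspect; it is not the paper's exact algorithm.

        import itertools
        import numpy as np

        def sgrand_masked_decode(y, suspect, H):
            """Sketch: only bits flagged in `suspect` may carry additive
            noise, so noise guesses are confined to those positions and
            tried in increasing Hamming weight; bits declared error free
            are never flipped. Codebook membership is tested via the
            parity-check matrix H."""
            positions = [i for i, flagged in enumerate(suspect) if flagged]
            for w in range(len(positions) + 1):
                for support in itertools.combinations(positions, w):
                    guess = y.copy()
                    for i in support:
                        guess[i] ^= 1
                    if not (H @ guess % 2).any():  # zero syndrome => codeword
                        return guess
            return None  # no codeword reachable by flipping only suspect bits

    As the code rate grows, fewer noise patterns need to be checked before a codeword is hit, which matches the inherited complexity property noted in the abstract.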