20 research outputs found

    An Optimum Symbol-by-Symbol Decoding Rule for Linear Codes

    A decoding rule is presented which minimizes the probability of symbol error over a time-discrete memoryless channel for any linear error-correcting code when the code words are equiprobable. The complexity of this rule varies inversely with code rate, making the technique particularly attractive for high-rate codes. Examples are given for both block and convolutional codes.
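
    As a minimal illustration of what such a symbol-by-symbol rule computes (not of the paper's method, which works through the dual code to stay cheap at high rates), the Python sketch below makes the bitwise MAP decision for the (7,4) Hamming code over a binary symmetric channel by brute-force enumeration of the codebook; the code, channel, and crossover probability are assumptions chosen for illustration.

    import numpy as np
    from itertools import product

    # Generator matrix of the (7,4) Hamming code in systematic form.
    G = np.array([[1, 0, 0, 0, 1, 1, 0],
                  [0, 1, 0, 0, 0, 1, 1],
                  [0, 0, 1, 0, 1, 1, 1],
                  [0, 0, 0, 1, 1, 0, 1]])

    # All 2^k codewords, assumed equiprobable as in the abstract.
    codebook = np.array([(np.array(m) @ G) % 2 for m in product([0, 1], repeat=4)])

    def bitwise_map_decode(r, p):
        """Symbol-by-symbol MAP decision over a BSC with crossover probability p.

        For each position i, sum the channel likelihood P(r | c) over all
        codewords with c_i = 1 and with c_i = 0, and pick the larger mass.
        This brute-force version enumerates the code itself; the paper's rule
        reaches the same decision far more cheaply for high-rate codes.
        """
        dist = (codebook != r).sum(axis=1)                  # Hamming distances to r
        lik = (p ** dist) * ((1 - p) ** (len(r) - dist))    # P(r | c) per codeword
        mass1 = codebook.T @ lik                            # posterior mass on c_i = 1
        mass0 = lik.sum() - mass1                           # posterior mass on c_i = 0
        return (mass1 > mass0).astype(int)

    # One channel error in position 2 is corrected symbol by symbol.
    r = np.array([0, 0, 1, 0, 0, 0, 0])
    print(bitwise_map_decode(r, p=0.05))    # -> [0 0 0 0 0 0 0]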

    Fourier Domain Decoding Algorithm of Non-Binary LDPC codes for Parallel Implementation

    For decoding non-binary low-density parity-check (LDPC) codes, logarithm-domain sum-product (Log-SP) algorithms were proposed to reduce the quantization effects of the SP algorithm used in conjunction with the FFT. Since the FFT is not applicable in the logarithm domain, the computations required at the check nodes of Log-SP algorithms are intensive. Worse still, check nodes usually have higher degree than variable nodes. As a result, most of the decoding time is spent on check-node computations, which leads to a bottleneck effect. In this paper, we propose a Log-SP algorithm in the Fourier domain. With this algorithm, the roles of variable nodes and check nodes are switched. The intensive computations are spread over the lower-degree variable nodes, which can be calculated efficiently in parallel. Furthermore, we develop a fast calculation method for the estimated bits and syndromes in the Fourier domain. Comment: To appear in IEICE Trans. Fundamentals, vol. E93-A, no. 11, November 201
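
    To make the role of the Fourier transform concrete, the sketch below shows the standard probability-domain FFT check-node update for a code over GF(2^m), where the transform is the m-dimensional Walsh-Hadamard transform and the convolution enforced by a check node becomes a pointwise product; it is an illustration of that general mechanism, not the paper's Log-SP-in-the-Fourier-domain algorithm, and the GF(4) messages are made-up values.

    import numpy as np

    def wht(p):
        """Walsh-Hadamard transform of a pmf indexed by the m-bit labels of
        GF(2^m).  Over the field's additive group this is the Fourier
        transform: convolution of pmfs becomes a pointwise product."""
        p = np.array(p, dtype=float)
        h = 1
        while h < p.size:
            for i in range(0, p.size, 2 * h):
                for j in range(i, i + h):
                    a, b = p[j], p[j + h]
                    p[j], p[j + h] = a + b, a - b
            h *= 2
        return p

    def check_node_update(incoming):
        """Outgoing pmf of the remaining edge at a check node that enforces
        x1 + x2 + ... + xd = 0 over GF(2^m): transform each incoming message,
        multiply pointwise, transform back (the inverse WHT is WHT / q)."""
        q = incoming[0].size
        prod = np.ones(q)
        for msg in incoming:
            prod *= wht(msg)
        out = wht(prod) / q
        return out / out.sum()

    # Example over GF(4): two incoming messages determine the third edge's pmf.
    m1 = np.array([0.7, 0.1, 0.1, 0.1])
    m2 = np.array([0.4, 0.4, 0.1, 0.1])
    print(check_node_update([m1, m2]))      # -> [0.34 0.34 0.16 0.16]

    The paper keeps this pointwise structure but works with logarithm-domain quantities and shifts the heavy computations onto the lower-degree variable nodes, as described in the abstract.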

    On APP Decoding

    In this paper we show that APP decoding for a linear code C is optimum not for C, but for a minimum-distance-2 code which contains C as a subcode, when the codewords of that code are transmitted with equal probability. However, APP decoding is shown to be asymptotically optimum for C at high SNR when C is a binary one-step orthogonalizable code with equiprobable codewords transmitted over the AWGN channel.

    An efficient combination between Berlekamp-Massey and Hartmann Rudolph algorithms to decode BCH codes

    In digital communication and storage systems, data are exchanged over a communication channel that is not completely reliable. Therefore, possible errors must be detected and corrected by adding redundant bits to the information data. Several algebraic and heuristic decoders have been designed to detect and correct errors. The Hartmann Rudolph (HR) algorithm decodes a sequence symbol by symbol. Since the HR algorithm has high complexity, we suggest using it only partially, together with the algebraic hard-decision Berlekamp-Massey (BM) decoder. In this work, we propose a concatenation of the Partial Hartmann Rudolph (PHR) algorithm and the Berlekamp-Massey decoder to decode BCH (Bose-Chaudhuri-Hocquenghem) codes. Very satisfying results are obtained. For example, we use only 0.54% of the dual space size for the BCH code (63,39,9) while maintaining very good decoding quality. To assess our results, we compare them with those of other decoders.
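
    A rough sketch of the Hartmann Rudolph stage is given below, on the small (7,4) Hamming code rather than the BCH(63,39,9) code used in the paper; the function name phr_decode and the num_dual parameter are illustrative. With the full dual code the rule gives the optimum symbol-by-symbol decision; keeping only a fraction of the dual codewords gives the partial (PHR) variant, whose output word would then be handed to a Berlekamp-Massey decoder in the proposed concatenation (the BM stage is not implemented here).

    import numpy as np
    from itertools import product

    # Parity-check matrix of the (7,4) Hamming code; its row space is the dual code.
    H = np.array([[1, 0, 1, 1, 1, 0, 0],
                  [1, 1, 1, 0, 0, 1, 0],
                  [0, 1, 1, 1, 0, 0, 1]])
    dual = np.array([(np.array(m) @ H) % 2 for m in product([0, 1], repeat=3)])

    def phr_decode(r, p, num_dual=None):
        """Hartmann-Rudolph symbol-by-symbol decision over a BSC.

        With phi_i = (P(r_i|0) - P(r_i|1)) / (P(r_i|0) + P(r_i|1)), the rule sums,
        for each position m and over dual codewords b, the product of phi_i
        wherever b_i XOR delta_im = 1, and decides c_m = 1 when the sum is
        negative.  Using only the first num_dual dual codewords gives a
        partial (PHR) decoder.
        """
        phi = (1 - 2 * p) * (-1.0) ** r              # per-symbol channel reliability
        cw = dual if num_dual is None else dual[:num_dual]
        decided = np.zeros(len(r), dtype=int)
        for m in range(len(r)):
            exp = cw.copy()
            exp[:, m] ^= 1                           # b_i XOR delta_im
            total = np.sum(np.prod(np.where(exp == 1, phi, 1.0), axis=1))
            decided[m] = 1 if total < 0 else 0
        return decided

    # A single flipped bit is cleaned up; in the proposed scheme this output
    # would then be passed to a Berlekamp-Massey hard-decision decoder.
    r = np.array([1, 0, 0, 0, 0, 0, 0])
    print(phr_decode(r, p=0.05))                     # -> [0 0 0 0 0 0 0]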

    Stop criteria for retransmission termination in soft-combining algorithms, Journal of Telecommunications and Information Technology, 2001, no. 3

    Soft-combining algorithms use retransmissions of the same codeword to improve the reliability of communication over very noisy channels. In this paper, soft outputs from a maximum a posteriori (MAP) decoder are used as a priori information for the decoding of retransmitted codewords. Since not all received words need the same number of retransmissions to achieve satisfactory reliability, a stop criterion to terminate the retransmissions needs to be identified. As a first and very simple stop criterion, we propose an algorithm which uses the sign of the soft output of the MAP decoder. The performance obtained with this stop criterion is compared with that of a genius observer, which identifies otherwise undetectable errors. Since this technique always requires a certain number of initial retransmissions, we exploit the cross-entropy between subsequent retransmissions as a more advanced but still simple stop criterion. Simulation results show that a significant performance improvement can be gained with soft-combining techniques compared to simple hard- or soft-decision decoding. They also show that the examined stop criteria perform very close to the optimistic case of a genius observer.
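
    The two kinds of stop test mentioned above can be pictured with the small sketch below; it only illustrates the general idea, since the exact decision statistics and thresholds of the paper are not given in the abstract. The function names, the LLR convention, and the threshold value are assumptions.

    import numpy as np

    def sign_criterion(llr_curr, llr_prev):
        """Simplest test: stop requesting retransmissions once the hard
        decisions sign(L) no longer change between two consecutive rounds."""
        return np.array_equal(np.sign(llr_curr), np.sign(llr_prev))

    def cross_entropy_criterion(llr_curr, llr_prev, threshold=0.01):
        """Refined test: stop once the cross-entropy between the bitwise
        posteriors of two consecutive rounds drops below a small per-bit
        threshold, i.e. the soft outputs have stopped moving and have
        become confident."""
        p = 1.0 / (1.0 + np.exp(llr_prev))                             # P(bit = 1), previous round
        q = np.clip(1.0 / (1.0 + np.exp(llr_curr)), 1e-12, 1 - 1e-12)  # P(bit = 1), current round
        ce = -np.mean(p * np.log(q) + (1 - p) * np.log(1 - q))
        return ce < threshold

    # Soft outputs (LLRs) of a 4-bit block after two successive retransmissions.
    prev = np.array([2.1, -1.8, 0.4, 3.0])
    curr = np.array([2.3, -2.2, 0.9, 3.2])
    print(sign_criterion(curr, prev), cross_entropy_criterion(curr, prev))   # -> True False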

    Soft decoding techniques for codes and lattices, including the Golay code and the Leech lattice
