
    Coding theorems for turbo code ensembles

    This paper is devoted to a Shannon-theoretic study of turbo codes. We prove that ensembles of parallel and serial turbo codes are "good" in the following sense. For a turbo code ensemble defined by a fixed set of component codes (subject only to mild necessary restrictions), there exists a positive number $\gamma_0$ such that for any binary-input memoryless channel whose Bhattacharyya noise parameter is less than $\gamma_0$, the average maximum-likelihood (ML) decoder block error probability approaches zero at least as fast as $n^{-\beta}$, where $\beta$ is the "interleaver gain" exponent defined by Benedetto et al. in 1996.
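    For context, the Bhattacharyya noise parameter of a binary-input memoryless channel $W$ is $\gamma = \sum_y \sqrt{W(y|0)\,W(y|1)}$. The sketch below is my own illustration of evaluating $\gamma$ for two standard channels; the threshold gamma_0 = 0.5 is a made-up value, since the paper only proves that some positive threshold exists for each ensemble:

        import math

        def bhattacharyya_bsc(p):
            """Bhattacharyya parameter of a BSC with crossover probability p:
            gamma = 2*sqrt(p*(1-p))."""
            return 2.0 * math.sqrt(p * (1.0 - p))

        def bhattacharyya_biawgn(sigma):
            """Bhattacharyya parameter of a binary-input AWGN channel with +/-1
            signalling and noise variance sigma^2: gamma = exp(-1/(2*sigma^2))."""
            return math.exp(-1.0 / (2.0 * sigma ** 2))

        if __name__ == "__main__":
            gamma_0 = 0.5  # illustrative threshold only, not a value from the paper
            for p in (0.01, 0.05, 0.11):
                g = bhattacharyya_bsc(p)
                print(f"BSC(p={p}): gamma = {g:.4f}, below threshold: {g < gamma_0}")
            for sigma in (0.5, 0.8, 1.0):
                g = bhattacharyya_biawgn(sigma)
                print(f"BI-AWGN(sigma={sigma}): gamma = {g:.4f}, below threshold: {g < gamma_0}")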

    Mathematical Programming Decoding of Binary Linear Codes: Theory and Algorithms

    Mathematical programming is a branch of applied mathematics and has recently been used to derive new decoding approaches, challenging established but often heuristic algorithms based on iterative message passing. Concepts from mathematical programming used in the context of decoding include linear, integer, and nonlinear programming, network flows, and notions of duality, as well as matroid and polyhedral theory. This survey article reviews and categorizes decoding methods based on mathematical programming approaches for binary linear codes over binary-input memoryless symmetric channels. Comment: 17 pages, submitted to the IEEE Transactions on Information Theory. Published July 201
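    One approach in this line of work is Feldman's LP relaxation of ML decoding: minimize the LLR correlation over a relaxed code polytope described by the odd-subset ("forbidden set") inequalities of each parity check. The sketch below is my own minimal illustration for the (7,4) Hamming code on a BSC (the matrix H, the crossover probability, and the received word are chosen only for the example), using scipy:

        from itertools import combinations

        import numpy as np
        from scipy.optimize import linprog

        # Parity-check matrix of the (7,4) Hamming code (example choice).
        H = np.array([[1, 0, 1, 0, 1, 0, 1],
                      [0, 1, 1, 0, 0, 1, 1],
                      [0, 0, 0, 1, 1, 1, 1]])

        def lp_decode(llr, H):
            """Minimize sum_i llr[i] * x[i] over the relaxed polytope given by
            0 <= x_i <= 1 and, for every check j and every odd-sized subset S
            of its neighborhood N(j): sum over S of x_i minus the sum over the
            remaining positions of the check is at most |S| - 1."""
            n = H.shape[1]
            A_ub, b_ub = [], []
            for row in H:
                nbrs = np.flatnonzero(row)
                for size in range(1, len(nbrs) + 1, 2):
                    for S in combinations(nbrs, size):
                        a = np.zeros(n)
                        a[list(S)] = 1.0
                        a[[i for i in nbrs if i not in S]] = -1.0
                        A_ub.append(a)
                        b_ub.append(len(S) - 1)
            res = linprog(llr, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                          bounds=[(0, 1)] * n, method="highs")
            return res.x

        # All-zero codeword sent over a BSC(p); one received bit is flipped.
        p = 0.05
        y = np.zeros(7); y[2] = 1.0
        llr = np.where(y == 0, np.log((1 - p) / p), np.log(p / (1 - p)))
        print(np.round(lp_decode(llr, H), 3))  # expected to recover the all-zero ML codeword

    The number of inequalities grows exponentially in the check degree, which is one motivation for the cutting-plane, adaptive, and compact reformulations that work of this kind studies.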

    Minimum Pseudoweight Analysis of 3-Dimensional Turbo Codes

    In this work, we consider pseudocodewords of (relaxed) linear programming (LP) decoding of 3-dimensional turbo codes (3D-TCs). We present a relaxed LP decoder for 3D-TCs, adapting the relaxed LP decoder for conventional turbo codes proposed by Feldman in his thesis. We show that the 3D-TC polytope is proper and $C$-symmetric, and make a connection to finite graph covers of the 3D-TC factor graph. This connection is used to show that the support set of any pseudocodeword is a stopping set of iterative decoding of 3D-TCs using maximum a posteriori constituent decoders on the binary erasure channel. Furthermore, we compute ensemble-average pseudoweight enumerators of 3D-TCs and perform a finite-length minimum pseudoweight analysis for small cover degrees. Also, an explicit description of the fundamental cone of the 3D-TC polytope is given. Finally, we present an extensive numerical study of small-to-medium block length 3D-TCs, which shows that 1) typically (i.e., in most cases) when the minimum distance $d_{\rm min}$ and/or the stopping distance $h_{\rm min}$ is high, the minimum pseudoweight (on the additive white Gaussian noise channel) is strictly smaller than both $d_{\rm min}$ and $h_{\rm min}$, and 2) the minimum pseudoweight grows with the block length, at least for small-to-medium block lengths. Comment: To appear in IEEE Transactions on Communications
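    For reference, the pseudoweight on the AWGN channel used in such analyses is the standard one, $w_{\rm AWGN}(\omega) = (\sum_i \omega_i)^2 / \sum_i \omega_i^2$. The sketch below is my own; the cone-membership check uses the generic fundamental cone of a parity-check matrix, not the 3D-TC-specific description derived in the paper:

        import numpy as np

        def awgn_pseudoweight(omega):
            """AWGN pseudoweight of a nonzero pseudocodeword omega >= 0:
            (sum_i omega_i)^2 / sum_i omega_i^2. For a 0/1 codeword this
            equals its Hamming weight."""
            omega = np.asarray(omega, dtype=float)
            return omega.sum() ** 2 / np.square(omega).sum()

        def in_fundamental_cone(omega, H, tol=1e-9):
            """Generic fundamental-cone check for a parity-check matrix H:
            omega >= 0 and, for every check j and every position i of check j,
            omega_i is at most the sum of omega over the other positions of j."""
            omega = np.asarray(omega, dtype=float)
            if (omega < -tol).any():
                return False
            for row in np.asarray(H):
                nbrs = np.flatnonzero(row)
                total = omega[nbrs].sum()
                if (omega[nbrs] > total - omega[nbrs] + tol).any():
                    return False
            return True

        print(awgn_pseudoweight([1, 1, 1]))        # 3.0: a weight-3 codeword
        print(awgn_pseudoweight([2, 1, 1, 1, 1]))  # 4.5: below its support size of 5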

    Local Optimality Certificates for LP Decoding of Tanner Codes

    We present a new combinatorial characterization for local optimality of a codeword in an irregular Tanner code. The main novelty in this characterization is that it is based on a linear combination of subtrees in the computation trees. These subtrees may have any degree in the local code nodes and may have any height (even greater than the girth). We expect this new characterization to lead to improvements in bounds for successful decoding. We prove that local optimality in this new characterization implies ML-optimality and LP-optimality, as one would expect. Finally, we show that it is possible to efficiently compute a certificate for the local optimality of a codeword given an LLR vector.
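    To make the two optimality notions concrete: on a binary-input memoryless channel with LLR vector $\lambda$, where $\lambda_i = \log\frac{P(y_i \mid c_i = 0)}{P(y_i \mid c_i = 1)}$, ML decoding and LP decoding minimize the same linear cost, over the code $\mathcal{C}$ and over a relaxed polytope $\mathcal{P} \supseteq \operatorname{conv}(\mathcal{C})$ respectively (the notation here is mine, not the paper's):

        \hat{x}_{\mathrm{ML}} = \operatorname*{arg\,min}_{x \in \mathcal{C}} \sum_{i=1}^{n} \lambda_i x_i,
        \qquad
        \hat{x}_{\mathrm{LP}} = \operatorname*{arg\,min}_{x \in \mathcal{P}} \sum_{i=1}^{n} \lambda_i x_i .

    A local-optimality certificate for a codeword $x$ therefore establishes that $x$ solves both problems without running an ML or LP solver.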

    Diagnosis of weaknesses in modern error correction codes: a physics approach

    One of the main obstacles to the wider use of modern error-correction codes is that, due to the complex behavior of their decoding algorithms, no systematic method is known that would allow characterization of the Bit-Error-Rate (BER). This is especially true in the weak-noise regime where many systems operate and where coding performance is difficult to estimate because of the vanishingly small number of errors. We show how the instanton method of physics allows one to solve the problem of BER analysis in the weak-noise range by recasting it as a computationally tractable minimization problem. Comment: 9 pages, 8 figures
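    Schematically, for an AWGN channel with noise variance $\sigma^2$ the instanton is the minimum-norm noise realization that pushes the received word out of the decoder's decision region around the transmitted codeword $x_0$, and at weak noise it dominates the error probability (the notation below is mine and is stated only for the Gaussian case):

        n_{\mathrm{inst}} = \operatorname*{arg\,min}_{n \,:\, \mathrm{DEC}(x_0 + n) \neq x_0} \|n\|^2,
        \qquad
        \mathrm{BER} \asymp \exp\!\left(-\frac{\|n_{\mathrm{inst}}\|^2}{2\sigma^2}\right)

    up to a prefactor, so estimating the BER reduces to the constrained minimization above, which is the computationally tractable problem the authors recast it as.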