4,298 research outputs found

    Capacity-achieving ensembles for the binary erasure channel with bounded complexity

    We present two sequences of ensembles of non-systematic irregular repeat-accumulate codes which asymptotically (as their block length tends to infinity) achieve capacity on the binary erasure channel (BEC) with bounded complexity per information bit. This is in contrast to all previous constructions of capacity-achieving sequences of ensembles, whose complexity grows at least like the log of the inverse of the gap (in rate) to capacity. The new bounded-complexity result is achieved by puncturing bits, thereby allowing a sufficient number of state nodes in the Tanner graph representing the codes. We also derive an information-theoretic lower bound on the decoding complexity of randomly punctured codes on graphs. The bound holds for every memoryless binary-input output-symmetric channel and is refined for the BEC. Comment: 47 pages, 9 figures. Submitted to IEEE Transactions on Information Theory.
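    To make the complexity contrast concrete, here is a small numerical sketch (assumed parameters, not code from the paper): on a BEC with erasure probability eps, the capacity is C = 1 − eps, and log(1/gap) is how the per-bit complexity of earlier constructions grows as the rate R approaches C, whereas the ensembles above keep it bounded.

        # Illustrative sketch only: growth of log(1/gap) as the rate
        # approaches BEC capacity.  eps is an assumed channel parameter.
        import math

        eps = 0.5               # erasure probability of the BEC (assumed)
        C = 1.0 - eps           # BEC capacity in bits per channel use
        for gap in (1e-1, 1e-2, 1e-3, 1e-4):
            R = C - gap         # code rate at this gap to capacity
            print(f"R = {R:.4f}, gap = {gap:.0e}, "
                  f"log(1/gap) = {math.log(1 / gap):.2f}")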

    Capacity-Achieving Ensembles of Accumulate-Repeat-Accumulate Codes for the Erasure Channel with Bounded Complexity

    The paper introduces ensembles of accumulate-repeat-accumulate (ARA) codes which asymptotically achieve capacity on the binary erasure channel (BEC) with bounded complexity, per information bit, of encoding and decoding. It also introduces symmetry properties which play a central role in the construction of capacity-achieving ensembles for the BEC with bounded complexity. The results here improve on the tradeoff between performance and complexity provided by previous constructions of capacity-achieving ensembles of codes defined on graphs. The superiority of ARA codes with moderate to large block length is exemplified by computer simulations which compare their performance with that of previously reported capacity-achieving ensembles of LDPC and IRA codes. The ARA codes also have the advantage of being systematic. Comment: Submitted to IEEE Trans. on Information Theory, December 1st, 2005. Includes 50 pages and 13 figures.
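    For readers unfamiliar with the building blocks, the "accumulate" stage of RA/ARA codes is a running XOR, i.e., a rate-1 1/(1+D) convolutional encoder. The following is a minimal sketch of a toy systematic RA encoder, assuming a repetition factor q and a random interleaver; it does not reproduce the paper's ARA degree profiles or puncturing patterns.

        # Toy systematic repeat-accumulate encoder (illustrative sketch):
        # repeat each information bit q times, interleave, then accumulate.
        import random

        def accumulate(bits):
            """Running XOR: out[i] = bits[0] ^ ... ^ bits[i]."""
            out, acc = [], 0
            for b in bits:
                acc ^= b
                out.append(acc)
            return out

        def ra_encode(info_bits, q=3, seed=0):
            repeated = [b for b in info_bits for _ in range(q)]  # repetition
            rng = random.Random(seed)
            idx = list(range(len(repeated)))
            rng.shuffle(idx)                                     # interleaver
            parity = accumulate([repeated[i] for i in idx])      # accumulator
            return info_bits + parity                            # systematic

        print(ra_encode([1, 0, 1, 1]))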

    New Combinatorial Construction Techniques for Low-Density Parity-Check Codes and Systematic Repeat-Accumulate Codes

    This paper presents several new construction techniques for low-density parity-check (LDPC) and systematic repeat-accumulate (RA) codes. Based on specific classes of combinatorial designs, the improved code design focuses on high-rate structured codes with constant column weights of 3 and higher. The proposed codes are efficiently encodable and exhibit good structural properties. Experimental results on decoding performance with the sum-product algorithm show that the new codes offer substantial potential for practical use, for instance, in high-speed applications in magnetic recording and optical communications channels. Comment: 10 pages; to appear in "IEEE Transactions on Communications".
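    The paper derives its parity-check matrices from combinatorial designs; as a stand-in illustration of what a constant column weight of 3 means, the sketch below uses a generic Gallager-style random construction instead, a deliberately simpler technique than the one proposed in the paper.

        # Illustrative sketch, NOT the paper's design-based construction:
        # an m x n binary parity-check matrix H with exactly col_weight
        # ones in every column (row weights are left uneven here).
        import random

        def random_ldpc_H(n, m, col_weight=3, seed=0):
            rng = random.Random(seed)
            H = [[0] * n for _ in range(m)]
            for col in range(n):
                for row in rng.sample(range(m), col_weight):
                    H[row][col] = 1
            return H

        for row in random_ldpc_H(n=12, m=6):
            print("".join(map(str, row)))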

    Fingerprinting with Minimum Distance Decoding

    This work adopts an information-theoretic framework for the design of collusion-resistant coding/decoding schemes for digital fingerprinting. More specifically, the minimum distance decision rule is used to identify 1 out of t pirates. Achievable rates, under this detection rule, are characterized in two distinct scenarios. First, we consider the averaging attack, where a random coding argument is used to show that the rate 1/2 is achievable with t = 2 pirates. Our study is then extended to the general case of arbitrary t, highlighting the underlying complexity-performance tradeoff. Overall, these results establish the significant performance gains offered by minimum distance decoding as compared to other approaches based on orthogonal codes and correlation detectors. In the second scenario, we characterize the achievable rates, with minimum distance decoding, under any collusion attack that satisfies the marking assumption. For t = 2 pirates, we show that the rate 1 − H(0.25) ≈ 0.188 is achievable using an ensemble of random linear codes. For t ≥ 3, the existence of a non-resolvable collusion attack, with minimum distance decoding, for any non-zero rate is established. Inspired by our theoretical analysis, we then construct coding/decoding schemes for fingerprinting based on the celebrated belief-propagation framework. Using an explicit repeat-accumulate code, we obtain a vanishingly small probability of misidentification at rate 1/3 under the averaging attack with t = 2. For collusion attacks which satisfy the marking assumption, we use a more sophisticated accumulate-repeat-accumulate code to obtain a vanishingly small misidentification probability at rate 1/9 with t = 2. These results represent a marked improvement over the best available designs in the literature. Comment: 26 pages, 6 figures, submitted to IEEE Transactions on Information Forensics and Security.
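    The quoted rate for t = 2 can be checked directly from the binary entropy function H(p) = −p·log2(p) − (1 − p)·log2(1 − p):

        # Verifying the quoted achievable rate 1 - H(0.25) ≈ 0.188.
        import math

        def binary_entropy(p):
            return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

        print(1 - binary_entropy(0.25))   # ≈ 0.18872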