Uncorrectable Errors of Weight Half the Minimum Distance for Binary Linear Codes
A lower bound on the number of uncorrectable errors of weight half the
minimum distance is derived for binary linear codes satisfying some condition.
The condition is satisfied by some primitive BCH codes, extended primitive BCH
codes, Reed-Muller codes, and random linear codes. The bound asymptotically
coincides with the corresponding upper bound for Reed-Muller codes and random
linear codes. By generalizing the idea of the lower bound, a lower bound on the
number of uncorrectable errors for weights larger than half the minimum
distance is also obtained, but the generalized lower bound is weak for large
weights. The monotone error structure and its related notions of larger halves and
trial sets, introduced by Helleseth, Kløve, and Levenshtein, are
mainly used to derive the bounds.
Comment: 5 pages, to appear in ISIT 200
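The counting problem behind this abstract can be made concrete on a small code. The sketch below is not the paper's method (which uses larger halves and trial sets); it is a brute-force illustration, assuming syndrome decoding with minimum-weight coset leaders and lexicographic tie-breaking, applied to the [8,4,4] extended Hamming code, whose minimum distance d = 4 makes "weight half the minimum distance" mean weight-2 errors.

```python
from itertools import product, combinations
import numpy as np

# Parity-check matrix of the [8,4,4] extended Hamming code.  The code is
# self-dual, so H is also a generator matrix.
H = np.array([[1, 1, 1, 1, 1, 1, 1, 1],
              [0, 0, 0, 0, 1, 1, 1, 1],
              [0, 0, 1, 1, 0, 0, 1, 1],
              [0, 1, 0, 1, 0, 1, 0, 1]])
n = H.shape[1]

# Group all 2^n vectors into cosets by syndrome and pick a minimum-weight
# coset leader in each coset (ties broken lexicographically).
cosets = {}
for bits in product((0, 1), repeat=n):
    s = tuple(H.dot(bits) % 2)
    cosets.setdefault(s, []).append(bits)
leaders = {s: min(v, key=lambda e: (sum(e), e)) for s, v in cosets.items()}

# Under syndrome (minimum-distance) decoding, an error pattern is corrected
# iff it equals the coset leader of its coset.  Count the weight-2 errors,
# i.e. weight d/2 = 2, that are and are not corrected.
w = 2
flags = []
for idx in combinations(range(n), w):
    e = tuple(1 if j in idx else 0 for j in range(n))
    flags.append(e == leaders[tuple(H.dot(e) % 2)])
num_correctable = sum(flags)
num_uncorrectable = len(flags) - num_correctable
print(num_correctable, num_uncorrectable)  # 7 correctable, 21 uncorrectable
```

Of the C(8,2) = 28 weight-2 errors, only one per nonzero syndrome with first bit 0 (7 cosets) can be the chosen leader, so most weight-2 errors are uncorrectable, the phenomenon the paper's lower bound quantifies in general.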
The weight enumerators for certain subcodes of the second order binary Reed-Muller codes
In this paper we obtain formulas for the number of codewords of each weight in several classes of subcodes of the second order Reed-Muller codes. Our formulas are derived from the following results: (i) the weight enumerator of the second order RM code, as given by Berlekamp-Sloane (1970), (ii) the MacWilliams-Pless identities, (iii) a new result we present here (Theorem 1), (iv) the Carlitz-Uchiyama (1957) bound, and (iv′) the BCH bound. The class of codes whose weight enumerators are determined includes subclasses whose weight enumerators were previously found by Kasami (1967–1969) and Berlekamp (1968a, b).
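The MacWilliams identities used in (ii) can be checked numerically on a small case. The sketch below is not from the paper: it uses the first-order code RM(1,3) rather than a second-order subcode, because RM(1,3) is the self-dual [8,4,4] extended Hamming code, so the MacWilliams transform of its weight distribution must reproduce the distribution itself.

```python
from itertools import product
from math import comb

# Generator matrix of RM(1,3): the all-ones vector and the three coordinate
# functions on F_2^3.  RM(1,3) is the [8,4,4] extended Hamming code.
G = [[1, 1, 1, 1, 1, 1, 1, 1],
     [0, 0, 0, 0, 1, 1, 1, 1],
     [0, 0, 1, 1, 0, 0, 1, 1],
     [0, 1, 0, 1, 0, 1, 0, 1]]
n, k = 8, 4

# Weight distribution A_w by direct enumeration of all 2^k codewords.
A = [0] * (n + 1)
for msg in product((0, 1), repeat=k):
    cw = [sum(m * g for m, g in zip(msg, col)) % 2 for col in zip(*G)]
    A[sum(cw)] += 1

# MacWilliams transform: B_j = (1/|C|) * sum_w A_w K_j(w), where
# K_j(w) = sum_i (-1)^i C(w,i) C(n-w,j-i) is a Krawtchouk polynomial.
def kraw(j, w):
    return sum((-1) ** i * comb(w, i) * comb(n - w, j - i) for i in range(j + 1))

B = [sum(A[w] * kraw(j, w) for w in range(n + 1)) // 2 ** k for j in range(n + 1)]

print(A)  # [1, 0, 0, 0, 14, 0, 0, 0, 1]
print(B)  # equals A, since the code is self-dual
```

The same transform, combined with partial knowledge of low weights (as supplied by the Carlitz-Uchiyama and BCH bounds), is what lets the paper pin down complete weight enumerators.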
Permutation Decoding and the Stopping Redundancy Hierarchy of Cyclic and Extended Cyclic Codes
We introduce the notion of the stopping redundancy hierarchy of a linear
block code as a measure of the trade-off between performance and complexity of
iterative decoding for the binary erasure channel. We derive lower and upper
bounds for the stopping redundancy hierarchy via Lovász's Local Lemma and
Bonferroni-type inequalities, and specialize them for codes with cyclic
parity-check matrices. Based on the observed properties of parity-check
matrices with good stopping redundancy characteristics, we develop a novel
decoding technique, termed automorphism group decoding, that combines iterative
message passing and permutation decoding. We also present bounds on the
smallest number of permutations of an automorphism group decoder needed to
correct any set of erasures up to a prescribed size. Simulation results
demonstrate that for a large number of algebraic codes, the performance of the
new decoding method is close to that of maximum likelihood decoding.Comment: 40 pages, 6 figures, 10 tables, submitted to IEEE Transactions on
Information Theor
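The stopping-redundancy idea can be sketched on a toy case. The code below is not the authors' automorphism group decoder; it is a minimal peeling (iterative erasure) decoder for the [7,4] Hamming code, assuming the all-zero codeword is transmitted. The erased positions form a stopping set of linearly independent columns, so iterative decoding stalls while ML decoding would succeed; adding one redundant parity check (a dual codeword) breaks the stopping set, which is exactly the trade-off the stopping redundancy hierarchy measures.

```python
import numpy as np

def peel(H, y):
    """Iterative (peeling) decoder for the binary erasure channel.
    y holds 0/1 for known bits and None for erasures.  Returns the filled-in
    word, or None if decoding stalls on a stopping set."""
    y = list(y)
    progress = True
    while progress and any(b is None for b in y):
        progress = False
        for row in H:
            erased = [j for j, h in enumerate(row) if h and y[j] is None]
            if len(erased) == 1:
                j = erased[0]
                # Each check sums to 0, so the lone erased bit equals the
                # parity of the known bits on that check.
                y[j] = int(sum(h * y[k] for k, h in enumerate(row)
                               if h and k != j) % 2)
                progress = True
    return None if any(b is None for b in y) else y

# Parity-check matrix of the [7,4] Hamming code (column j is binary j).
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

# Erase positions {3, 5, 7} (1-based) of the all-zero codeword.  These three
# columns are linearly independent, so ML decoding succeeds, but every check
# touches the set 0 or 2+ times: a stopping set, and peeling stalls.
y = [0, 0, None, 0, None, 0, None]
print(peel(H, y))       # None: iterative decoding fails

# One redundant check (row1 + row2, a dual codeword) meets the erased set in
# exactly one position and breaks the stopping set.
H_ext = np.vstack([H, (H[0] + H[1]) % 2])
print(peel(H_ext, y))   # [0, 0, 0, 0, 0, 0, 0]: decoding now succeeds
```

Adding rows until every erasure pattern up to a target size escapes all stopping sets is the brute-force version of what the paper's bounds and automorphism-group permutations achieve far more economically.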