384 research outputs found

    Decoding of Interleaved Reed-Solomon Codes Using Improved Power Decoding

    We propose a new partial decoding algorithm for $m$-interleaved Reed--Solomon (IRS) codes that can decode, with high probability, a random error of relative weight $1-R^{\frac{m}{m+1}}$ at all code rates $R$, in time polynomial in the code length $n$. For $m>2$, this is an asymptotic improvement over the previous state-of-the-art for all rates, and the first improvement for $R>1/3$ in the last 20 years. The method combines collaborative decoding of IRS codes with power decoding up to the Johnson radius. Comment: 5 pages, accepted at IEEE International Symposium on Information Theory 201
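
    To make the claimed radius concrete, here is a small Python sketch that tabulates the relative decoding radius $1-R^{\frac{m}{m+1}}$ from the abstract alongside the classical unique-decoding radius $(1-R)/2$ and the Johnson radius $1-\sqrt{R}$; the two classical expressions are standard background rather than results of this paper.

    # Sketch: compare the radius 1 - R^(m/(m+1)) from the abstract with the
    # classical unique-decoding radius (1-R)/2 and the Johnson radius
    # 1 - sqrt(R). Purely illustrative; the classical radii are standard
    # expressions, not taken from this paper.

    def new_radius(R, m):
        return 1 - R ** (m / (m + 1))

    def johnson_radius(R):
        return 1 - R ** 0.5

    def unique_radius(R):
        return (1 - R) / 2

    for m in (2, 3, 5):
        for R in (0.25, 0.50, 0.75):
            print(f"m={m} R={R:.2f}  new={new_radius(R, m):.3f}  "
                  f"Johnson={johnson_radius(R):.3f}  unique={unique_radius(R):.3f}")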

    It'll probably work out: improved list-decoding through random operations

    In this work, we introduce a framework to study the effect of random operations on the combinatorial list-decodability of a code. The operations we consider correspond to row and column operations on the matrix obtained from the code by stacking the codewords together as columns. This captures many natural transformations on codes, such as puncturing, folding, and taking subcodes; we show that many such operations can improve the list-decoding properties of a code. There are two main points to this. First, our goal is to advance our (combinatorial) understanding of list-decodability, by understanding what structure (or lack thereof) is necessary to obtain it. Second, we use our more general results to obtain a few interesting corollaries for list decoding: (1) We show the existence of binary codes that are combinatorially list-decodable from a $1/2-\epsilon$ fraction of errors with optimal rate $\Omega(\epsilon^2)$ that can be encoded in linear time. (2) We show that any code with $\Omega(1)$ relative distance, when randomly folded, is combinatorially list-decodable from a $1-\epsilon$ fraction of errors with high probability. This formalizes the intuition for why the folding operation has been successful in obtaining codes with optimal list-decoding parameters; previously, all arguments used algebraic methods and worked only with specific codes. (3) We show that any code which is list-decodable with suboptimal list sizes has many subcodes which have near-optimal list sizes, while retaining the error-correcting capabilities of the original code. This generalizes recent results where subspace-evasive sets have been used to reduce list sizes of codes that achieve list-decoding capacity.
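
    As a toy illustration of the operations discussed above, the Python sketch below implements a random folding of a code (grouping randomly permuted coordinates into blocks, i.e. grouping rows of the codeword matrix) and a brute-force check of combinatorial list-decodability from a rho fraction of errors with list size L; the helper names and the four-word binary code are illustrative choices, not taken from the paper.

    import itertools, random

    def random_fold(code, block_size, seed=0):
        """Randomly permute coordinates, then group them into blocks of
        block_size; each block becomes one symbol over a larger alphabet."""
        n = len(code[0])
        perm = list(range(n))
        random.Random(seed).shuffle(perm)
        blocks = [perm[i:i + block_size] for i in range(0, n, block_size)]
        return [tuple(tuple(c[i] for i in blk) for blk in blocks) for c in code]

    def is_list_decodable(code, rho, L):
        """Brute force: True iff every word of the ambient space has at most
        L codewords within relative Hamming distance rho."""
        n = len(code[0])
        alphabet = sorted({sym for c in code for sym in c})
        for word in itertools.product(alphabet, repeat=n):
            close = sum(1 for c in code
                        if sum(a != b for a, b in zip(c, word)) <= rho * n)
            if close > L:
                return False
        return True

    toy_code = [(0, 0, 0, 0, 0, 0), (1, 1, 1, 0, 0, 0),
                (0, 0, 0, 1, 1, 1), (1, 1, 1, 1, 1, 1)]
    print(random_fold(toy_code, block_size=2)[1])     # one folded codeword
    print(is_list_decodable(toy_code, rho=1/6, L=1))  # True: min distance is 3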

    Evading Subspaces Over Large Fields and Explicit List-decodable Rank-metric Codes

    We construct an explicit family of linear rank-metric codes over any field F that enables efficient list decoding up to a fraction rho of errors in the rank metric with a rate of 1-rho-eps, for any desired rho in (0,1) and eps > 0. Previously, a Monte Carlo construction of such codes was known, but this is in fact the first explicit construction of positive-rate rank-metric codes for list decoding beyond the unique decoding radius. Our codes are explicit subcodes of the well-known Gabidulin codes, which encode linearized polynomials of low degree via their values at a collection of linearly independent points. The subcode is picked by restricting the message polynomials to an F-subspace that evades certain structured subspaces over an extension field of F. These structured spaces arise from the linear-algebraic list decoder for Gabidulin codes due to Guruswami and Xing (STOC '13). Our construction is obtained by combining subspace designs constructed by Guruswami and Kopparty (FOCS '13) with subspace-evasive varieties due to Dvir and Lovett (STOC '12). We establish a similar result for subspace codes, which are a collection of subspaces, every pair of which has low-dimensional intersection, and which have received much attention recently in the context of network coding. We also give explicit subcodes of folded Reed-Solomon (RS) codes with small folding order that are list-decodable (in the Hamming metric) with optimal redundancy, motivated by the fact that list decoding RS codes reduces to list decoding such folded RS codes. However, as we only list decode a subcode of these codes, the Johnson radius continues to be the best known error fraction for list decoding RS codes.
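
    The sketch below shows plain Gabidulin encoding as described above: a message gives a q-linearized polynomial $f(x)=\sum_i f_i x^{q^i}$, and the codeword is its evaluation at a set of linearly independent points. Here q = 2 and GF(2^4) is implemented by hand; the field, evaluation basis and message are illustrative choices, and the paper's subspace-evasive restriction of the messages is not reproduced.

    MOD = 0b10011   # x^4 + x + 1, irreducible over GF(2)
    M = 4           # extension degree, so the field is GF(2^4)

    def gf_mul(a, b):
        """Carry-less multiplication in GF(2^4), reduced modulo x^4 + x + 1."""
        r = 0
        while b:
            if b & 1:
                r ^= a
            b >>= 1
            a <<= 1
            if a & (1 << M):
                a ^= MOD
        return r

    def gf_pow(a, e):
        """Square-and-multiply exponentiation in GF(2^4)."""
        r = 1
        while e:
            if e & 1:
                r = gf_mul(r, a)
            a = gf_mul(a, a)
            e >>= 1
        return r

    def eval_linearized(coeffs, g):
        """Evaluate f(x) = sum_i coeffs[i] * x^(2^i) at g (addition is XOR)."""
        acc = 0
        for i, f_i in enumerate(coeffs):
            acc ^= gf_mul(f_i, gf_pow(g, 2 ** i))
        return acc

    def gabidulin_encode(message, points):
        """Codeword = evaluations of the linearized message polynomial at
        GF(2)-linearly independent points."""
        return [eval_linearized(message, g) for g in points]

    # Evaluation points: the polynomial basis 1, x, x^2, x^3 of GF(2^4) over
    # GF(2), written as bit masks; they are linearly independent over GF(2).
    points = [0b0001, 0b0010, 0b0100, 0b1000]
    message = [0b0011, 0b0101]        # k = 2 coefficients, q-degree of f < 2
    print(gabidulin_encode(message, points))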

    Decoding of Repeated-Root Cyclic Codes up to New Bounds on Their Minimum Distance

    The well-known approach of Bose, Ray-Chaudhuri and Hocquenghem and its generalization by Hartmann and Tzeng are lower bounds on the minimum distance of simple-root cyclic codes. We generalize these two bounds to the case of repeated-root cyclic codes and present a syndrome-based burst error decoding algorithm with guaranteed decoding radius based on an associated folded cyclic code. Furthermore, we present a third technique for bounding the minimum Hamming distance based on the embedding of a given repeated-root cyclic code into a repeated-root cyclic product code. A second quadratic-time probabilistic burst error decoding procedure based on the third bound is outlined. Index Terms: bound on the minimum distance, burst error, efficient decoding, folded code, repeated-root cyclic code, repeated-root cyclic product code.
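
    For reference, the classical simple-root bound that the first result generalizes can be read off directly from a code's defining set; the short Python sketch below does so for an illustrative length-15 binary BCH code (a standard textbook example, not taken from the paper).

    def bch_bound(defining_set, n):
        """Longest run of consecutive exponents (mod n) in the defining set,
        plus one; this is the classical BCH lower bound on the distance."""
        D = {x % n for x in defining_set}
        best = 0
        for start in range(n):
            run = 0
            while (start + run) % n in D and run < n:
                run += 1
            best = max(best, run)
        return best + 1

    # Binary BCH code of length 15 whose defining set is the union of the
    # 2-cyclotomic cosets of 1 and 3: the run 1, 2, 3, 4 gives distance >= 5.
    print(bch_bound({1, 2, 4, 8, 3, 6, 9, 12}, 15))   # -> 5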