
    Quantum Error Correction beyond the Bounded Distance Decoding Limit

    In this paper, we consider quantum error correction over depolarizing channels with non-binary low-density parity-check codes defined over the Galois field of size 2^p. The proposed quantum error-correcting codes are based on binary quasi-cyclic CSS (Calderbank, Shor and Steane) codes. The resulting quantum codes outperform the best known quantum codes and surpass the performance limit of the bounded distance decoder. By increasing the size of the underlying Galois field, i.e., 2^p, the error floors are considerably improved. Comment: To appear in IEEE Transactions on Information Theory
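The depolarizing channel underlying this work can be sketched in a few lines: each qubit independently suffers an X, Y, or Z Pauli error with probability p/3 each. This is a minimal illustration of the noise model only, not the paper's non-binary LDPC construction; the function name is mine.

```python
import random

def depolarizing_errors(n, p, rng=random):
    """Sample an n-qubit Pauli error string from a depolarizing channel:
    each qubit independently gets X, Y, or Z with probability p/3 each,
    and is left untouched (I) with probability 1 - p."""
    errs = []
    for _ in range(n):
        if rng.random() < p:
            errs.append(rng.choice("XYZ"))
        else:
            errs.append("I")
    return "".join(errs)
```

A decoder for a CSS code would then be run on the syndrome of such an error string, averaged over many samples, to estimate the block error rate.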

    List Decodability at Small Radii

    A'(n,d,e), the smallest ℓ for which every binary error-correcting code of length n and minimum distance d is decodable with a list of size ℓ up to radius e, is determined for all d ≥ 2e−3. As a result, A'(n,d,e) is determined for all e ≤ 4, except for 42 values of n. Comment: to appear in Designs, Codes, and Cryptography (accepted October 2010)
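The quantity being bounded can be made concrete by brute force on small codes: the list size at radius e is the largest number of codewords any received word can have within Hamming distance e. This is a toy sketch of that definition (exponential in n, so only for tiny parameters); function names are mine.

```python
from itertools import product

def hamming(u, v):
    """Hamming distance between two equal-length tuples."""
    return sum(a != b for a, b in zip(u, v))

def max_list_size(code, e):
    """Largest number of codewords within Hamming radius e of any
    received word y of length n, i.e. the worst-case list size."""
    n = len(code[0])
    best = 0
    for y in product((0, 1), repeat=n):
        best = max(best, sum(hamming(y, c) <= e for c in code))
    return best
```

For the length-5 repetition code (minimum distance 5), decoding up to radius 2 is unique (list size 1), while radius 3 already admits lists of size 2 because a word can sit within distance 3 of both codewords.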

    The Stability of Quantum Concatenated Code Hamiltonians

    Protecting quantum information from the detrimental effects of decoherence and lack of precise quantum control is a central challenge that must be overcome if a large robust quantum computer is to be constructed. The traditional approach to achieving this is via active quantum error correction using fault-tolerant techniques. An alternative to this approach is to engineer strongly interacting many-body quantum systems that enact the quantum error correction via the natural dynamics of these systems. Here we present a method for achieving this based on the concept of concatenated quantum error correcting codes. We define a class of Hamiltonians whose ground states are concatenated quantum codes and whose energy landscape naturally causes quantum error correction. We analyze these Hamiltonians for robustness and suggest methods for implementing these highly unnatural Hamiltonians. Comment: 18 pages, small corrections and clarifications

    End-to-End Error-Correcting Codes on Networks with Worst-Case Symbol Errors

    The problem of coding for networks experiencing worst-case symbol errors is considered. We argue that this is a reasonable model for highly dynamic wireless network transmissions. We demonstrate that in this setup prior network error-correcting schemes can be arbitrarily far from achieving the optimal network throughput. A new transform metric for errors under the considered model is proposed. Using this metric, we replicate many of the classical results from coding theory. Specifically, we prove new Hamming-type, Plotkin-type, and Elias-Bassalygo-type upper bounds on the network capacity. A commensurate lower bound is shown based on Gilbert-Varshamov-type codes for error correction. The GV codes used to attain the lower bound can be non-coherent, that is, they do not require prior knowledge of the network topology. We also propose a computationally efficient concatenation scheme. The rate achieved by our concatenated codes is characterized by a Zyablov-type lower bound. We provide a generalized minimum-distance decoding algorithm which decodes up to half the minimum distance of the concatenated codes. The end-to-end nature of our design enables our codes to be overlaid on the classical distributed random linear network codes [1]. Furthermore, the potentially intensive computation at internal nodes for the link-by-link error correction is unnecessary based on our design. Comment: Submitted for publication. arXiv admin note: substantial text overlap with arXiv:1108.239
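The Gilbert-Varshamov-type argument invoked here has a classical counterpart that is easy to compute: a q-ary code of length n and minimum distance d with at least q^n divided by the volume of a radius-(d−1) Hamming ball always exists. The sketch below evaluates that standard classical bound, not the transform-metric network version derived in the paper; the function name is mine.

```python
from math import comb

def gv_lower_bound(n, d, q=2):
    """Classical Gilbert-Varshamov lower bound on the size of a q-ary
    code of length n and minimum distance d:
        A_q(n, d) >= q^n / sum_{i=0}^{d-1} C(n, i) * (q-1)^i
    Returns the ceiling, a guaranteed-achievable code size."""
    ball = sum(comb(n, i) * (q - 1) ** i for i in range(d))
    return -(q ** n // -ball)  # ceiling division
```

For instance, gv_lower_bound(7, 3) guarantees a binary code of at least 5 codewords; the actual optimum (the [7,4] Hamming code, 16 codewords) is better, illustrating that GV is only a lower bound.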

    Reconstruction Codes for DNA Sequences with Uniform Tandem-Duplication Errors

    DNA as a data storage medium has several advantages, including far greater data density compared to electronic media. We propose that schemes for data storage in the DNA of living organisms may benefit from studying the reconstruction problem, which is applicable whenever multiple reads of noisy data are available. This strategy is uniquely suited to the medium, which inherently replicates stored data in multiple distinct ways, caused by mutations. We consider noise introduced solely by uniform tandem-duplication, and utilize the relation to constant-weight integer codes in the Manhattan metric. By bounding the intersection of the cross-polytope with hyperplanes, we prove the existence of reconstruction codes with greater capacity than known error-correcting codes, which we can determine analytically for any set of parameters. Comment: 11 pages, 2 figures, Latex; version accepted for publication
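The Manhattan metric on integer vectors that this relation rests on is simple to state: the distance is the sum of coordinate-wise absolute differences, and a code's error-correction capability is governed by its minimum pairwise distance. This is a generic sketch of that metric on a toy constant-weight integer code, not the paper's analytic construction; function names are mine.

```python
from itertools import combinations

def manhattan(u, v):
    """Manhattan (L1) distance between two integer vectors."""
    return sum(abs(a - b) for a, b in zip(u, v))

def min_distance(code):
    """Minimum pairwise Manhattan distance of a set of integer vectors;
    a code with min distance d corrects floor((d-1)/2) errors."""
    return min(manhattan(u, v) for u, v in combinations(code, 2))
```

For the constant-weight (weight-3) vectors (0,3), (3,0), and (1,2), the minimum Manhattan distance is 2, so this toy code corrects no errors; dropping (1,2) raises the distance to 6.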

    On Subsystem Codes Beating the Hamming or Singleton Bound

    Subsystem codes are a generalization of noiseless subsystems, decoherence free subspaces, and quantum error-correcting codes. We prove a Singleton bound for GF(q)-linear subsystem codes. It follows that no subsystem code over a prime field can beat the Singleton bound. On the other hand, we show the remarkable fact that there exist impure subsystem codes beating the Hamming bound. A number of open problems concern the comparison in performance of stabilizer and subsystem codes. One of the open problems suggested by Poulin's work asks whether a subsystem code can use fewer syndrome measurements than an optimal MDS stabilizer code while encoding the same number of qudits and having the same distance. We prove that linear subsystem codes cannot offer such an improvement under complete decoding. Comment: 18 pages more densely packed than classically possible
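The two bounds discussed here reduce to simple numeric checks in their standard qubit forms: the Singleton bound for an [[n,k,r,d]] subsystem code (k logical qubits, r gauge qubits) and the nondegenerate quantum Hamming bound for an [[n,k,d]] stabilizer code. The sketch below states those standard forms; function names are mine, and it does not capture the impure/degenerate cases in which the paper shows the Hamming bound can be beaten.

```python
from math import comb

def satisfies_subsystem_singleton(n, k, r, d):
    """Singleton bound for an [[n, k, r, d]] subsystem code:
    k + r <= n - 2*(d - 1)."""
    return k + r <= n - 2 * (d - 1)

def satisfies_quantum_hamming(n, k, d):
    """Nondegenerate quantum Hamming bound for an [[n, k, d]] code:
    sum_{j=0}^{t} C(n, j) * 3^j <= 2^(n - k), with t = (d-1)//2."""
    t = (d - 1) // 2
    return sum(comb(n, j) * 3 ** j for j in range(t + 1)) <= 2 ** (n - k)
```

The [[5,1,3]] code meets the quantum Hamming bound with equality (16 = 16), which is why it is called perfect, while parameters like [[4,2,3]] are ruled out.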