High performance entanglement-assisted quantum LDPC codes need little entanglement
Though the entanglement-assisted formalism provides a universal connection
between a classical linear code and an entanglement-assisted quantum
error-correcting code (EAQECC), the need to maintain a large number of pure
maximally entangled states in constructing EAQECCs is a practical obstacle to
its use. It has also been conjectured that the power of the
entanglement-assisted formalism to convert good classical codes comes from a
massive consumption of maximally entangled states. We show that this conjecture
is wrong by
providing families of EAQECCs with an entanglement consumption rate that
diminishes linearly as a function of the code length. Notably, two families of
EAQECCs constructed in the paper require only a single maximally entangled
state no matter how large the code length is. These families of EAQECCs, which
are constructed from classical finite-geometry LDPC codes, perform very well
according to our numerical simulations. Our work indicates that EAQECCs are not
only theoretically interesting, but also physically implementable. Finally,
these high-performance entanglement-assisted LDPC codes with low entanglement
consumption rates allow one to construct high-performance standard QECCs with
very similar parameters.
Comment: 8 pages, 5 figures. Published version.
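The entanglement consumption the abstract refers to can be made concrete. In the standard entanglement-assisted construction (a general fact about the formalism, not a claim specific to this paper), a classical binary code with parity-check matrix H yields an EAQECC consuming c = rank(H Hᵀ) ebits, with the rank taken over GF(2); dual-containing codes give c = 0, and the low-consumption families above correspond to H Hᵀ having very small GF(2) rank. A minimal sketch:

```python
def gf2_rank(M):
    """Rank of a binary matrix over GF(2) by Gaussian elimination."""
    M = [row[:] for row in M]
    rank, rows = 0, len(M)
    cols = len(M[0]) if M else 0
    for c in range(cols):
        pivot = next((r for r in range(rank, rows) if M[r][c]), None)
        if pivot is None:
            continue
        M[rank], M[pivot] = M[pivot], M[rank]
        for r in range(rows):
            if r != rank and M[r][c]:
                M[r] = [a ^ b for a, b in zip(M[r], M[rank])]
        rank += 1
    return rank

def ebits_required(H):
    """Ebits consumed by the EAQECC built from the classical code with
    parity-check matrix H: the GF(2) rank of H H^T."""
    m, n = len(H), len(H[0])
    HHt = [[sum(H[i][k] & H[j][k] for k in range(n)) % 2
            for j in range(m)] for i in range(m)]
    return gf2_rank(HHt)

# the [7,4] Hamming code is dual-containing, so it needs no entanglement
H_hamming = [[1,0,1,0,1,0,1],
             [0,1,1,0,0,1,1],
             [0,0,0,1,1,1,1]]
print(ebits_required(H_hamming))  # -> 0
```

By contrast, a generic classical code gives a nonzero rank, and hence a nonzero ebit cost; the families in the paper are engineered so that this rank stays tiny (often 1) as the code length grows.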
Entanglement-assisted quantum low-density parity-check codes
This paper develops a general method for constructing entanglement-assisted
quantum low-density parity-check (LDPC) codes, which is based on combinatorial
design theory. Explicit constructions are given for entanglement-assisted
quantum error-correcting codes (EAQECCs) with many desirable properties. These
properties include the requirement of only one initial entanglement bit, high
error correction performance, high rates, and low decoding complexity. The
proposed method produces infinitely many new codes with a wide variety of
parameters and entanglement requirements. Our framework encompasses various
codes including the previously known entanglement-assisted quantum LDPC codes
having the best error correction performance and many new codes with better
block error rates in simulations over the depolarizing channel. We also
determine important parameters of several well-known classes of quantum and
classical LDPC codes for previously unsettled cases.
Comment: 20 pages, 5 figures. Final version appearing in Physical Review
The Road From Classical to Quantum Codes: A Hashing Bound Approaching Design Procedure
Powerful Quantum Error Correction Codes (QECCs) are required for stabilizing
and protecting fragile qubits against the undesirable effects of quantum
decoherence. As with classical codes, hashing-bound-approaching QECCs may be
designed by exploiting a concatenated code structure, which invokes iterative
decoding. Therefore, in this paper we provide an extensive step-by-step
tutorial for designing EXtrinsic Information Transfer (EXIT) chart aided
concatenated quantum codes based on the underlying quantum-to-classical
isomorphism. These design lessons are then exemplified in the context of our
proposed Quantum Irregular Convolutional Code (QIRCC), which constitutes the
outer component of a concatenated quantum code. The proposed QIRCC can be
dynamically adapted to match any given inner code using EXIT charts, hence
achieving a performance close to the hashing bound. It is demonstrated that our
QIRCC-based optimized design is capable of operating within 0.4 dB of the noise
limit.
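EXIT-chart design of the kind described tracks the mutual information carried by decoder LLRs between iterations. A widely used closed-form approximation of the J-function, which maps the standard deviation σ of a consistent Gaussian LLR to mutual information, can be sketched as follows (the functional form and fitted constants are the common ones from the EXIT-chart literature, not taken from this paper):

```python
def J(sigma):
    """Approximate mutual information in [0, 1] between a BPSK bit and a
    consistent Gaussian LLR with standard deviation sigma (a standard
    curve-fit approximation; the constants below are fitted values from
    the EXIT-chart literature)."""
    H1, H2, H3 = 0.3073, 0.8935, 1.1064  # fitted constants
    return (1.0 - 2.0 ** (-H1 * sigma ** (2 * H2))) ** H3

# mutual information grows monotonically from 0 toward 1 with LLR quality
print(round(J(0.0), 4))  # -> 0.0
```

Matching an outer code to an inner code then amounts to shaping the outer decoder's EXIT curve so that a narrow open tunnel to the (1, 1) point remains, which is what drives the near-hashing-bound operation reported above.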
Enhanced Feedback Iterative Decoding of Sparse Quantum Codes
Decoding sparse quantum codes can be accomplished by syndrome-based decoding
using a belief propagation (BP) algorithm. We significantly improve this
decoding scheme by developing a new feedback adjustment strategy for the
standard BP algorithm. In our feedback procedure, we exploit more of the
information available from the stabilizers: not just the syndrome, but also the
values of the frustrated checks on individual qubits of the code, together with
the channel model.
Furthermore, we show that our decoding algorithm is superior to belief
propagation algorithms using only the syndrome in the feedback procedure for
all cases of the depolarizing channel. Our algorithm does not increase the
measurement overhead compared to the previous method, as the extra information
comes for free from the requisite stabilizer measurements.
Comment: 10 pages, 11 figures. Second version, to appear in IEEE
Transactions on Information Theory.
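The syndrome-based BP decoding that this work improves on can be illustrated on a small classical code (a simplified classical sketch, not the paper's quantum algorithm; the feedback strategy using frustrated checks is not reproduced here). The decoder receives only the syndrome, and its check-node rule flips sign wherever the syndrome bit is 1:

```python
import math

def syndrome_bp(H, syndrome, p, max_iter=20):
    """Syndrome-based BP decoding for a binary code with parity-check
    matrix H under a symmetric channel with error probability p.
    Returns an error estimate whose syndrome matches, or None."""
    m, n = len(H), len(H[0])
    prior = math.log((1 - p) / p)  # LLR favouring "no error"
    edges = [(c, v) for c in range(m) for v in range(n) if H[c][v]]
    v2c = {e: prior for e in edges}
    c2v = {e: 0.0 for e in edges}
    for _ in range(max_iter):
        # check-node update; sign flipped when the syndrome bit is 1
        for c in range(m):
            vs = [v for v in range(n) if H[c][v]]
            for v in vs:
                prod = 1.0
                for v2 in vs:
                    if v2 != v:
                        prod *= math.tanh(v2c[(c, v2)] / 2)
                prod = max(min(prod, 0.999999), -0.999999)
                c2v[(c, v)] = (-1) ** syndrome[c] * 2 * math.atanh(prod)
        # variable-node totals and tentative hard decision
        total = [prior + sum(c2v[(c, v)] for c in range(m) if H[c][v])
                 for v in range(n)]
        err = [1 if t < 0 else 0 for t in total]
        if all(sum(H[c][v] * err[v] for v in range(n)) % 2 == syndrome[c]
               for c in range(m)):
            return err
        for c, v in edges:
            v2c[(c, v)] = total[v] - c2v[(c, v)]
    return None

# [7,4] Hamming code checks; a flip on bit 0 gives syndrome (1, 0, 0)
H = [[1,0,1,0,1,0,1],
     [0,1,1,0,0,1,1],
     [0,0,0,1,1,1,1]]
print(syndrome_bp(H, [1, 0, 0], 0.1))  # -> [1, 0, 0, 0, 0, 0, 0]
```

The paper's feedback procedure would sit on top of this loop, adjusting the priors of individual qubits whenever the iteration stalls on frustrated checks.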
High-Rate Quantum Low-Density Parity-Check Codes Assisted by Reliable Qubits
Quantum error correction is an important building block for reliable quantum information processing. A challenging hurdle in the theory of quantum error correction is that it is significantly more difficult to design error-correcting codes with desirable properties for quantum information processing than for traditional digital communications and computation. A typical obstacle to constructing a variety of strong quantum error-correcting codes is the complicated restrictions imposed on the structure of a code. Recently, promising solutions to this problem have been proposed in quantum information science, where in principle any binary linear code can be turned into a quantum error-correcting code by assuming a small number of reliable quantum bits. This paper studies how best to take advantage of these latest ideas to construct desirable quantum error-correcting codes of very high information rate. Our methods exploit structured high-rate low-density parity-check codes available in the classical domain and provide quantum analogues that inherit their characteristic low decoding complexity and high error correction performance even at moderate code lengths. Our approach to designing high-rate quantum error-correcting codes also allows for making direct use of other major syndrome decoding methods for linear codes, making it possible to deal with a situation where promising quantum analogues of low-density parity-check codes are difficult to find.
Entanglement-assisted quantum turbo codes
An unexpected breakdown in the existing theory of quantum serial turbo coding
is that a quantum convolutional encoder cannot simultaneously be recursive and
non-catastrophic. These properties are essential for quantum turbo code
families to have a minimum distance growing with blocklength and for their
iterative decoding algorithm to converge, respectively. Here, we show that the
entanglement-assisted paradigm simplifies the theory of quantum turbo codes, in
the sense that an entanglement-assisted quantum (EAQ) convolutional encoder can
possess both of the aforementioned desirable properties. We give several
examples of EAQ convolutional encoders that are both recursive and
non-catastrophic and detail their relevant parameters. We then modify the
quantum turbo decoding algorithm of Poulin et al. so that the constituent
decoders pass along only "extrinsic information" to each other rather than
full a posteriori probabilities, and this leads to a significant improvement
in the performance of unassisted
quantum turbo codes. Other simulation results indicate that
entanglement-assisted turbo codes can operate reliably in a noise regime 4.73
dB beyond that of standard quantum turbo codes, when used on a memoryless
depolarizing channel. Furthermore, several of our quantum turbo codes are
within 1 dB or less of their hashing limits, so that the performance of quantum
turbo codes is now on par with that of classical turbo codes. Finally, we prove
that entanglement is the resource that enables a convolutional encoder to be
both non-catastrophic and recursive because an encoder acting on only
information qubits, classical bits, gauge qubits, and ancilla qubits cannot
simultaneously satisfy them.
Comment: 31 pages, software for simulating EA turbo codes is available at
http://code.google.com/p/ea-turbo/ and a presentation is available at
http://markwilde.com/publications/10-10-EA-Turbo.ppt ; v2, revisions based on
feedback from journal; v3, modification of the quantum turbo decoding
algorithm that leads to improved performance over results in v2 and the
results of Poulin et al. in arXiv:0712.288
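The decoding modification described above, passing extrinsic rather than a posteriori information, has a simple classical analogue in log-likelihood-ratio (LLR) arithmetic (an illustrative sketch of the general turbo principle, not the paper's quantum decoder):

```python
def extrinsic_llr(posterior_llr, intrinsic_llr):
    """In iterative (turbo) decoding, a constituent decoder must hand its
    partner only the *new* evidence it produced. In the LLR domain
    independent evidence combines additively, so the extrinsic part is the
    posterior minus the decoder's own input."""
    return posterior_llr - intrinsic_llr

# a decoder that received LLR 1.2 and produced posterior LLR 3.1
# has 1.9 units of new evidence to pass on
print(round(extrinsic_llr(3.1, 1.2), 6))  # -> 1.9
```

Feeding back the full posterior instead would double-count the partner decoder's own contribution on the next iteration, which is exactly the positive-feedback effect the extrinsic-only exchange avoids.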
A characterization of entanglement-assisted quantum low-density parity-check codes
As in classical coding theory, quantum analogues of low-density parity-check
(LDPC) codes have offered good error correction performance and low decoding
complexity by employing the Calderbank-Shor-Steane (CSS) construction. However,
special requirements in the quantum setting severely limit the structures such
quantum codes can have. While the entanglement-assisted stabilizer formalism
overcomes this limitation by exploiting maximally entangled states (ebits),
excessive reliance on ebits is a substantial obstacle to implementation. This
paper gives necessary and sufficient conditions for the existence of quantum
LDPC codes which are obtainable from pairs of identical LDPC codes and consume
only one ebit, and studies the spectrum of attainable code parameters.
Comment: 7 pages, no figures. Final accepted version for publication in the
IEEE Transactions on Information Theory.
Applications of finite geometries to designs and codes
This dissertation concerns the intersection of three areas of discrete mathematics: finite geometries, design theory, and coding theory. The central theme is the power of finite geometry designs, which are constructed from the points and t-dimensional subspaces of a projective or affine geometry. We use these designs to construct and analyze combinatorial objects which inherit their best properties from these geometric structures.
A central question in the study of finite geometry designs is Hamada’s conjecture, which proposes that finite geometry designs are the unique designs with minimum p-rank among all designs with the same parameters. In this dissertation, we will examine several questions related to Hamada’s conjecture, including the existence of counterexamples. We will also study the applicability of certain decoding methods to known counterexamples.
We begin by constructing an infinite family of counterexamples to Hamada’s conjecture. These designs are the first infinite class of counterexamples for the affine case of Hamada’s conjecture. We further demonstrate how these designs, along with the projective polarity designs of Jungnickel and Tonchev, admit majority-logic decoding schemes. The codes obtained from these polarity designs attain error-correcting performance which is, in certain cases, equal to that of the finite geometry designs from which they are derived. This further demonstrates the highly geometric structure maintained by these designs.
Finite geometries also help us construct several types of quantum error-correcting codes. We use relatives of finite geometry designs to construct infinite families of q-ary quantum stabilizer codes. We also construct entanglement-assisted quantum error-correcting codes (EAQECCs) which admit a particularly efficient and effective error-correcting scheme, while also providing the first general method for constructing these quantum codes with known parameters and desirable properties. Finite geometry designs are used to give exceptional examples of these codes.
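The majority-logic decoding mentioned above can be illustrated with the smallest projective plane, the Fano plane PG(2,2) (a toy case chosen for brevity; the dissertation's designs live in larger geometries). Taking the plane's lines as parity checks, the three lines through a point are orthogonal on that point's bit, since any two points lie on exactly one common line, so a majority vote of unsatisfied checks corrects any single error:

```python
# lines of the Fano plane PG(2,2); each line is one parity check
LINES = [(0,1,2), (0,3,4), (0,5,6), (1,3,5), (1,4,6), (2,3,6), (2,4,5)]

def majority_logic_decode(r):
    """One-step majority-logic decoding for the code whose parity checks
    are the Fano plane's lines. Corrects any single bit error."""
    r = list(r)
    for v in range(7):
        # the 3 lines through point v are orthogonal on bit v: every
        # other bit appears in at most one of them
        checks = [sum(r[p] for p in line) % 2 for line in LINES if v in line]
        if sum(checks) > len(checks) // 2:  # majority unsatisfied -> flip
            r[v] ^= 1
    return r

# a single flipped bit is voted back down by its three incident lines
print(majority_logic_decode([0, 0, 0, 1, 0, 0, 0]))  # -> [0, 0, 0, 0, 0, 0, 0]
```

The same orthogonality argument, two points on a unique line, is what makes finite-geometry LDPC codes majority-logic decodable in general; only the geometry's size changes.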
Multiplicativity of completely bounded p-norms implies a strong converse for entanglement-assisted capacity
The fully quantum reverse Shannon theorem establishes the optimal rate of
noiseless classical communication required for simulating the action of many
instances of a noisy quantum channel on an arbitrary input state, while also
allowing for an arbitrary amount of shared entanglement of an arbitrary form.
Turning this theorem around establishes a strong converse for the
entanglement-assisted classical capacity of any quantum channel. This paper
proves the strong converse for entanglement-assisted capacity by a completely
different approach and identifies a bound on the strong converse exponent for
this task. Namely, we exploit the recent entanglement-assisted "meta-converse"
theorem of Matthews and Wehner, several properties of the recently established
sandwiched Rényi relative entropy (also referred to as the quantum Rényi
divergence), and the multiplicativity of completely bounded p-norms due to
Devetak et al. The proof here demonstrates the extent to which the Arimoto
approach can be helpful in proving strong converse theorems, it provides an
operational relevance for the multiplicativity result of Devetak et al., and it
adds to the growing body of evidence that the sandwiched Rényi relative entropy
is the correct quantum generalization of the classical concept for all α > 1.
Comment: 21 pages, final version accepted for publication in Communications in
Mathematical Physics.
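For reference, the sandwiched Rényi relative entropy invoked in this abstract is standardly defined, for a density operator ρ, a positive semidefinite σ, and α ∈ (0, 1) ∪ (1, ∞), as

```latex
\widetilde{D}_{\alpha}(\rho \| \sigma)
  \;=\; \frac{1}{\alpha - 1}
  \log \operatorname{Tr}\!\left[
    \left( \sigma^{\frac{1-\alpha}{2\alpha}} \, \rho \,
           \sigma^{\frac{1-\alpha}{2\alpha}} \right)^{\!\alpha}
  \right],
```

which recovers the ordinary quantum relative entropy in the limit α → 1; the strong converse exponent in the paper is expressed in terms of this quantity.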