Entanglement-assisted quantum turbo codes
An unexpected breakdown in the existing theory of quantum serial turbo coding
is that a quantum convolutional encoder cannot simultaneously be recursive and
non-catastrophic. These properties are essential for quantum turbo code
families to have a minimum distance growing with blocklength and for their
iterative decoding algorithm to converge, respectively. Here, we show that the
entanglement-assisted paradigm simplifies the theory of quantum turbo codes, in
the sense that an entanglement-assisted quantum (EAQ) convolutional encoder can
possess both of the aforementioned desirable properties. We give several
examples of EAQ convolutional encoders that are both recursive and
non-catastrophic and detail their relevant parameters. We then modify the
quantum turbo decoding algorithm of Poulin et al. so that the constituent
decoders pass along only "extrinsic information" to each other, rather than
the a posteriori probabilities used in the decoder of Poulin et al.; this
leads to a significant improvement in the performance of unassisted
quantum turbo codes. Other simulation results indicate that
entanglement-assisted turbo codes can operate reliably in a noise regime 4.73
dB beyond that of standard quantum turbo codes, when used on a memoryless
depolarizing channel. Furthermore, several of our quantum turbo codes are
within 1 dB or less of their hashing limits, so that the performance of quantum
turbo codes is now on par with that of classical turbo codes. Finally, we prove
that entanglement is the resource that enables a convolutional encoder to be
both non-catastrophic and recursive because an encoder acting on only
information qubits, classical bits, gauge qubits, and ancilla qubits cannot
simultaneously satisfy them.
Comment: 31 pages, software for simulating EA turbo codes is available at
http://code.google.com/p/ea-turbo/ and a presentation is available at
http://markwilde.com/publications/10-10-EA-Turbo.ppt ; v2, revisions based on
feedback from journal; v3, modification of the quantum turbo decoding
algorithm that leads to improved performance over results in v2 and the
results of Poulin et al. in arXiv:0712.288
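The extrinsic-information principle that the modified decoder borrows from classical turbo decoding can be stated in one line. The sketch below uses classical log-likelihood ratios (LLRs) rather than the quantum soft-decision messages of the actual algorithm, so the function name and signature are illustrative only:

```python
def extrinsic_llr(posterior_llr, channel_llr, prior_llr):
    """Extrinsic message in classical turbo decoding: the posterior
    log-likelihood ratio with the channel observation and the a-priori
    contribution subtracted out, so a constituent decoder never receives
    its own prior back as if it were fresh evidence."""
    return posterior_llr - channel_llr - prior_llr
```

Subtracting a decoder's own inputs before passing the message prevents the same evidence from circulating twice through the iteration, which is the convergence issue that extrinsic passing addresses.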
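The hashing limits mentioned above have a closed form for the memoryless depolarizing channel: a rate R is achievable by random stabilizer coding whenever R < 1 - h2(p) - p·log2(3), where p is the depolarizing probability. A minimal sketch (the helper names and the rate-1/9 usage below are our own, not from the paper):

```python
import math

def binary_entropy(p):
    """Shannon binary entropy h2(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def hashing_rate(p):
    """Hashing bound for the depolarizing channel with error probability p:
    rates below 1 - h2(p) - p*log2(3) are achievable by random stabilizer
    codes."""
    return 1.0 - binary_entropy(p) - p * math.log2(3)

def hashing_threshold(rate, lo=1e-9, hi=0.1893, iters=200):
    """Largest depolarizing probability at which `rate` is still below the
    hashing bound, found by bisection (hashing_rate decreases in p)."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if hashing_rate(mid) > rate:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For a rate-1/9 code this gives a hashing threshold near p ≈ 0.16; gaps in dB between two noise levels p1 < p2, such as the 4.73 dB figure quoted above, are often expressed as 10·log10(p2/p1).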
Entanglement-assisted codeword stabilized quantum codes with imperfect ebits
Quantum error-correcting codes (QECCs) in quantum communication systems are
known to exhibit improved performance with the use of error-free
entanglement bits (ebits). In practical situations, ebits inevitably suffer
from errors, and as a result, the error-correcting capability of the code is
diminished. Prior studies have proposed two different schemes as a solution.
One uses only one QECC to correct errors on both the receiver's side (i.e.,
Bob) and the sender's side (i.e., Alice). The other uses a different QECC on
each side. In this paper, we present a method to correct errors on both sides
using a single nonadditive entanglement-assisted codeword stabilized quantum
error-correcting code (EACWS QECC). We use the property that the number of
effective error patterns decreases as the number of ebits increases. This
property results in a greater number of logical codewords for the same number
of physical qubits.
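The counting intuition here, that each ebit removes effective error patterns and frees room for logical codewords, can be illustrated with the standard stabilizer (additive) analogue. The paper's code is a nonadditive CWS code, so this is only an analogy, and the function name is ours:

```python
def ea_code_counts(n, k, c):
    """Counting for an [[n, k; c]] entanglement-assisted stabilizer code:
    n - k + c stabilizer generators give that many syndrome bits, hence
    2^(n-k+c) distinguishable error classes, versus 2^(n-k) for an
    unassisted [[n, k]] code with the same n and k."""
    return {
        "syndrome_bits": n - k + c,
        "error_classes": 2 ** (n - k + c),
        "logical_codewords": 2 ** k,
    }
```

In the additive picture, each added ebit contributes one extra syndrome bit, doubling the number of error classes the decoder can tell apart at a fixed number of physical qubits.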
Examples of minimal-memory, non-catastrophic quantum convolutional encoders
One of the most important open questions in the theory of quantum
convolutional coding is to determine a minimal-memory, non-catastrophic,
polynomial-depth convolutional encoder for an arbitrary quantum convolutional
code. Here, we present a technique that finds quantum convolutional encoders
with such desirable properties for several example quantum convolutional codes
(an exposition of our technique in full generality will appear elsewhere). We
first show how to encode the well-studied Forney-Grassl-Guha (FGG) code with an
encoder that exploits just one memory qubit (the former Grassl-Roetteler
encoder requires 15 memory qubits). We then show how our technique can find an
online decoder corresponding to this encoder, and we also detail the operation
of our technique on a different example of a quantum convolutional code.
Finally, the reduction in memory for the FGG encoder makes it feasible to
simulate the performance of a quantum turbo code employing it, and we present
the results of such simulations.
Comment: 5 pages, 2 figures, Accepted for the International Symposium on
Information Theory 2011 (ISIT 2011), St. Petersburg, Russia; v2 has minor
change
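The role of encoder memory can be illustrated with a classical analogue. The generators below are the textbook (7,5) feedforward pair, not those of the FGG code, so this is only a sketch of what "memory" means for a convolutional encoder; a quantum convolutional encoder replaces the register bits with memory qubits:

```python
def conv_encode(bits, memory=2):
    """Rate-1/2 classical feedforward convolutional encoder with `memory`
    shift-register bits, using the textbook generators 1+D+D^2 and 1+D^2."""
    state = [0] * memory          # the shift-register "memory"
    out = []
    for b in bits:
        out.append(b ^ state[0] ^ state[1])  # parity from 1 + D + D^2
        out.append(b ^ state[1])             # parity from 1 + D^2
        state = [b] + state[:-1]             # shift the new bit into memory
    return out
```

Reducing the memory shrinks the state space the decoder must track (here 2^memory states; for a quantum encoder, the memory-qubit dimension), which is what makes turbo-code simulation with the one-memory-qubit FGG encoder feasible.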
Recursive quantum convolutional encoders are catastrophic: A simple proof
Poulin, Tillich, and Ollivier discovered an important separation between the
classical and quantum theories of convolutional coding, by proving that a
quantum convolutional encoder cannot be both non-catastrophic and recursive.
Non-catastrophicity is desirable so that an iterative decoding algorithm
converges when decoding a quantum turbo code whose constituents are quantum
convolutional codes, and recursiveness is desirable so that a quantum turbo
code has a minimum distance growing nearly linearly with the length of the
code. Their proof of the aforementioned theorem was admittedly "rather
involved," and as such, it has been desirable since their result to find a
simpler proof. In this paper, we furnish a proof that is arguably simpler. Our
approach is group-theoretic---we show that the subgroup of memory states that
are part of a zero physical-weight cycle of a quantum convolutional encoder is
equivalent to the centralizer of its "finite-memory" subgroup (the subgroup of
memory states which eventually reach the identity memory state by identity
operator inputs for the information qubits and identity or Pauli-Z operator
inputs for the ancilla qubits). After proving that this symmetry holds for any
quantum convolutional encoder, it easily follows that an encoder is
non-recursive if it is non-catastrophic. Our proof also illuminates why this
no-go theorem does not apply to entanglement-assisted quantum convolutional
encoders---the introduction of shared entanglement as a resource allows the
above symmetry to be broken.
Comment: 15 pages, 1 figure. v2: accepted into IEEE Transactions on
Information Theory with minor modifications. arXiv admin note: text overlap
with arXiv:1105.064
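The commutation structure underlying the centralizer argument is easy to check numerically in the standard binary-symplectic representation of the Pauli group. The toy code below is illustrative only and does not reproduce the paper's encoder-memory subgroups:

```python
from itertools import product

def symplectic_inner(a, b, n):
    """Symplectic inner product of two n-qubit Paulis in (x|z) binary form:
    0 means the operators commute, 1 means they anticommute."""
    ax, az = a[:n], a[n:]
    bx, bz = b[:n], b[n:]
    return (sum(x * z for x, z in zip(ax, bz))
            + sum(x * z for x, z in zip(az, bx))) % 2

def centralizer(generators, n):
    """Brute-force centralizer: all n-qubit Paulis (modulo phase) that
    commute with every given generator."""
    return [v for v in product((0, 1), repeat=2 * n)
            if all(symplectic_inner(v, g, n) == 0 for g in generators)]
```

For example, the centralizer of Z on the first of two qubits consists of exactly the 8 of 16 Pauli classes whose first-qubit X-component vanishes; shared entanglement effectively enlarges the operator space, which is how the symmetry between the zero-weight-cycle subgroup and this centralizer can be broken.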
Multiplicativity of completely bounded p-norms implies a strong converse for entanglement-assisted capacity
The fully quantum reverse Shannon theorem establishes the optimal rate of
noiseless classical communication required for simulating the action of many
instances of a noisy quantum channel on an arbitrary input state, while also
allowing for an arbitrary amount of shared entanglement of an arbitrary form.
Turning this theorem around establishes a strong converse for the
entanglement-assisted classical capacity of any quantum channel. This paper
proves the strong converse for entanglement-assisted capacity by a completely
different approach and identifies a bound on the strong converse exponent for
this task. Namely, we exploit the recent entanglement-assisted "meta-converse"
theorem of Matthews and Wehner, several properties of the recently established
sandwiched Rényi relative entropy (also referred to as the quantum Rényi
divergence), and the multiplicativity of completely bounded p-norms due to
Devetak et al. The proof here demonstrates the extent to which the Arimoto
approach can be helpful in proving strong converse theorems, it provides an
operational relevance for the multiplicativity result of Devetak et al., and it
adds to the growing body of evidence that the sandwiched Rényi relative entropy
is the correct quantum generalization of the classical concept for all α > 1.
Comment: 21 pages, final version accepted for publication in Communications in
Mathematical Physics
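The sandwiched Rényi relative entropy has the explicit form D̃_α(ρ‖σ) = (1/(α-1)) log Tr[(σ^((1-α)/2α) ρ σ^((1-α)/2α))^α], which is straightforward to evaluate numerically for full-rank σ. The helper names below are our own:

```python
import numpy as np

def mpow(h, t):
    """Fractional power of a Hermitian positive definite matrix via
    eigendecomposition."""
    w, v = np.linalg.eigh(h)
    return (v * w ** t) @ v.conj().T

def sandwiched_renyi(rho, sigma, alpha):
    """Sandwiched Renyi relative entropy in bits, for density matrices rho
    and full-rank sigma, with alpha != 1:
    (1/(alpha-1)) * log2 Tr[(sigma^e rho sigma^e)^alpha],
    where e = (1 - alpha) / (2 * alpha)."""
    e = (1 - alpha) / (2 * alpha)
    s = mpow(sigma, e)
    return np.log2(np.trace(mpow(s @ rho @ s, alpha)).real) / (alpha - 1)
```

For commuting (diagonal) states this reduces to the classical Rényi divergence, and as α → 1 it recovers the ordinary quantum relative entropy, consistent with its role as the correct quantum generalization.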
On the Performance of Interleavers for Quantum Turbo Codes
Quantum turbo codes (QTCs) have shown excellent error correction capabilities in the setting of quantum communication, achieving performance less than 1 dB away from their corresponding hashing bounds. Existing QTCs have been constructed using uniform random interleavers. However, interleaver design plays an important role in the optimization of classical turbo codes. Consequently, inspired by the widely used classical-to-quantum isomorphism, this paper studies the integration of classical interleaver design methods into the paradigm of quantum turbo coding. Simulation results demonstrate that error floors in QTCs can be lowered significantly, while decreasing memory consumption, through proper interleaver design and without increasing the overall decoding complexity of the system.
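One classical interleaver design of the kind the paper imports is the S-random construction, which spreads nearby input symbols far apart so that low-weight error events in one constituent code do not stay low-weight after permutation. The variant below (candidate filtering against the s most recently placed outputs, with restarts on dead ends) is one common implementation choice, not necessarily the paper's:

```python
import random

def s_random_interleaver(n, s, max_tries=200, seed=1):
    """S-random interleaver sketch: build a permutation of range(n) in
    which each newly placed value differs by at least s from each of the
    s most recently placed values, restarting on dead ends."""
    rng = random.Random(seed)
    for _ in range(max_tries):
        remaining = list(range(n))
        perm = []
        while remaining:
            candidates = [c for c in remaining
                          if all(abs(c - p) >= s for p in perm[-s:])]
            if not candidates:
                break  # dead end; restart with a fresh random attempt
            choice = rng.choice(candidates)
            perm.append(choice)
            remaining.remove(choice)
        if not remaining:
            return perm
    raise RuntimeError("no S-random permutation found; try a smaller s")
```

Spreading factors up to roughly sqrt(n/2) are typically feasible; larger s lowers the error floor further but makes the greedy construction more likely to dead-end.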