Possibilities for improving the efficiency of linear error-correcting codes
Results are presented in the form of 14 theorems specifying sufficient conditions under which new, more efficient single- and multiple-error-correcting codes can be constructed from existing ones, provided the method used to construct the existing codes is known.
Multiple Particle Interference and Quantum Error Correction
The concept of multiple particle interference is discussed, using insights
provided by the classical theory of error correcting codes. This leads to a
discussion of error correction in a quantum communication channel or a quantum
computer. Methods of error correction in the quantum regime are presented, and
their limitations assessed. A quantum channel can recover from arbitrary
decoherence of x qubits if K bits of quantum information are encoded using n
quantum bits, where K/n can be greater than 1-2 H(2x/n), but must be less than
1 - 2 H(x/n). This implies exponential reduction of decoherence with only a
polynomial increase in the computing resources required. Therefore quantum
computation can be made free of errors in the presence of physically realistic
levels of decoherence. The methods also allow isolation of quantum
communication from noise and eavesdropping (quantum privacy amplification).
Comment: Submitted to Proc. Roy. Soc. Lond. A in November 1995, accepted May
1996. 39 pages, 6 figures. This is now the final version. The changes are
some added references, changed final figure, and a more precise use of the
word `decoherence'. I would like to propose the word `defection' for a
general unknown error of a single qubit (rotation and/or entanglement). It is
useful because it captures the nature of the error process, and has a verb
form `to defect'. Random unitary changes (rotations) of a qubit are caused by
defects in the quantum computer; to entangle randomly with the environment is
to form a treacherous alliance with an enemy of successful quantum computation.
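The capacity window quoted in the abstract above (K/n can exceed 1 - 2 H(2x/n) but must stay below 1 - 2 H(x/n)) is easy to evaluate numerically. A minimal sketch, where the function names are mine, not the paper's:

```python
import math

def binary_entropy(p: float) -> float:
    """Shannon binary entropy H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def rate_window(x: int, n: int) -> tuple[float, float]:
    """Window for the achievable rate K/n when recovering from arbitrary
    decoherence of x out of n qubits: K/n can exceed 1 - 2*H(2x/n)
    but must stay below 1 - 2*H(x/n)."""
    lower = 1.0 - 2.0 * binary_entropy(2.0 * x / n)
    upper = 1.0 - 2.0 * binary_entropy(1.0 * x / n)
    return lower, upper

# Tolerating decoherence of 5 out of 100 qubits:
lo, up = rate_window(x=5, n=100)
```

For x/n = 0.05 this gives a window of roughly 0.06 to 0.43, illustrating that a nonzero encoding rate survives a fixed fraction of decohered qubits.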
Accuracy thresholds of topological color codes on the hexagonal and square-octagonal lattices
We predict the accuracy thresholds of quantum error-correcting codes that exploit topological properties of systems defined on two different arrangements of qubits. We study the topological color codes on the hexagonal lattice and on the square-octagonal lattice by mapping them onto spin glass systems. The analysis of the corresponding spin glass systems combines duality with gauge symmetry, an approach that has succeeded in deriving the locations of special points deeply related to the accuracy thresholds of topological error-correcting codes. We predict that the accuracy thresholds of the topological color codes, on both the hexagonal lattice and the square-octagonal lattice, are slightly lower than the error probability per qubit given by the quantum Gilbert-Varshamov bound with a zero encoding rate.
Comment: 6 pages, 4 figures, the previous title was "Threshold of topological color code". This is the published version in Phys. Rev.
Partial tests, universal tests and decomposability
For a property P and a sub-property P', we say that P is P'-partially testable with q queries if there exists an algorithm that distinguishes, with high probability, inputs in P' from inputs ε-far from P, using q queries. Some natural properties require many queries to test, but can be partitioned into a small number of subsets for which they are partially testable with very few queries, sometimes even a number independent of the input size.
For properties over {0,1}, the notion of being thus partitionable ties in closely with Merlin-Arthur proofs of proximity (MAPs) as defined independently in [14]: a partition into r partially-testable properties is the same as a Merlin-Arthur system where the proof consists of the identity of one of the r partially-testable properties, giving a 2-way translation to an O(log r)-size proof.
Our main result is that for some low complexity properties a partition as above cannot exist, and moreover that for each of our properties there does not exist even a single sub-property featuring both a large size and a query-efficient partial test, in particular improving the lower bound set in [14]. For this we use neither the traditional Yao-type arguments nor the more recent communication complexity method, but open up a new approach for proving lower bounds.
First, we use entropy analysis, which allows us to apply our arguments directly to 2-sided tests, thus avoiding the cost of the conversion in [14] from 2-sided to 1-sided tests. Broadly speaking we use "distinguishing instances" of a supposed test to show that a uniformly random choice of a member of the sub-property has "low entropy areas", ultimately leading to it having a low total entropy and hence having a small base set.
Additionally, to have our arguments apply to adaptive tests, we use a mechanism of "rearranging" the input bits (through a decision tree that adaptively reads the entire input) to expose the low entropy that would otherwise not be apparent.
We also explore the possibility of a connection in the other direction, namely whether the existence of a good partition (or MAP) can lead to a relatively query-efficient standard property test. We provide some preliminary results concerning this question, including a simple lower bound on the possible trade-off.
Our second major result is a positive trade-off result for the restricted framework of 1-sided proximity-oblivious tests. This is achieved through the construction of a "universal tester" that works the same for all properties admitting the restricted test. Our tester is closely related to the notion of sample-based testing (for a non-constant number of queries) as defined by Goldreich and Ron in [13]. In particular it partially resolves an open problem raised by [13].
Cyclic mutually unbiased bases, Fibonacci polynomials and Wiedemann's conjecture
We relate the construction of a complete set of cyclic mutually unbiased
bases, i.e., mutually unbiased bases generated by a single unitary operator,
in power-of-two dimensions to the problem of finding a symmetric matrix over
F_2 with an irreducible characteristic polynomial that has a given Fibonacci
index. For dimensions of the form 2^(2^k) we present a solution that shows an
analogy to an open conjecture of Wiedemann in finite field theory. Finally, we
discuss the equivalence of mutually unbiased bases.
Comment: 11 pages, added chapter on equivalence
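The finite-field condition underlying the construction above, finding a symmetric matrix over F_2 whose characteristic polynomial is irreducible, can be checked by brute force for tiny matrices. A toy sketch (my own helpers, not the paper's construction), with polynomials over F_2 represented as integer bitmasks:

```python
from itertools import permutations

def pmul(a: int, b: int) -> int:
    """Carry-less multiplication: product of two F_2[x] polynomials,
    each encoded as an int bitmask (bit i = coefficient of x^i)."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        b >>= 1
    return r

def charpoly_f2(A):
    """Characteristic polynomial det(xI + A) over F_2, via the Leibniz
    formula (fine for tiny matrices; +/- signs coincide in char. 2)."""
    n = len(A)
    x = 0b10  # the polynomial 'x'
    M = [[(x if i == j else 0) ^ (A[i][j] & 1) for j in range(n)]
         for i in range(n)]
    det = 0
    for perm in permutations(range(n)):
        term = 1
        for i in range(n):
            term = pmul(term, M[i][perm[i]])
        det ^= term
    return det

def is_irreducible_f2(p: int) -> bool:
    """Trial division by every lower-degree polynomial over F_2."""
    deg = p.bit_length() - 1
    for d in range(2, 1 << deg):  # all candidate divisors of degree >= 1
        r = p
        while r.bit_length() >= d.bit_length():  # long division p mod d
            r ^= d << (r.bit_length() - d.bit_length())
        if r == 0:
            return False
    return deg >= 1

# A symmetric matrix over F_2 whose characteristic polynomial is x^2 + x + 1:
A = [[0, 1],
     [1, 1]]
p = charpoly_f2(A)  # bitmask 0b111, i.e. x^2 + x + 1
```

Trial division is hopeless at the dimensions 2^(2^k) the paper considers; this only illustrates the property being sought.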
Error Threshold for Color Codes and Random 3-Body Ising Models
We study the error threshold of color codes, a class of topological quantum
codes that allow a direct implementation of quantum Clifford gates suitable for
entanglement distillation, teleportation and fault-tolerant quantum
computation. We map the error-correction process onto a statistical mechanical
random 3-body Ising model and study its phase diagram via Monte Carlo
simulations. The obtained error threshold of p_c = 0.109(2) is very close to
that of Kitaev's toric code, showing that enhanced computational capabilities
do not necessarily imply lower resistance to noise.
Comment: 4 pages, 3 figures, 1 table
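The statistical-mechanics side of such a mapping can be gestured at with a toy random 3-body Ising model. This is a two-triangle toy with illustrative names, not the paper's actual lattice or disorder distribution:

```python
import math
import random

def three_body_energy(spins, triangles, couplings):
    """Energy of a 3-body Ising model: E = -sum_t J_t * s_i * s_j * s_k,
    summed over the interaction triangles t = (i, j, k)."""
    return -sum(J * spins[i] * spins[j] * spins[k]
                for (i, j, k), J in zip(triangles, couplings))

def metropolis_sweep(spins, triangles, couplings, beta):
    """One Metropolis sweep: propose each single-spin flip in turn and
    accept with probability min(1, exp(-beta * dE)).  Energy is recomputed
    naively, which is fine for a toy instance."""
    for i in range(len(spins)):
        e_old = three_body_energy(spins, triangles, couplings)
        spins[i] = -spins[i]
        d_e = three_body_energy(spins, triangles, couplings) - e_old
        if d_e > 0 and random.random() >= math.exp(-beta * d_e):
            spins[i] = -spins[i]  # reject the uphill move

# Quenched random couplings J = +/-1 on two triangles, mimicking the
# disorder induced by the qubit error probability.
random.seed(0)
spins = [1, -1, 1, 1]
triangles = [(0, 1, 2), (1, 2, 3)]
couplings = [random.choice([+1, -1]) for _ in triangles]
e_before = three_body_energy(spins, triangles, couplings)
metropolis_sweep(spins, triangles, couplings, beta=50.0)
e_after = three_body_energy(spins, triangles, couplings)
```

In the actual study the error threshold corresponds to a phase boundary of such a disordered model, located by far larger Monte Carlo simulations than this sketch.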
Spectrum of Sizes for Perfect Deletion-Correcting Codes
One peculiarity of deletion-correcting codes is that perfect t-deletion-correcting codes of the same length over the same alphabet can have different numbers of codewords, because the balls of radius t with respect to the Levenshteĭn distance may be of different sizes. There is interest, therefore, in determining all possible sizes of a perfect t-deletion-correcting code, given the length n and the alphabet size q. In this paper, we determine completely the spectrum of possible sizes for perfect q-ary 1-deletion-correcting codes of length three for all q, and perfect q-ary 2-deletion-correcting codes of length four for almost all q, leaving only a small finite number of cases in doubt.
Comment: 23 pages
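The varying ball sizes behind this peculiarity are easy to see concretely. A small sketch (the helper name is mine):

```python
def deletion_ball(word: str, t: int) -> set[str]:
    """All strings obtainable from `word` by deleting exactly t symbols:
    the radius-t ball around `word` for deletion correction."""
    ball = {word}
    for _ in range(t):
        ball = {w[:i] + w[i + 1:] for w in ball for i in range(len(w))}
    return ball

# Balls of the same radius can have different sizes, which is why perfect
# deletion-correcting codes of one length can have different cardinalities:
b1 = deletion_ball("aaa", 1)  # only "aa" is reachable
b2 = deletion_ball("aba", 1)  # "ba", "aa" and "ab" are all reachable
```

Here the radius-1 ball around "aaa" has size 1 while the one around "aba" has size 3, unlike Hamming balls, whose size depends only on the radius, length, and alphabet.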