
    Topological quantum memory

    We analyze surface codes, the topological quantum error-correcting codes introduced by Kitaev. In these codes, qubits are arranged in a two-dimensional array on a surface of nontrivial topology, and encoded quantum operations are associated with nontrivial homology cycles of the surface. We formulate protocols for error recovery, and study the efficacy of these protocols. An order-disorder phase transition occurs in this system at a nonzero critical value of the error rate; if the error rate is below the critical value (the accuracy threshold), encoded information can be protected arbitrarily well in the limit of a large code block. This phase transition can be accurately modeled by a three-dimensional Z_2 lattice gauge theory with quenched disorder. We estimate the accuracy threshold, assuming that all quantum gates are local, that qubits can be measured rapidly, and that polynomial-size classical computations can be executed instantaneously. We also devise a robust recovery procedure that does not require measurement or fast classical processing; however, for this procedure the quantum gates are local only if the qubits are arranged in four or more spatial dimensions. We discuss procedures for encoding, measurement, and performing fault-tolerant universal quantum computation with surface codes, and argue that these codes provide a promising framework for quantum computing architectures. Comment: 39 pages, 21 figures, REVTeX.
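
    A minimal illustrative sketch, not from the paper, of the plaquette-syndrome extraction that such recovery protocols act on: independent bit-flip (X) errors on an L x L toric code. The edge/plaquette indexing convention and the error rate used here are assumptions.

    # Plaquette (Z-check) syndrome of i.i.d. X errors on an L x L toric code.
    # Qubits live on the horizontal and vertical edges of a periodic square
    # lattice; each plaquette checks the parity of its four surrounding edges.
    import numpy as np

    def toric_plaquette_syndrome(x_err_h, x_err_v):
        """x_err_h[i, j] / x_err_v[i, j]: X error (0/1) on the horizontal /
        vertical edge attached to vertex (i, j). Returns the L x L syndrome."""
        L = x_err_h.shape[0]
        syndrome = np.zeros((L, L), dtype=int)
        for i in range(L):
            for j in range(L):
                # Plaquette (i, j) touches horizontal edges in rows i and i+1
                # and vertical edges in columns j and j+1 (periodic boundaries).
                syndrome[i, j] = (x_err_h[i, j] + x_err_h[(i + 1) % L, j]
                                  + x_err_v[i, j] + x_err_v[i, (j + 1) % L]) % 2
        return syndrome

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        L, p = 8, 0.05                    # code size and physical error rate (assumed)
        err_h = (rng.random((L, L)) < p).astype(int)
        err_v = (rng.random((L, L)) < p).astype(int)
        s = toric_plaquette_syndrome(err_h, err_v)
        print("violated plaquettes:", int(s.sum()), "of", L * L)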

    Fault-Tolerance of "Bad" Quantum Low-Density Parity Check Codes

    We discuss error-correction properties for families of quantum low-density parity check (LDPC) codes with relative distance that tends to zero in the limit of large blocklength. In particular, we show that any family of LDPC codes, quantum or classical, where the distance scales as a positive power of the block length, d ∝ n^α with α > 0, can correct all errors with certainty if the error rate per (qu)bit is sufficiently small. We specifically analyze the LDPC version of the quantum hypergraph-product codes recently suggested by Tillich and Zémor. These codes are a finite-rate generalization of the toric codes and, for sufficiently large quantum computers, offer an advantage over the toric codes. Comment: 4.5 pages, 1 figure.
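
    The hypergraph-product construction referred to above can be stated concretely. Below is a minimal sketch, assuming the standard CSS presentation of the Tillich-Zémor product of two classical parity-check matrices; the cyclic repetition-code example, which reproduces a toric-like code, is an illustrative choice.

    # Hypergraph product of two classical parity-check matrices over GF(2).
    import numpy as np

    def hypergraph_product(H1, H2):
        """Return the CSS check matrices (H_X, H_Z) of the hypergraph-product
        code of H1 (m1 x n1) and H2 (m2 x n2)."""
        m1, n1 = H1.shape
        m2, n2 = H2.shape
        HX = np.hstack([np.kron(H1, np.eye(n2, dtype=int)),
                        np.kron(np.eye(m1, dtype=int), H2.T)]) % 2
        HZ = np.hstack([np.kron(np.eye(n1, dtype=int), H2),
                        np.kron(H1.T, np.eye(m2, dtype=int))]) % 2
        return HX, HZ

    def repetition_check(n):
        """Circulant parity-check matrix of the length-n cyclic repetition code."""
        H = np.zeros((n, n), dtype=int)
        for i in range(n):
            H[i, i] = H[i, (i + 1) % n] = 1
        return H

    if __name__ == "__main__":
        H = repetition_check(3)               # product of two repetition codes
        HX, HZ = hypergraph_product(H, H)
        assert not ((HX @ HZ.T) % 2).any()    # CSS condition: X and Z checks commute
        print("qubits:", HX.shape[1], "X checks:", HX.shape[0], "Z checks:", HZ.shape[0])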

    Error-rate-agnostic decoding of topological stabilizer codes

    Efficient high-performance decoding of topological stabilizer codes has the potential to crucially improve the balance between logical failure rates and the number and individual error rates of the constituent qubits. High-threshold maximum-likelihood decoders require an explicit error model for Pauli errors to decode a specific syndrome, whereas lower-threshold heuristic approaches such as minimum-weight matching are "error agnostic". Here we consider an intermediate approach, formulating a decoder that depends on the bias, i.e., the relative probability of phase-flip to bit-flip errors, but is agnostic to the error rate. Our decoder is based on counting the number and effective weight of the most likely error chains in each equivalence class of a given syndrome. We use Metropolis-based Monte Carlo sampling to explore the space of error chains and find unique chains, which are efficiently identified using a hash table. Thanks to the error-rate invariance, the decoder can sample chains effectively at an error rate higher than the physical error rate, without the need for "thermalization" between chains in different equivalence classes. Applied to the surface code and the XZZX code, the decoder matches maximum-likelihood decoders for moderate code sizes or low error rates. We anticipate that, because of the compressed information content per syndrome, the decoder can be exploited to full advantage in combination with machine-learning methods that extrapolate Monte Carlo-generated data. Comment: 15 pages, 9 figures; V2: added analysis of low error-rate performance.
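
    A minimal sketch of the counting step, not the paper's implementation: sampled error chains in each equivalence class are de-duplicated with a hash table, and each unique chain contributes according to its weight. The function names and the single effective sampling rate p_eff are illustrative assumptions.

    # Score equivalence classes from (possibly repeated) sampled error chains.
    from collections import defaultdict

    def score_classes(sampled_chains, n_qubits, p_eff=0.1):
        """sampled_chains: dict mapping a class label to an iterable of chains,
        each chain an iterable of flipped-qubit indices. A unique chain of
        weight w contributes p_eff**w * (1 - p_eff)**(n_qubits - w)."""
        scores = defaultdict(float)
        for label, chains in sampled_chains.items():
            unique = {frozenset(c) for c in chains}   # hash-table de-duplication
            for chain in unique:
                w = len(chain)
                scores[label] += (p_eff ** w) * ((1 - p_eff) ** (n_qubits - w))
        return dict(scores)

    # Toy usage: two equivalence classes with a few sampled (duplicated) chains.
    chains = {"I": [(0, 3), (0, 3), (1, 2, 5)], "X_L": [(0, 1, 2, 3, 4)]}
    print(max(score_classes(chains, n_qubits=9).items(), key=lambda kv: kv[1]))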

    Deep Q-learning decoder for depolarizing noise on the toric code

    We present an AI-based decoding agent for quantum error correction of depolarizing noise on the toric code. The agent is trained using deep reinforcement learning (DRL), where an artificial neural network encodes the state-action Q-values of error-correcting X, Y, and Z Pauli operations, occurring with probabilities p_x, p_y, and p_z, respectively. By learning to take advantage of the correlations between bit-flip and phase-flip errors, the decoder outperforms the minimum-weight-perfect-matching (MWPM) algorithm, achieving a higher success rate and a higher error threshold for depolarizing noise (p_z = p_x = p_y), for code distances d ≤ 9. The decoder trained on depolarizing noise also has close to optimal performance for uncorrelated noise and provides functional but sub-optimal decoding for biased noise (p_z ≠ p_x = p_y). We argue that the DRL-type decoder provides a promising framework for future practical error correction of topological codes, striking a balance between on-the-fly calculations, in the form of forward evaluation of a deep Q-network, and pre-training and information storage. The complete code, as well as ready-to-use decoders (pre-trained networks), can be found in the repository https://github.com/mats-granath/toric-RL-decoder. Comment: 8+10 pages, 10+8 figures.
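
    A minimal sketch of what the forward evaluation of such a deep Q-network could look like; this is not the architecture from the linked repository, and the layer sizes, two-channel syndrome input, and three-action output head are illustrative assumptions.

    # Small convolutional Q-network: a 2 x d x d syndrome (vertex and plaquette
    # channels) is mapped to Q-values for applying an X, Y, or Z correction.
    import torch
    import torch.nn as nn

    class ToricQNet(nn.Module):
        def __init__(self, d: int, n_actions: int = 3):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(2, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            )
            self.head = nn.Sequential(
                nn.Flatten(),
                nn.Linear(64 * d * d, 128), nn.ReLU(),
                nn.Linear(128, n_actions),     # Q(s, a) for a in {X, Y, Z}
            )

        def forward(self, syndrome: torch.Tensor) -> torch.Tensor:
            return self.head(self.features(syndrome))

    if __name__ == "__main__":
        d = 5
        net = ToricQNet(d)
        syndrome = torch.randint(0, 2, (1, 2, d, d)).float()   # random syndrome
        q_values = net(syndrome)               # greedy action: q_values.argmax()
        print(q_values.shape, q_values.argmax(dim=1))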

    A single-system account of the relationship between priming, recognition, and fluency.

    A single-system computational model of priming and recognition was applied to studies that have looked at the relationship between priming, recognition, and fluency in continuous identification paradigms. The model was applied to 3 findings that have been interpreted as evidence for a multiple-systems account: (a) priming can occur for items not recognized; (b) the pattern of identification reaction times (RTs) to hits, misses, correct rejections, and false alarms can change as a function of recognition performance; and (c) fluency effects (shorter RTs to words judged old vs. judged new) and priming effects (shorter RTs to old vs. new words) can be observed in amnesic patients at levels comparable with healthy adults despite impaired or near-chance recognition. The authors' simulations suggest, contrary to previous interpretations, that these results are consistent with a single-system account.

    Spin glass reflection of the decoding transition for quantum error correcting codes

    We study the decoding transition for quantum error correcting codes with the help of a mapping to random-bond Wegner spin models. Families of quantum low density parity-check (LDPC) codes with a finite decoding threshold lead both to known models (e.g., the random-bond Ising and random-plaquette Z_2 gauge models) and to previously unexplored, generally non-local disordered spin models with non-trivial phase diagrams. The decoding transition corresponds to a transition from the ordered phase by proliferation of extended defects which generalize the notion of domain walls to non-local spin models. In recently discovered quantum LDPC code families with finite rates, the number of distinct classes of such extended defects is exponentially large, corresponding to an extensive ground-state entropy of these codes. Here, the transition can be driven by the entropy of the extended defects, a mechanism distinct from that in the local spin models where the number of defect types (domain walls) is always finite. Comment: 15 pages, 2 figures.
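
    For the simplest member of this family the mapping can be made explicit: independent bit flips on the toric code correspond to a ±J random-bond Ising model on a square lattice, with the error rate fixing the Nishimori-line temperature. The sketch below covers only this local case, not the non-local models constructed in the paper; the lattice size, error rate, and single Metropolis sweep are assumptions.

    # +/-J random-bond Ising model on an L x L periodic lattice at the
    # Nishimori-line temperature set by the error rate p.
    import numpy as np

    rng = np.random.default_rng(1)
    L, p = 16, 0.08
    beta = 0.5 * np.log((1 - p) / p)       # Nishimori condition: exp(-2*beta) = p/(1-p)

    # Quenched disorder: a bond coupling is antiferromagnetic (-1) with
    # probability p, i.e. flipped wherever a qubit error occurred.
    J_h = np.where(rng.random((L, L)) < p, -1, 1)   # horizontal bonds
    J_v = np.where(rng.random((L, L)) < p, -1, 1)   # vertical bonds
    spins = np.ones((L, L), dtype=int)

    def local_field(s, i, j):
        """Sum of coupling * neighbour spin around site (i, j), periodic boundaries."""
        return (J_h[i, j] * s[i, (j + 1) % L] + J_h[i, (j - 1) % L] * s[i, (j - 1) % L]
                + J_v[i, j] * s[(i + 1) % L, j] + J_v[(i - 1) % L, j] * s[(i - 1) % L, j])

    # One Metropolis sweep; the ordered (ferromagnetic) phase of this model
    # corresponds to the decodable phase below threshold.
    for i in range(L):
        for j in range(L):
            dE = 2 * spins[i, j] * local_field(spins, i, j)
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                spins[i, j] *= -1

    print("magnetization per spin:", spins.mean())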