
    Protected gates for topological quantum field theories

    We study restrictions on locality-preserving unitary logical gates for topological quantum codes in two spatial dimensions. A locality-preserving operation is one that maps local operators to local operators: for example, a constant-depth quantum circuit of geometrically local gates, or evolution for a constant time governed by a geometrically local bounded-strength Hamiltonian. Locality-preserving logical gates of topological codes are intrinsically fault tolerant because spatially localized errors remain localized, and hence sufficiently dilute errors remain correctable. By invoking general properties of two-dimensional topological field theories, we find that the locality-preserving logical gates are severely limited for codes which admit non-abelian anyons; in particular, there are no locality-preserving logical gates on the torus or the sphere with M punctures if the braiding of anyons is computationally universal. Furthermore, for Ising anyons on the M-punctured sphere, locality-preserving gates must be elements of the logical Pauli group. We derive these results by relating logical gates of a topological code to automorphisms of the Verlinde algebra of the corresponding anyon model, and by requiring the logical gates to be compatible with basis changes in the logical Hilbert space arising from local F-moves and the mapping class group. Comment: 50 pages, many figures; v3: updated to match published version.
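    For orientation, the Ising anyon model mentioned above has a particularly small Verlinde (fusion) algebra. Its fusion rules are a standard fact about the model rather than something derived in the paper:

    $$ 1 \times a = a, \qquad \psi \times \psi = 1, \qquad \sigma \times \psi = \sigma, \qquad \sigma \times \sigma = 1 + \psi . $$

    A locality-preserving logical gate must induce an automorphism of this algebra that is also compatible with the F-moves and the mapping-class-group action on the fusion basis, and it is this compatibility requirement that pins the allowed gates on the M-punctured sphere down to the logical Pauli group.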

    Local exclusion and Lieb-Thirring inequalities for intermediate and fractional statistics

    In one and two spatial dimensions there is a logical possibility for identical quantum particles different from bosons and fermions, obeying intermediate or fractional (anyon) statistics. We consider applications of a recent Lieb-Thirring inequality for anyons in two dimensions, and derive new Lieb-Thirring inequalities for intermediate statistics in one dimension with implications for models of Lieb-Liniger and Calogero-Sutherland type. These inequalities follow from a local form of the exclusion principle valid for such generalized exchange statistics. Comment: Revised and accepted version. 49 pages, 2 figures.
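    As a point of reference (this is the textbook statement for fermions, not the paper's refinement), the kinetic-energy form of the Lieb-Thirring inequality for N particles in d dimensions reads

    $$ \sum_{j=1}^{N} \int_{\mathbb{R}^{dN}} |\nabla_j \Psi|^2 \, dx \;\ge\; C_d \int_{\mathbb{R}^d} \rho_\Psi(x)^{1+2/d} \, dx , $$

    valid for antisymmetric wave functions, where \rho_\Psi is the one-particle density and C_d > 0 does not depend on N. The inequalities discussed above replace the antisymmetry assumption with intermediate or anyonic exchange statistics; the precise statistics-dependent constants are the content of the paper.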

    Fault-Tolerant Quantum Computation with Local Gates

    I discuss how to perform fault-tolerant quantum computation with concatenated codes using local gates in small numbers of dimensions. I show that a threshold result still exists in three, two, or one dimensions when next-to-nearest-neighbor gates are available, and present explicit constructions. In two or three dimensions, I also show how nearest-neighbor gates can give a threshold result. In all cases, I simply demonstrate that a threshold exists, and do not attempt to optimize the error correction circuit or determine the exact value of the threshold. The additional overhead due to fault tolerance in both space and time is polylogarithmic in the error rate per logical gate. Comment: 14 pages, LaTeX.
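    The basic cost of restricting to local gates can be illustrated with the generic SWAP-routing idea (a standard technique, not the paper's explicit construction): a two-qubit gate between distant qubits on a line is emulated by nearest-neighbor SWAPs, at a depth overhead linear in the distance.

```python
# Minimal sketch, assuming a toy circuit represented as a list of gate tuples.
# It shows the generic SWAP-routing trick for nearest-neighbor architectures,
# not the paper's fault-tolerant construction.

def swap_route(circuit, src, dst):
    """Append nearest-neighbor SWAPs moving the qubit at `src` next to `dst`."""
    step = 1 if dst > src else -1
    pos, path = src, []
    while abs(dst - pos) > 1:
        circuit.append(("SWAP", pos, pos + step))
        path.append((pos, pos + step))
        pos += step
    return pos, path

def nonlocal_cnot(circuit, control, target):
    """Emulate CNOT(control, target) using only nearest-neighbor gates."""
    pos, path = swap_route(circuit, control, target)
    circuit.append(("CNOT", pos, target))
    for a, b in reversed(path):      # undo the routing so qubits return home
        circuit.append(("SWAP", a, b))

circuit = []
nonlocal_cnot(circuit, control=0, target=4)
print(circuit)   # 3 SWAPs, the CNOT, then 3 SWAPs back: depth grows with distance
```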

    Entanglement purification for Quantum Computation

    We show that thresholds for fault-tolerant quantum computation are solely determined by the quality of single-system operations if one allows for d-dimensional systems with 8 ≤ d ≤ 32. Each system serves to store one logical qubit, and additional auxiliary dimensions are used to create and purify entanglement between systems. Physical, possibly probabilistic two-system operations with error rates as high as 2/3 can still be tolerated while realizing deterministic, high-quality two-qubit gates on the logical qubits. The achievable error rate is of the same order of magnitude as that of the single-system operations. We investigate possible implementations of our scheme for several physical set-ups. Comment: 4 pages, 1 figure; V2: references added.
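    For readers unfamiliar with recurrence-style purification, the sketch below iterates the textbook BBPSSW map for the fidelity of Werner pairs. It only illustrates how noisy entanglement can be distilled before being consumed in a gate; it is not the d-dimensional scheme of the paper.

```python
# Minimal sketch: the standard BBPSSW recurrence for Werner-state fidelity.
# Any initial fidelity F > 1/2 is driven upward round by round (at the cost of
# sacrificing one pair per round plus the failed post-selections).

def bbpssw_step(F):
    """Fidelity after one purification round on two Werner pairs of fidelity F."""
    num = F**2 + ((1 - F) / 3) ** 2
    den = F**2 + 2 * F * (1 - F) / 3 + 5 * ((1 - F) / 3) ** 2
    return num / den

F = 0.60
for r in range(5):
    F = bbpssw_step(F)
    print(f"round {r + 1}: fidelity = {F:.4f}")
```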

    IMPROVING PERCEPTUAL DIMENSION OF KNOWLEDGE QUALITY BY AUDIT TECHNIQUES

    This paper presents the problems linked to the knowledge quality concept, taking into account the logical, structural and perceptual dimensions of knowledge quality. The logical dimension is based on the quality of data and software applications and can be improved by audits of technical and computerized environment controls. The structural dimension is discussed in connection with modularity, the database object model and redundancy checks. To improve the perceptual dimension of knowledge quality we analyze the possibility of using performance audit techniques. In this way, managers can be offered assurance that data and knowledge have been properly evaluated, in accordance with clear hypotheses and operational risks, and with no missing analytical data. Two indicators, GPS (Quantitative Precision of the Supplier) and TSD (Total Stock Duration), are presented as examples of how the perceptual dimension can be improved by the performance audit. Keywords: Knowledge Quality, Quality Dimensions, Perceptual Dimension, IT Audit

    Cohomology in Grothendieck Topologies and Lower Bounds in Boolean Complexity

    This paper is motivated by questions such as P vs. NP and other questions in Boolean complexity theory. We describe an approach to attacking such questions with cohomology, and we show that using Grothendieck topologies and other ideas from the Grothendieck school gives new hope for such an attack. We focus on circuit depth complexity, and consider only finite topological spaces or Grothendieck topologies based on finite categories; as such, we do not use algebraic geometry or manifolds. Given two sheaves on a Grothendieck topology, their "cohomological complexity" is the sum of the dimensions of their Ext groups. We seek to model the depth complexity of Boolean functions by the cohomological complexity of sheaves on a Grothendieck topology. We propose that the logical AND of two Boolean functions will have its corresponding cohomological complexity bounded in terms of those of the two functions using "virtual zero extensions." We propose that the logical negation of a function will have its corresponding cohomological complexity equal to that of the original function using duality theory. We explain these approaches and show that they are stable under pullbacks and base change. It is the subject of ongoing work to achieve AND and negation bounds simultaneously in a way that yields an interesting depth lower bound. Comment: 70 pages, abstract corrected and modified.
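    In symbols (the notation c is ours, not the paper's), the quantity described above for two sheaves on a Grothendieck topology is

    $$ c(\mathcal{F}, \mathcal{G}) \;=\; \sum_{i \ge 0} \dim \operatorname{Ext}^{i}(\mathcal{F}, \mathcal{G}) , $$

    and the proposed program is to attach sheaves to a Boolean function so that this quantity lower-bounds its circuit depth, with logical AND controlled via virtual zero extensions and negation via duality.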

    Multi-dimensional Type Theory: Rules, Categories, and Combinators for Syntax and Semantics

    We investigate the possibility of modelling the syntax and semantics of natural language by constraints, or rules, imposed by the multi-dimensional type theory Nabla. The only multiplicity we explicitly consider is two, namely one dimension for the syntax and one dimension for the semantics, but the general perspective is important. For example, issues of pragmatics could be handled as additional dimensions. One of the main problems addressed is the rather complicated repertoire of operations that exists besides the notion of categories in traditional Montague grammar. For the syntax we use a categorial grammar along the lines of Lambek. For the semantics we use so-called lexical and logical combinators inspired by work in natural logic. Nabla provides a concise interpretation and a sequent calculus as the basis for implementations. Comment: 20 pages.
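    As a small illustration of the Lambek-style syntax dimension (the lexicon here is hypothetical, not taken from the paper), a transitive sentence type-checks by directed function application:

    $$ \textit{Alice} : np, \qquad \textit{sees} : (np \backslash s)/np, \qquad \textit{Bob} : np $$
    $$ \textit{sees Bob} : np \backslash s \quad (\text{right application}), \qquad \textit{Alice sees Bob} : s \quad (\text{left application}). $$

    In Nabla, the semantic dimension would carry a parallel combinator term alongside each such syntactic step.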

    Leveling the Field: Talking Levels in Cognitive Science

    Talk of levels is everywhere in cognitive science. Whether it is in terms of adjudicating longstanding debates or motivating foundational concepts, one cannot go far without hearing about the need to talk at different ‘levels’. Yet in spite of its widespread application and use, the concept of levels has received little sustained attention within cognitive science. This paper provides an analysis of the various ways the notion of levels has been deployed within cognitive science. The paper begins by introducing and motivating discussion via four representative accounts of levels. It then turns to outlining and relating the four accounts using two dimensions of comparison. The result is the creation of a conceptual framework that maps the logical space of levels talk, which offers an important step toward making sense of levels talk within cognitive science.