
    Layered architecture for quantum computing

    We develop a layered quantum computer architecture, which is a systematic framework for tackling the individual challenges of developing a quantum computer while constructing a cohesive device design. We discuss many of the prominent techniques for implementing circuit-model quantum computing and introduce several new methods, with an emphasis on employing surface code quantum error correction. In doing so, we propose a new quantum computer architecture based on optical control of quantum dots. The timescales of physical hardware operations and logical, error-corrected quantum gates differ by several orders of magnitude. By dividing functionality into layers, we can design and analyze subsystems independently, demonstrating the value of our layered architectural approach. Using this concrete hardware platform, we provide resource analysis for executing fault-tolerant quantum algorithms for integer factoring and quantum simulation, finding that the quantum dot architecture we study could solve such problems on the timescale of days. Comment: 27 pages, 20 figures.

    River by Design: Essays on the Boise River, 1915-2015

    River by Design marks 100 years since the Boise River emerged as an engineering sensation with the dedication of Arrowrock Dam. Sequenced like a tour with stops in Boise, Garden City, Eagle, Caldwell, and Parma, these essays collectively search for the politics and cultural values that drive engineering design.

    Development of a Core Outcome Set for effectiveness trials aimed at optimising prescribing in older adults in care homes

    Background: Prescribing medicines for older adults in care homes is known to be sub-optimal. Whilst trials testing interventions to optimise prescribing in this setting have been published, heterogeneity in outcome reporting has hindered comparison of interventions, thus limiting evidence synthesis. The aim of this study was to develop a core outcome set (COS), a list of outcomes which should be measured and reported, as a minimum, for all effectiveness trials involving optimising prescribing in care homes. The COS was developed as part of the Care Homes Independent Pharmacist Prescribing Study (CHIPPS). Methods: A long-list of outcomes was identified through a review of published literature and stakeholder input. Outcomes were reviewed and refined prior to entering a two-round online Delphi exercise and then distributed via a web link to the CHIPPS Management Team, a multidisciplinary team including pharmacists, doctors and Patient Public Involvement representatives (amongst others), who comprised the Delphi panel. The Delphi panellists (n = 19) rated the importance of outcomes on a 9-point Likert scale from 1 (not important) to 9 (critically important). Consensus for an outcome being included in the COS was defined as ≥70% participants scoring 7–9 and <15% scoring 1–3. Exclusion was defined as ≥70% scoring 1–3 and <15% 7–9. Individual and group scores were fed back to participants alongside the second questionnaire round, which included outcomes for which no consensus had been achieved. Results: A long-list of 63 potential outcomes was identified. Refinement of this long-list of outcomes resulted in 29 outcomes, which were included in the Delphi questionnaire (round 1). 
Following both rounds of the Delphi exercise, 13 outcomes (organised into seven overarching domains: medication appropriateness, adverse drug events, prescribing errors, falls, quality of life, all-cause mortality and admissions to hospital (and associated costs)) met the criteria for inclusion in the final COS. Conclusions: We have developed a COS for effectiveness trials aimed at optimising prescribing in older adults in care homes using robust methodology. Widespread adoption of this COS will facilitate evidence synthesis between trials. Future work should focus on evaluating appropriate tools for these key outcomes to further reduce heterogeneity in outcome measurement in this context.
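The consensus rule described in the Methods section can be sketched as a short function. This is an illustrative reconstruction, not code from the study; the ratings below are hypothetical, while the thresholds (≥70% of panellists scoring 7–9 and <15% scoring 1–3 for inclusion, and the reverse for exclusion) come from the abstract.

```python
def classify_outcome(ratings):
    """Apply the Delphi consensus thresholds to a list of 1-9 Likert ratings."""
    n = len(ratings)
    frac_high = sum(1 for r in ratings if 7 <= r <= 9) / n  # "critically important"
    frac_low = sum(1 for r in ratings if 1 <= r <= 3) / n   # "not important"
    if frac_high >= 0.70 and frac_low < 0.15:
        return "include"
    if frac_low >= 0.70 and frac_high < 0.15:
        return "exclude"
    return "no consensus"  # carried forward to the next Delphi round

# Hypothetical panel of 10: 9 high ratings (90%), 1 low rating (10%)
print(classify_outcome([9, 8, 7, 9, 8, 7, 7, 9, 8, 2]))  # include
```

Outcomes falling in the "no consensus" band are the ones re-presented in round two alongside the group feedback.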

    Finishing the euchromatic sequence of the human genome

    The sequence of the human genome encodes the genetic instructions for human physiology, as well as rich information about human evolution. In 2001, the International Human Genome Sequencing Consortium reported a draft sequence of the euchromatic portion of the human genome. Since then, the international collaboration has worked to convert this draft into a genome sequence with high accuracy and nearly complete coverage. Here, we report the result of this finishing process. The current genome sequence (Build 35) contains 2.85 billion nucleotides interrupted by only 341 gaps. It covers ∼99% of the euchromatic genome and is accurate to an error rate of ∼1 event per 100,000 bases. Many of the remaining euchromatic gaps are associated with segmental duplications and will require focused work with new methods. The near-complete sequence, the first for a vertebrate, greatly improves the precision of biological analyses of the human genome including studies of gene number, birth and death. Notably, the human genome seems to encode only 20,000-25,000 protein-coding genes. The genome sequence reported here should serve as a firm foundation for biomedical research in the decades ahead.

    The Effect of Communication Costs in Solid-State Quantum Computing Architectures

    Quantum computation has become an intriguing technology with which to attack difficult problems and to enhance system security. Quantum algorithms, however, have been analyzed under idealized assumptions without important physical constraints in mind. In this paper, we analyze two key constraints: the short spatial distance of quantum interactions and the short temporal life of quantum data. In particular, quantum computations must make use of extremely robust error correction techniques to extend the life of quantum data. We present optimized spatial layouts of quantum error correction circuits for quantum bits embedded in silicon. We analyze the complexity of error correction under the constraint that interaction between these bits is near-neighbor and data must be propagated via swap operations from one part of the circuit to another. We discover two interesting results from our quantum layouts. First, the recursive nature of quantum error correction circuits requires an additional communication technique more powerful than near-neighbor swaps – too much error accumulates if we attempt to swap over long distances. We show that quantum teleportation can be used to implement recursive structures. We also show that the reliability of the quantum swap operation is the limiting factor in solid-state quantum computation.
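The abstract's claim that error accumulates over long swap chains can be illustrated with a back-of-the-envelope model. This is a simplification not taken from the paper: assume each swap fails independently with probability p, so a qubit moved n sites survives uncorrupted with probability roughly (1 - p)^n. The value of p below is hypothetical.

```python
def chain_success(p, n):
    """Probability that all n consecutive swaps succeed, assuming
    independent per-swap error probability p."""
    return (1.0 - p) ** n

# With a (hypothetical) 0.1% per-swap error rate, reliability decays
# quickly with distance, motivating teleportation for long-range moves.
for n in (10, 100, 1000):
    print(f"{n:4d} swaps: success probability {chain_success(1e-3, n):.3f}")
```

Under this toy model, a 1000-site swap chain succeeds only about 37% of the time, which is why the authors turn to quantum teleportation for the long-range communication that recursive error correction requires.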

    Synchroscalar: Initial Lessons in Power-Aware Design of a Tile-Based Embedded Architecture

    Embedded devices have hard performance targets and severe power and area constraints that depart significantly from our design intuitions derived from general-purpose microprocessor design. This paper describes our initial experiences in designing Synchroscalar, a tile-based embedded architecture targeted for multi-rate signal processing applications. We present a preliminary design of the Synchroscalar architecture and some design space exploration in the context of important signal processing kernels. In particular, we find that synchronous design and substantial global interconnect are desirable in the low-frequency, low-power domain. This global interconnect enables parallelization and reduces processor idle time, which are critical to energy-efficient implementations of high-bandwidth signal processing. Furthermore, statically scheduled communication and SIMD computation keep control overheads low and energy efficiency high.
