12 research outputs found

    The Future Possibility of Consumer-Grade Quantum Computers

    Quantum computers are evolving rapidly and are on the verge of becoming useful for the first time. The theoretical limits of quantum computation would put even small-scale quantum computers well ahead of any classical computer on certain problems. With more researchers attempting to build their own, it has become a race to see who can create the first truly useful quantum computer. Once such computers become both useful and prevalent, massive advances across many fields of science could follow, leading to a scientific revolution. Advances in quantum computing lead some researchers and consumers to ask whether the technology can ever be adapted for a wide commercial market. Based on press releases, news sources, and recent research papers, this review surveys the possibility that the average consumer will be able to use quantum computers for the first time. Information about upcoming projects, along with some conjecture, suggests that through cloud technology the average consumer may be able to use a quantum computer by 2030.

    Characterizing the spatio-temporal qubit traffic of a quantum intranet aiming at modular quantum computer architectures

    Quantum many-core processors are envisioned as the ultimate solution for the scalability of quantum computers. Based upon Noisy Intermediate-Scale Quantum (NISQ) chips interconnected in a sort of quantum intranet, they enable large algorithms to be executed on current and near-future technology. In order to optimize such architectures, it is crucial to develop tools that allow specific design-space explorations. To this aim, in this paper we present a technique to perform a spatio-temporal characterization of quantum circuits running on multi-chip quantum computers. Specifically, we focus on the analysis of the qubit traffic resulting from operations that involve qubits residing in different cores, and hence quantum communication across chips, while also giving importance to the amount of intra-core operations that occur in between those communications. Using specific multi-core performance metrics and a complete set of benchmarks, our analysis showcases the opportunities that the proposed approach may provide to guide the design of multi-core quantum computers and their interconnects.
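The kind of spatio-temporal traffic analysis described above can be sketched in a few lines: given a two-qubit gate list and a fixed qubit-to-core mapping (both hypothetical here, not the authors' tool or benchmarks), classify each gate as intra-core or inter-core and record when the inter-core communications occur.

```python
# Minimal sketch: classify two-qubit gates as intra- vs inter-core traffic,
# assuming a static qubit-to-core mapping (an illustrative simplification).

def traffic_profile(circuit, core_of):
    """circuit: list of (timestep, q0, q1) two-qubit gates.
    core_of: dict mapping qubit index -> core index."""
    inter, intra = [], []
    for t, q0, q1 in circuit:
        (inter if core_of[q0] != core_of[q1] else intra).append(t)
    return {"inter_core_ops": len(inter),
            "intra_core_ops": len(intra),
            "inter_core_timesteps": sorted(set(inter))}

# Toy example: 4 qubits split across 2 cores.
core_of = {0: 0, 1: 0, 2: 1, 3: 1}
circuit = [(0, 0, 1), (1, 1, 2), (2, 2, 3), (3, 0, 3)]
print(traffic_profile(circuit, core_of))
```

A real design-space exploration would additionally track which core pairs communicate and how many intra-core gates separate consecutive communications, but the classification step is the same.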

    High-fidelity spin qubit operation and algorithmic initialization above 1 K

    The encoding of qubits in semiconductor spin carriers has been recognized as a promising approach to a commercial quantum computer that can be lithographically produced and integrated at scale. However, the operation of the large number of qubits required for advantageous quantum applications will produce a thermal load exceeding the available cooling power of cryostats at millikelvin temperatures. As the scale-up accelerates, it becomes imperative to establish fault-tolerant operation above 1 K, at which the cooling power is orders of magnitude higher. Here we tune up and operate spin qubits in silicon above 1 K, with fidelities in the range required for fault-tolerant operation at these temperatures. We design an algorithmic initialization protocol to prepare a pure two-qubit state even when the thermal energy is substantially above the qubit energies, and incorporate radiofrequency readout to achieve fidelities up to 99.34% for both readout and initialization. We also demonstrate single-qubit Clifford gate fidelities up to 99.85% and a two-qubit gate fidelity of 98.92%. These advances overcome the fundamental limitation that the thermal energy must be well below the qubit energies for high-fidelity operation to be possible, surmounting a main obstacle on the pathway to scalable and fault-tolerant quantum computation.

    The roadmap of computer engineering at the end of Moore's law and Dennard scaling

    This paper reviews the state of computer engineering at the beginning of the 2020s in order to outline some of the changes that should be made in higher education in this discipline. It considers the great relevance of controlling energy consumption and of classification and optimization applications that require huge amounts of data (big data) and response times difficult to achieve with traditional computer engineering techniques, given the slowing of the improvement rate set by Moore's law and the end of Dennard scaling. The article provides recent bibliographical references on the state of computer engineering, and identifies the new requirements of the interfaces present in the layered hierarchy of computing systems, mainly those related to security, energy consumption, and the exploitation of heterogeneous parallelism. It also reflects on the theoretical limits that can be established for computation and on the expectations that quantum computing offers.

    Quantum state characterization with deep neural networks

    In this licentiate thesis, I explain some of the interdisciplinary topics connecting machine learning to quantum physics. The thesis is based on the two appended papers, where deep neural networks were used for the characterization of quantum systems. I discuss the connections between parameter estimation, inverse problems and machine learning to put the results of the appended papers in perspective. In these papers, we have shown how to incorporate prior knowledge of quantum physics and noise models in generative adversarial neural networks. This thesis further discusses how automatic differentiation techniques allow training such custom neural-network-based methods to characterize quantum systems or learn their description. In the appended papers, we have demonstrated that the neural-network approach could learn a quantum state description from an order of magnitude fewer data points, and faster, than an iterative maximum-likelihood estimation technique. The goal of the thesis is to bring such tools and techniques from machine learning to the physicist's arsenal and to explore the intersection between quantum physics and machine learning.
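The iterative maximum-likelihood estimation used as the baseline in the abstract above can be sketched for a single qubit: repeatedly apply the update rho -> R(rho) rho R(rho), where R weights measurement projectors by observed-over-predicted frequencies. The projector set and true state below are illustrative choices, not data from the appended papers.

```python
import numpy as np

# Single-qubit iterative maximum-likelihood tomography (the classical
# baseline the neural-network methods are compared against).

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.diag([1.0, -1.0]).astype(complex)

# Projectors onto the +/- eigenstates of X, Y and Z (six outcomes).
projectors = [(I2 + s * P) / 2 for P in (X, Y, Z) for s in (+1, -1)]

def iterative_mle(freqs, iters=200):
    rho = I2 / 2                      # start from the maximally mixed state
    for _ in range(iters):
        probs = [np.real(np.trace(Pi @ rho)) for Pi in projectors]
        R = sum(f / max(p, 1e-12) * Pi
                for f, p, Pi in zip(freqs, probs, projectors))
        rho = R @ rho @ R
        rho /= np.trace(rho)          # renormalise to unit trace
    return rho

# Synthetic data: exact outcome frequencies of the |+> state (1/3 per basis).
plus = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)
freqs = [np.real(np.trace(Pi @ plus)) / 3 for Pi in projectors]
rho_hat = iterative_mle(freqs)
print(np.round(rho_hat.real, 3))  # converges towards the |+> density matrix
```

With noiseless frequencies the fixed point is the true state; the thesis's point is that neural-network estimators reach a comparable description from far fewer samples.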

    Quantum computing challenges in the software industry. A fuzzy AHP-based approach

    Context: The current technology revolution has posed unexpected challenges for the software industry. In recent years, the field of quantum computing (QC) technologies has continued to grow in influence and maturity, and it is now poised to revolutionise software engineering. However, the evaluation and prioritisation of QC challenges in the software industry remain unexplored, relatively under-identified and fragmented. Objective: The purpose of this study is to identify, examine and prioritise the most critical QC challenges in the software industry by implementing a fuzzy analytic hierarchy process (F-AHP). Method: First, to identify the key challenges, we conducted a systematic literature review by drawing data from the four relevant digital libraries and supplementing these efforts with a forward and backward snowballing search. Second, we followed the F-AHP approach to evaluate and rank the identified challenges, or barriers. Results: The results show that the key barriers to QC adoption are the lack of technical expertise, information accuracy and organisational interest in adopting the new process. Another critical barrier is the lack of standards for secure communication techniques for implementing QC. Conclusion: By applying F-AHP, we identified institutional barriers as the highest and organisational barriers as the second-highest globally weighted categories among the main QC challenges facing the software industry. We observed that the highest-ranked local barrier facing the software technology industry is the lack of resources for design and initiative, while the lack of organisational interest in adopting the new process is the most significant organisational barrier. Our findings, which entail implications for both academicians and practitioners, reveal the emergent nature of QC research and the increasing need for interdisciplinary research to address the identified challenges.
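The F-AHP ranking step described above can be sketched with Buckley's geometric-mean method: pairwise comparisons are triangular fuzzy numbers (l, m, u), fuzzy weights come from row geometric means, and a centre-of-area defuzzification yields crisp priorities. The three barrier categories and comparison values below are illustrative, not the study's survey data.

```python
import math

# Sketch of Buckley's geometric-mean F-AHP weighting. Each pairwise
# judgement is a triangular fuzzy number (l, m, u).

def fuzzy_geo_mean(row):
    n = len(row)
    return tuple(math.prod(x[i] for x in row) ** (1 / n) for i in range(3))

def fahp_weights(matrix):
    r = [fuzzy_geo_mean(row) for row in matrix]           # fuzzy synthetic values
    total = tuple(sum(ri[i] for ri in r) for i in range(3))
    # Fuzzy weight r_i * (sum r)^-1, i.e. (l/U, m/M, u/L).
    fuzzy_w = [(ri[0] / total[2], ri[1] / total[1], ri[2] / total[0]) for ri in r]
    crisp = [sum(w) / 3 for w in fuzzy_w]                 # centre-of-area defuzzify
    s = sum(crisp)
    return [c / s for c in crisp]                         # normalised priorities

# Hypothetical comparisons among {institutional, organisational, technical}.
M = [
    [(1, 1, 1),         (1, 2, 3),       (2, 3, 4)],
    [(1/3, 1/2, 1),     (1, 1, 1),       (1, 2, 3)],
    [(1/4, 1/3, 1/2),   (1/3, 1/2, 1),   (1, 1, 1)],
]
w = fahp_weights(M)
print([round(x, 3) for x in w])  # institutional category ranked first
```

With these toy judgements the institutional category receives the largest weight, mirroring the ranking reported in the abstract, but the numbers themselves are fabricated for illustration only.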

    Modern Approaches to Topological Quantum Error Correction

    The construction of a large-scale fault-tolerant quantum computer is an outstanding scientific and technological goal. It holds the promise of allowing us to solve a variety of complex problems such as factoring large numbers, fast database search, and the quantum simulation of many-body quantum systems in fields as diverse as condensed matter, quantum chemistry, and even high-energy physics. Sophisticated theoretical protocols have been developed for reliable quantum information processing under imperfect conditions, when errors affect and corrupt the fragile quantum states during storage and computation. Arguably, the most realistic and promising approach towards practical fault-tolerant quantum computation is topological quantum error-correcting codes, where quantum information is stored in interacting, topologically ordered 2D or 3D many-body quantum systems. This approach offers the highest known error thresholds, which are already today within reach of the experimental accuracy in state-of-the-art setups. A combination of theoretical and experimental research is needed to store, protect and process fragile quantum information in logical qubits effectively, so that they can outperform their constituent physical qubits. Whereas small-scale quantum error-correcting codes have been implemented, one of the main theoretical challenges remains to develop new, and improve existing, efficient strategies (so-called decoders) to derive (near-)optimal error-correction operations in the presence of experimentally accessible measurement information and realistic noise sources. One main focus of this project is the development and numerical implementation of scalable, efficient decoders to operate topological color codes. Additionally, we study the feasibility of implementing quantum error-correcting codes fault-tolerantly in near-term ion traps. To this end, we use realistic modelling of the different noise sources, computer simulations, and state-of-the-art quantum information approaches to quantum circuitry and noise-suppression techniques.
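The decoders studied in this project target topological color codes; as a far simpler illustration of the same decoding step (measured syndrome mapped to a correction), below is a lookup-table decoder for the three-qubit bit-flip repetition code. This toy code is not topological and is not the project's decoder, only the minimal instance of syndrome decoding.

```python
# Lookup-table decoder for the three-qubit bit-flip repetition code:
# codewords are [0,0,0] and [1,1,1]; two parity checks locate any
# single bit-flip error.

def syndrome(bits):
    """Parity checks Z1Z2 and Z2Z3 on the three data bits."""
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

# Each syndrome points to the most likely (single-qubit) error location.
CORRECTION = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}

def decode(bits):
    flip = CORRECTION[syndrome(bits)]
    if flip is not None:
        bits = bits.copy()
        bits[flip] ^= 1
    return bits

# Any single bit flip is corrected back to the nearest codeword.
assert decode([0, 1, 0]) == [0, 0, 0]
assert decode([1, 1, 0]) == [1, 1, 1]
print("all single-qubit errors corrected")
```

Topological codes replace this four-entry lookup table with a lattice of local checks, which is why efficient decoders (such as the color-code decoders developed in this project) become a research problem in their own right.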