
    Quantum Computing and Nuclear Magnetic Resonance

    Quantum information processing is the use of inherently quantum mechanical phenomena to perform information processing tasks that cannot be achieved using conventional classical information technologies. One famous example is quantum computing, which would permit calculations to be performed that are beyond the reach of any conceivable conventional computer. Initially it appeared that actually building a quantum computer would be extremely difficult, but in the last few years there has been an explosion of interest in the use of techniques adapted from conventional liquid state nuclear magnetic resonance (NMR) experiments to build small quantum computers. After a brief introduction to quantum computing I will review the current state of the art, describe some of the topics of current interest, and assess the long term contribution of NMR studies to the eventual implementation of practical quantum computers capable of solving real computational problems.
    Comment: 8 pages PDF including 6 figures. Perspectives article commissioned by PhysChemComm.
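
    As a concrete illustration of the circuit model mentioned above (a sketch of ours, not the article's), the following short NumPy script simulates two qubits: a Hadamard gate followed by a controlled-NOT turns |00> into the entangled Bell state (|00> + |11>)/sqrt(2).

        # Minimal two-qubit state-vector simulation (illustrative sketch only).
        import numpy as np

        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
        I = np.eye(2)
        CNOT = np.array([[1, 0, 0, 0],                 # controlled-NOT, control
                         [0, 1, 0, 0],                 # on the first qubit
                         [0, 0, 0, 1],
                         [0, 0, 1, 0]])

        state = np.array([1, 0, 0, 0], dtype=complex)  # |00>
        state = np.kron(H, I) @ state                  # Hadamard on first qubit
        state = CNOT @ state                           # entangle the pair
        print(np.round(state, 3))                      # [0.707 0 0 0.707]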

    Parallel computing for the finite element method

    A finite element method is presented to compute time-harmonic microwave fields in three-dimensional configurations. Nodal-based finite elements have been coupled with an absorbing boundary condition to solve open boundary problems. This paper describes how the modeling of large devices has been made possible using parallel computation. New algorithms are then proposed to implement this formulation on a cluster of workstations (10 DEC ALPHA 300X) and on a CRAY C98. Analysis of the computational efficiency is performed using simple problems. The electromagnetic scattering of a plane wave by a perfectly conducting airplane is finally given as an example.
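
    To make the formulation concrete, here is a one-dimensional analogue (our sketch; the paper itself treats 3-D fields on parallel machines): linear nodal elements assembled for the time-harmonic Helmholtz equation, with a first-order absorbing boundary condition closing the open end. All parameters are illustrative.

        # 1-D analogue of nodal finite elements for u'' + k^2 u = 0 on [0, 1],
        # with a first-order absorbing boundary condition at x = 1.
        import numpy as np

        k = 2 * np.pi                            # wavenumber
        n = 200                                  # linear elements on [0, 1]
        h = 1.0 / n
        N = n + 1

        Ke = np.array([[1, -1], [-1, 1]]) / h    # element stiffness matrix
        Me = np.array([[2, 1], [1, 2]]) * h / 6  # element mass matrix

        A = np.zeros((N, N), dtype=complex)
        for e in range(n):                       # assemble element by element
            A[e:e + 2, e:e + 2] += Ke - k**2 * Me

        A[-1, -1] -= 1j * k                      # absorbing condition at x = 1
        b = np.zeros(N, dtype=complex)
        A[0, :] = 0.0; A[0, 0] = 1.0; b[0] = 1.0  # Dirichlet: u(0) = 1

        u = np.linalg.solve(A, b)
        x = np.linspace(0.0, 1.0, N)
        print(abs(u - np.exp(1j * k * x)).max())  # small: u ~ exp(ikx), outgoing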

    NMR Quantum Computation

    In this article I will describe how NMR techniques may be used to build simple quantum information processing devices, such as small quantum computers, and show how these techniques are related to more conventional NMR experiments.
    Comment: Pedagogical mini-review of NMR QC aimed at NMR folk. Commissioned by Progress in NMR Spectroscopy (in press). 30 pages RevTeX including 15 figures (4 low-quality PostScript images).
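
    To give a flavour of how NMR operations map onto quantum gates, the sketch below (ours, not from the article) models an idealized hard RF pulse on a single spin as the rotation exp(-i*theta*sigma/2); a 90-degree pulse about the y axis acts as the "pseudo-Hadamard" gate, taking |0> to an equal superposition.

        # An idealized hard RF pulse on one spin, modelled as a rotation.
        # Illustrative only: real NMR gates also involve rotating frames,
        # resonance offsets and J-couplings.
        import numpy as np
        from scipy.linalg import expm

        sx = np.array([[0, 1], [1, 0]], dtype=complex)   # Pauli matrices
        sy = np.array([[0, -1j], [1j, 0]])

        def pulse(theta, axis):
            """Unitary for a hard pulse of flip angle theta about the axis."""
            return expm(-1j * theta / 2 * axis)

        up = np.array([1, 0], dtype=complex)  # |0>: spin aligned with B0
        h = pulse(np.pi / 2, sy)              # 90-degree y pulse (pseudo-Hadamard)
        print(np.round(h @ up, 3))            # equal superposition of |0> and |1>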

    Algorithmic Cooling of Spins: A Practicable Method for Increasing Polarization

    An efficient technique to generate ensembles of spins that are highly polarized by external magnetic fields is the Holy Grail in Nuclear Magnetic Resonance (NMR) spectroscopy. Since spin-half nuclei have steady-state polarization biases that increase inversely with temperature, spins exhibiting high polarization biases are considered cool, even when their environment is warm. Existing spin-cooling techniques are highly limited in their efficiency and usefulness. Algorithmic cooling is a promising new spin-cooling approach that employs data compression methods in open systems. It reduces the entropy of spins on long molecules to a point far beyond Shannon's bound on reversible entropy manipulations (an information-theoretic version of the 2nd Law of Thermodynamics), thus increasing their polarization. Here we present an efficient and experimentally feasible algorithmic cooling technique that cools spins to very low temperatures even on short molecules. This practicable algorithmic cooling could lead to breakthroughs in high-sensitivity NMR spectroscopy in the near future, and to the development of scalable NMR quantum computers in the far future. Moreover, while the cooling algorithm itself is classical, it uses quantum gates in its implementation, thus representing the first short-term application of quantum computing devices.
    Comment: 24 pages (with annexes), 3 figures (PS). This version contains no major content changes: fixed bibliography and figures, modified acknowledgements.
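
    The flavour of the compression step can be seen in a classical toy model (ours; the paper's scheme additionally exploits spin relaxation to dump entropy into the environment): a majority vote over three spins of bias eps leaves one spin with bias (3*eps - eps^3)/2, roughly a 1.5x gain for small eps.

        # Classical toy model of a 3-spin compression step (our sketch, not
        # the paper's full algorithm).
        from itertools import product

        def cooled_bias(eps):
            p0 = (1 + eps) / 2                    # P(spin reads 0, i.e. "up")
            bias = 0.0
            for bits in product([0, 1], repeat=3):
                p = 1.0
                for b in bits:
                    p *= p0 if b == 0 else 1 - p0  # probability of this outcome
                maj = 1 if sum(bits) >= 2 else 0   # majority value of the spins
                bias += p if maj == 0 else -p      # accumulates 2*P(maj=0) - 1
            return bias

        eps = 0.01
        print(cooled_bias(eps), (3 * eps - eps**3) / 2)   # both ~0.015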

    Simulation modelling and visualisation: toolkits for building artificial worlds

    Simulation users at all levels make heavy use of compute resources to drive computational simulations across widely varying application areas, using different simulation paradigms. Simulations are implemented in many software forms, ranging from highly standardised, general models that run in proprietary software packages to ad hoc hand-crafted simulation codes for very specific applications. Visualisation of the workings or results of a simulation is another highly valuable capability for simulation developers and practitioners. There are many different software libraries and methods available for creating a visualisation layer for simulations, and it is often a difficult and time-consuming process to assemble a toolkit of these libraries and other resources that best suits a particular simulation model. We present here a break-down of the main simulation paradigms, and discuss the differing toolkits and approaches that researchers have taken to tackle coupled simulation and visualisation in each paradigm.
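
    A minimal instance of the coupled simulation-and-visualisation pattern discussed here, using NumPy and Matplotlib (our toolkit choice, offered only as an example, not one the paper endorses): a toy 1-D diffusion simulation whose state is plotted at regular intervals.

        # Toy 1-D diffusion simulation with a Matplotlib visualisation layer.
        import numpy as np
        import matplotlib.pyplot as plt

        def step(u, alpha=0.25):
            """One explicit finite-difference diffusion update, fixed ends."""
            v = u.copy()
            v[1:-1] += alpha * (u[2:] - 2 * u[1:-1] + u[:-2])
            return v

        u = np.zeros(101)
        u[50] = 1.0                      # initial heat spike in the middle
        for frame in range(5):           # render every 200 simulation steps
            for _ in range(200):
                u = step(u)
            plt.plot(u, label=f"after {(frame + 1) * 200} steps")

        plt.xlabel("cell"); plt.ylabel("temperature"); plt.legend()
        plt.show()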

    Validation of Ultrahigh Dependability for Software-Based Systems

    Modern society depends on computers for a number of critical tasks in which failure can have very high costs. As a consequence, high levels of dependability (reliability, safety, etc.) are required from such computers, including their software. Whenever a quantitative approach to risk is adopted, these requirements must be stated in quantitative terms, and a rigorous demonstration of their being attained is necessary. For software used in the most critical roles, such demonstrations are not usually supplied. The fact is that the dependability requirements often lie near the limit of the current state of the art, or beyond, in terms not only of the ability to satisfy them, but also, and more often, of the ability to demonstrate that they are satisfied in the individual operational products (validation). We discuss reasons why such demonstrations cannot usually be provided with the means available: reliability growth models, testing with stable reliability, structural dependability modelling, as well as more informal arguments based on good engineering practice. We state some rigorous arguments about the limits of what can be validated with each of such means. Combining evidence from these different sources would seem to raise the levels that can be validated; yet this improvement is not such as to solve the problem. It appears that engineering practice must take into account the fact that no solution exists, at present, for the validation of ultra-high dependability in systems relying on complex software
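
    The quantitative core of the "testing with stable reliability" argument can be sketched in a few lines (our numbers, chosen to match the ultra-high dependability regime the paper discusses): under a Poisson failure model, t failure-free test hours support a failure-rate bound of ln(1/alpha)/t at confidence 1 - alpha, so demonstrating 10^-9 failures per hour would need on the order of 10^9 hours of testing.

        # Sketch of the testing-time limit (our numbers, not the paper's):
        # t failure-free hours rule out failure rates above ln(1/alpha)/t
        # at confidence 1 - alpha, under a Poisson failure model.
        import math

        def hours_required(target_rate, confidence=0.99):
            """Failure-free test hours needed to claim rate <= target_rate."""
            alpha = 1 - confidence
            return math.log(1 / alpha) / target_rate

        for rate in (1e-4, 1e-7, 1e-9):
            print(f"{rate:.0e}/h -> {hours_required(rate):.2e} failure-free hours")
        # 1e-9/h needs ~4.6e9 hours (over 500,000 years): testing alone
        # cannot validate ultra-high dependability.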

    What is a quantum computer, and how do we build one?

    The DiVincenzo criteria for implementing a quantum computer have been seminal in focussing both experimental and theoretical research in quantum information processing. These criteria were formulated specifically for the circuit model of quantum computing. However, several new models for quantum computing (paradigms) have been proposed that do not seem to fit the criteria well. This raises the question of what the general criteria for implementing quantum computers are. To this end, a formal operational definition of a quantum computer is introduced. It is then shown that, according to this definition, a device is a quantum computer if it obeys the following four criteria: any quantum computer must (1) have a quantum memory; (2) facilitate a controlled quantum evolution of the quantum memory; (3) include a method for cooling the quantum memory; and (4) provide a readout mechanism for subsets of the quantum memory. The criteria are met when the device is scalable and operates fault-tolerantly. We discuss various existing quantum computing paradigms and how they fit within this framework. Finally, we lay out a roadmap for selecting an avenue towards building a quantum computer. This is summarized in a decision tree intended to help experimentalists determine the most natural paradigm given a particular physical implementation.