12 research outputs found

    Job Management and Task Bundling

    High Performance Computing is often performed on scarce and shared computing resources. To ensure computers are used to their full capacity, administrators often incentivize large workloads that are not possible on smaller systems. Measurements in Lattice QCD frequently do not scale to machine-size workloads. By bundling tasks together we can create large jobs suitable for these gigantic partitions. We discuss METAQ and mpi_jm, software developed to dynamically group computational tasks, which can intelligently backfill to consume idle time without substantial changes to users' current workflows or executables.
    Comment: 8 pages, 3 figures, LATTICE 2017 proceedings
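    The core bundling idea can be sketched in a few lines: pack many independent tasks into one large allocation and backfill smaller tasks into nodes that would otherwise sit idle. The snippet below is a minimal, hypothetical illustration (the Task class, the task names, and the first-fit policy are all assumptions made for exposition); the actual METAQ and mpi_jm tools are considerably more capable.

```python
# Minimal sketch of task bundling with greedy first-fit backfill.
# This is NOT the METAQ/mpi_jm implementation; it only illustrates the idea
# of filling a large allocation with many small, independent tasks.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    nodes: int  # number of nodes this task requires

def bundle(tasks, total_nodes):
    """Pick a set of tasks that fills up to total_nodes, largest first."""
    chosen, free = [], total_nodes
    for task in sorted(tasks, key=lambda t: t.nodes, reverse=True):
        if task.nodes <= free:   # backfill: task fits in the idle remainder
            chosen.append(task)
            free -= task.nodes
    return chosen, free

# Hypothetical lattice-QCD-style workload bundled into a 96-node job.
jobs, idle = bundle([Task("hmc", 64), Task("props", 32), Task("contract", 8)], 96)
print([t.name for t in jobs], "idle nodes:", idle)  # ['hmc', 'props'] idle nodes: 0
```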

    Gate Based Implementation of the Laplacian with BRGC Code for Universal Quantum Computers

    We study the gate-based implementation of the binary reflected Gray code (BRGC) and binary code of the unitary time evolution operator due to the Laplacian discretized on a lattice with periodic boundary conditions. We find that the resulting Trotter error is independent of system size for a fixed lattice spacing through the Baker-Campbell-Hausdorff formula. We then present our algorithm for building the BRGC quantum circuit. For an adiabatic evolution time $t$ with this circuit and spectral norm error $\epsilon$, we find the circuit cost (number of gates) and depth required are $\mathcal{O}(t^2 n A D / \epsilon)$ with $n-3$ auxiliary qubits for a system with $2^n$ lattice points per dimension $D$ and particle number $A$; an improvement over binary position encoding, which requires an exponential number of $n$-local operators. Further, under the reasonable assumption that $[T,V]$ bounds $\Delta t$, with $T$ the kinetic energy and $V$ a non-trivial potential, the cost of the QFT (Quantum Fourier Transform) implementation of the Laplacian scales as $\mathcal{O}(n^2)$ with depth $\mathcal{O}(n)$, while BRGC scales as $\mathcal{O}(n)$, giving an advantage to the BRGC implementation.
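    The BRGC itself is easy to state: the $i$-th codeword is i ^ (i >> 1), so successive codewords differ in exactly one bit, the property the circuit construction exploits. A short illustrative snippet (Python, not the paper's quantum circuit) makes this concrete:

```python
# Illustrative only: enumerate the binary reflected Gray code (BRGC).
# Successive codewords differ in exactly one bit; plain binary position
# encoding lacks this property, which makes its shift operators costlier.
def brgc(n_bits):
    """Yield all 2**n_bits BRGC codewords as bit strings."""
    for i in range(1 << n_bits):
        yield format(i ^ (i >> 1), f"0{n_bits}b")  # standard BRGC formula

codes = list(brgc(3))
print(codes)  # ['000', '001', '011', '010', '110', '111', '101', '100']
# Verify the single-bit-flip property between neighbours (with wraparound):
assert all(bin(int(a, 2) ^ int(b, 2)).count("1") == 1
           for a, b in zip(codes, codes[1:] + codes[:1]))
```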

    Towards grounding nuclear physics in QCD

    Exascale computing could soon enable a predictive theory of nuclear structure and reactions rooted in the Standard Model, with quantifiable and systematically improvable uncertainties. Such a predictive theory will help exploit experiments that use nucleons and nuclei as laboratories for testing the Standard Model and its limitations. Examples include direct dark matter detection, neutrinoless double beta decay, and searches for permanent electric dipole moments of the neutron and atoms. It will also help connect QCD to the properties of cold neutron stars and hot supernova cores. We discuss how a quantitative bridge between QCD and the properties of nuclei and nuclear matter will require a synthesis of lattice QCD (especially as applied to two- and three-nucleon interactions), effective field theory, and ab initio methods for solving the nuclear many-body problem. While there are significant challenges that must be addressed in developing this triad of theoretical tools, the rapid advance of computing is accelerating progress. In particular, we focus this review on the anticipated advances from lattice QCD and how these advances will impact few-body effective theories of nuclear physics by providing critical input, such as constraints on unknown low-energy constants of the effective (field) theories. We also review particular challenges that must be overcome for the successful application of lattice QCD to low-energy nuclear physics. We describe progress in developing few-body effective (field) theories of nuclear physics, with an emphasis on HOBET, a non-relativistic effective theory of nuclear physics that is less common in the literature. We use the examples of neutrinoless double beta decay and the nuclear-matter equation of state to illustrate how the coupling of lattice QCD to effective theory might impact our understanding of symmetries and exotic astrophysical environments.
    Comment: v2: updated manuscript based upon community feedback and referee comments; substantially re-written section on the two-nucleon lattice QCD controversy. 53.5 pages plus a "few more" references. v1: contribution to "The tower of effective (field) theories and the emergence of nuclear phenomena"; 47 pages plus a "few" references
