Job Management and Task Bundling
High Performance Computing is often performed on scarce and shared computing
resources. To ensure computers are used to their full capacity, administrators
often incentivize large workloads that are not possible on smaller systems.
Measurements in Lattice QCD frequently do not scale to machine-size workloads.
By bundling tasks together we can create large jobs suitable for gigantic
partitions. We discuss METAQ and mpi_jm, software developed to dynamically
group computational tasks together, that can intelligently backfill to consume
idle time without substantial changes to users' current workflows or
executables.
Comment: 8 pages, 3 figures, LATTICE 2017 proceedings
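The bundling idea described in the abstract can be illustrated with a small sketch. This is not METAQ or mpi_jm code; the function, task names, and node counts are illustrative assumptions showing how small tasks can greedily backfill a large allocation.

```python
# Hypothetical sketch of task bundling: pack many small tasks into one
# large job allocation, then greedily backfill leftover nodes with
# whatever tasks still fit. Names and numbers are illustrative only.

def bundle_tasks(tasks, nodes_available):
    """Greedily select (name, nodes_needed) tasks to fill an allocation.

    Sorting largest-first and then backfilling smaller tasks into the
    remaining nodes mimics how a bundler consumes idle capacity.
    """
    bundle, free = [], nodes_available
    for name, need in sorted(tasks, key=lambda t: -t[1]):
        if need <= free:          # task fits in the currently idle nodes
            bundle.append(name)
            free -= need
    return bundle, free

tasks = [("hmc_stream", 64), ("prop_a", 16), ("prop_b", 16), ("contract", 4)]
bundle, idle = bundle_tasks(tasks, nodes_available=96)
```

A real bundler must also respect wall-clock limits and task dependencies; this sketch only captures the node-packing aspect.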
Gate Based Implementation of the Laplacian with BRGC Code for Universal Quantum Computers
We study the gate-based implementation of the binary reflected Gray code
(BRGC) and binary code of the unitary time evolution operator due to the
Laplacian discretized on a lattice with periodic boundary conditions. We find
that the resulting Trotter error is independent of system size for a fixed
lattice spacing through the Baker-Campbell-Hausdorff formula. We then present
our algorithm for building the BRGC quantum circuit. For an adiabatic evolution
time $t$ with this circuit and spectral norm error $\epsilon$, we find the
circuit cost (number of gates) and depth required are $\mathcal{O}(t^2 n A D/\epsilon)$
with auxiliary qubits, for a system in $D$ dimensions with $2^n$ lattice points
per dimension and particle number $A$; an improvement over binary position
encoding, which requires an exponential number of $n$-local operators. Further,
under a reasonable boundedness assumption relating the kinetic energy $T$ and a
non-trivial potential $V$, the cost of a QFT (Quantum Fourier Transform)
implementation of the Laplacian scales as $\mathcal{O}(n^2)$ with depth
$\mathcal{O}(n)$, while BRGC scales as $\mathcal{O}(n)$, giving the advantage
to the BRGC implementation.
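The defining property of the binary reflected Gray code used above — successive code words differ in exactly one bit, including across the wrap-around, which matches periodic boundary conditions — can be checked with a short sketch (this is my illustration, not the paper's circuit construction):

```python
# Minimal sketch: generate the binary reflected Gray code (BRGC) and
# verify that successive code words differ in exactly one bit, so
# neighbouring lattice sites are a single bit-flip apart.

def brgc(n):
    """Return the n-bit binary reflected Gray code as integers."""
    return [i ^ (i >> 1) for i in range(2 ** n)]

def hamming(a, b):
    """Number of bit positions in which a and b differ."""
    return bin(a ^ b).count("1")

codes = brgc(3)  # 2^3 = 8 lattice points per dimension
steps = [hamming(codes[i], codes[i + 1]) for i in range(len(codes) - 1)]
wrap = hamming(codes[-1], codes[0])  # periodic boundary: last -> first
```

The unit Hamming distance at the wrap-around (`wrap == 1`) is what makes the encoding natural for a lattice with periodic boundary conditions.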
Towards grounding nuclear physics in QCD
Exascale computing could soon enable a predictive theory of nuclear structure
and reactions rooted in the Standard Model, with quantifiable and
systematically improvable uncertainties. Such a predictive theory will help
exploit experiments that use nucleons and nuclei as laboratories for testing
the Standard Model and its limitations. Examples include direct dark matter
detection, neutrinoless double beta decay, and searches for permanent electric
dipole moments of the neutron and atoms. It will also help connect QCD to the
properties of cold neutron stars and hot supernova cores. We discuss how a
quantitative bridge between QCD and the properties of nuclei and nuclear matter
will require a synthesis of lattice QCD (especially as applied to two- and
three-nucleon interactions), effective field theory, and ab initio methods for
solving the nuclear many-body problem. While there are significant challenges
that must be addressed in developing this triad of theoretical tools, the rapid
advance of computing is accelerating progress. In particular, we focus this
review on the anticipated advances from lattice QCD and how these advances will
impact few-body effective theories of nuclear physics by providing critical
input, such as constraints on unknown low-energy constants of the effective
(field) theories. We also review particular challenges that must be overcome
for the successful application of lattice QCD for low-energy nuclear physics.
We describe progress in developing few-body effective (field) theories of
nuclear physics, with an emphasis on HOBET, a non-relativistic effective theory
of nuclear physics, which is less common in the literature. We use the examples
of neutrinoless double beta decay and the nuclear-matter equation of state to
illustrate how the coupling of lattice QCD to effective theory might impact our
understanding of symmetries and exotic astrophysical environments.
Comment: v2: updated manuscript based upon community feedback and referee
comments; also, substantially re-written section on the two-nucleon lattice
QCD controversy; 53.5 pages plus a "few more" references. v1: Contribution
to: The tower of effective (field) theories and the emergence of nuclear
phenomena; 47 pages plus a "few" references
Harmonic Oscillator Based Effective Theory, Connecting LQCD to Nuclear Structure
This work focuses on construction of a bridge from QCD (quantum chromodynamics), the theory of quarks, gluons, and their interactions, to nuclear structure, an obvious but unattained objective ever since the introduction of QCD in 1973. The bridge footing on one side of the chasm is QCD in the non-perturbative regime, only now beginning to yield to massively parallel computation in a Monte Carlo space-time lattice formulation of QCD called LQCD (lattice quantum chromodynamics), our only tool for such problems. The resulting trickle of information about the nucleon interaction comes in the form of a fuzzy spectrum for two nucleons in a periodic box. It can be expected that the spectrum will sharpen, and even eventually include a spectrum for three nucleons in a box, with the introduction of larger and faster supercomputers as well as more clever algorithms. Fundamentally, though, limits on what can be accomplished in LQCD are set by the famous fermion sign problem: results in LQCD are produced as a small residual of the sum of large positive and negative contributions from the Monte Carlo trials, and accuracy improves only slowly with the number of expensive trials.

The bridge footing on the other side of the chasm is the configuration interaction shell model, which is commonly used for nuclear structure calculations from a microscopic Hamiltonian expressed in the colorless degrees of freedom of QCD we call nucleons. As currently executed, this method is a model, the two- and possibly three-body interaction in use lacking a rigorous connection to QCD or a direct accounting for contributions from scattering outside the model space. Nucleons, like quarks, are fermions, and a fermion-sign-like problem exists in these calculations as well. The configuration interaction shell model is formulated in an antisymmetrized harmonic oscillator basis that grows with the number of permutations of identical nucleons in the model space. However, fantastically efficient parallel sparse-matrix techniques for finding low-lying eigenstates exist, allowing quite large problems to be solved.

One footing of the bridge is solid and the other is nearing completion. Construction of the bridge itself then faces three major problems addressed in this dissertation: construction of the effective nuclear interaction from observables, finite volume effects associated with the periodic volume in which LQCD results are calculated, and the construction of the A-body effective Hamiltonian from the two-body effective interaction.

An effective theory is an organized and complete parameterized approximation limited to and preserving the known symmetries of an underlying theory (QCD in this case), constrained to some regime (energies below the mass of the pion in this case), and expressed in degrees of freedom suitable for solving the problem at hand (nucleons in a harmonic oscillator basis below an energy cutoff for the nuclear structure problem). An effective theory has a formal relationship to the underlying theory that a model does not. Unlike a model, a small number of observables may be used to fix the lowest-order expansion parameters of the effective theory approximation, with the expectation that the approximation remains valid in other situations for which observables are not available.

The first portion of this work focuses on the construction of a harmonic oscillator based effective theory (HOBET) from observables in a spherical harmonic oscillator basis. It builds on the prior work of Haxton, Song, and Luu in demonstrating the construction of a convergent effective theory from a known potential, establishing the form of the required effective theory expansion. The new work required the extension of HOBET to a theory no longer limited to bound states and with continuity in energy, enabling uniform treatment of bound and continuum states. Here the expansion parameters are instead derived from phase shift observables at continuum energies. A key insight developed during this work was the way in which the effective theory constructed at an energy is connected to the boundary constraints of the wave function. Using known techniques, Lüscher's method and the HAL QCD potential method, to transform the LQCD spectrum in a periodic box to infinite volume phase shifts produces a successful mechanism for fitting the effective interaction without knowledge of the details of the potential.

The techniques for converting LQCD results to phase shifts have issues such as uncontrolled systematics related to the volume size and range of the interaction, as well as suspect perturbative expansions. These issues motivated an investigation into the possibility of directly constructing the effective theory in a periodic volume. This new construction relies heavily on the previous insight about the connection of the effective theory to the wave function boundary constraints. A key result is that the kernel of the effective theory, which captures scattering through the excluded degrees of freedom, is in fact independent of the boundary conditions. It can be fit in the periodic volume context and then transplanted into an infinite volume spherical formulation of the effective theory by a straightforward basis transformation. Finite volume effects are automatically handled in the process. Of immediate interest to the LQCD community is that accurate phase shifts can be easily extracted from the effective theory, avoiding systematic and finite volume errors in existing methods.

With a two-body effective interaction in hand, the last step to a usable bridge is the construction of an A-body interaction in terms of the two-body one. The exact form of this construction is not yet settled, but one promising structure with leading contributions that can be calculated is explored. The assembly of these three pieces completes the bridge, producing a way to perform nuclear structure calculations that is formally connected to the underlying theory of QCD.
Improving Schrödinger Equation Implementations with Gray Code for Adiabatic Quantum Computers
We reformulate the continuous space Schrödinger equation in terms of spin
Hamiltonians. For the kinetic energy operator, the critical concept
facilitating the reduction in model complexity is the idea of position
encoding. Binary encoding of position produces a Heisenberg-like model and
yields exponential improvement in space complexity when compared to classical
computing. Encoding with a binary reflected Gray code, and a Hamming distance 2
Gray code yields the additional effect of reducing the spin model down to the
XZ and transverse Ising model respectively. We also identify the bijective
mapping between diagonal unitaries and the Walsh series, producing the mapping
of any real potential to a series of $k$-local Ising models through the fast
Walsh transform. Finally, in a finite volume, we provide some numerical
evidence to support the claim that the total time needed for adiabatic
evolution is protected by the infrared cutoff of the system. As a result,
initial state preparation from a free-field wavefunction to an interacting
system is expected to exhibit polynomial time complexity with volume and
constant scaling with respect to lattice discretization for all encodings. For
the Hamming distance 2 Gray code, the evolution starts with the transverse
Hamiltonian before introducing penalties such that the low lying spectrum
reproduces the energy levels of the Laplacian. The adiabatic evolution of the
penalty Hamiltonian is therefore sensitive to the ultraviolet scale. It is
expected to exhibit polynomial time complexity with lattice discretization, or
exponential time complexity with respect to the number of qubits given a fixed
volume.
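The Walsh-series mapping mentioned above — a diagonal real potential on $2^n$ grid points decomposing into tensor products of Pauli-Z operators, with coefficients given by the fast Walsh transform — can be sketched classically. The potential values and function names below are illustrative assumptions; only the transform and the Z-string eigenvalue structure follow the idea in the abstract.

```python
# Hedged sketch of the Walsh-series mapping: a diagonal potential V(x)
# on 2^n points equals sum_s c_s * Z_s, where Z_s is the Pauli-Z string
# labelled by bitmask s with eigenvalue (-1)^popcount(s & x) on basis
# state |x>, and the coefficients c_s come from the fast Walsh transform.

def fwht(values):
    """Fast Walsh-Hadamard transform (Hadamard ordering), returned as a list."""
    v = list(values)
    h, n = 1, len(v)
    while h < n:
        for i in range(0, n, 2 * h):
            for j in range(i, i + h):
                v[j], v[j + h] = v[j] + v[j + h], v[j] - v[j + h]
        h *= 2
    return v

potential = [0.0, 1.0, 4.0, 9.0, 16.0, 9.0, 4.0, 1.0]  # arbitrary periodic example
coeffs = fwht(potential)
N = len(potential)
# Reconstruct V(x) from the Z-string eigenvalues to confirm the mapping:
recon = [sum(c * (-1) ** bin(s & x).count("1") for s, c in enumerate(coeffs)) / N
         for x in range(N)]
```

Each nonzero `coeffs[s]` corresponds to one $k$-local Ising term, where $k$ is the number of set bits in `s`; the transform costs $\mathcal{O}(N \log N)$ classically.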