
    Quantum Simulation of Lattice QCD with Improved Hamiltonians

    Quantum simulations of lattice gauge theories are anticipated to directly probe the real-time dynamics of QCD, but scale unfavorably with the required truncation of the gauge fields. Improved Hamiltonians are derived to correct for the effects of gauge field truncations on the SU(3) Kogut-Susskind Hamiltonian. It is shown in 1+1D that this enables low chromo-electric field truncations to quantitatively reproduce features of the untruncated theory over a range of couplings and quark masses. In 3+1D, an improved Hamiltonian is derived for lattice QCD with staggered massless fermions. It is shown in the strong coupling limit that the spectrum qualitatively reproduces aspects of two-flavor QCD, and simulations of a small system are performed on IBM's Perth quantum processor.

    Multistability and localization in forced cyclic symmetric structures modelled by weakly-coupled Duffing oscillators

    Many engineering structures are composed of weakly coupled sectors assembled in a cyclic and ideally symmetric configuration, which can be modelled as forced Duffing oscillators. In this paper, we study the emergence of localized states in the weakly nonlinear regime. We show that multiple spatially localized solutions may exist, and the resulting bifurcation diagram strongly resembles the snaking pattern observed in a variety of fields in physics, such as optics and fluid dynamics. Moreover, in the transition from linear to nonlinear behaviour, isolated branches of solutions are identified. Localization is caused by the hardening effect introduced by the nonlinear stiffness and occurs at large excitation levels. Contrary to the case of mistuning, the presented localization mechanism is triggered by the nonlinearities and arises in perfectly homogeneous systems.
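
The cyclic weakly coupled configuration described above can be sketched numerically. This is a minimal toy, not the paper's model: all parameter values (damping, coupling, forcing) are illustrative, with three identical forced Duffing oscillators on a ring integrated by fixed-step RK4.

```python
import numpy as np

# Toy sketch (hypothetical parameters): three identical Duffing oscillators in a
# cyclic chain with weak linear coupling and harmonic forcing,
#   x_i'' + c x_i' + x_i + g x_i^3 + k (2 x_i - x_{i-1} - x_{i+1}) = F cos(w t)
def duffing_rhs(t, y, c=0.05, g=1.0, k=0.01, F=0.3, w=1.2):
    n = len(y) // 2
    x, v = y[:n], y[n:]
    coup = 2 * x - np.roll(x, 1) - np.roll(x, -1)   # cyclic nearest-neighbour coupling
    a = -c * v - x - g * x**3 - k * coup + F * np.cos(w * t)
    return np.concatenate([v, a])

def rk4(f, y0, t0, t1, steps):
    """Classic fixed-step fourth-order Runge-Kutta integrator."""
    y, t = np.array(y0, float), t0
    h = (t1 - t0) / steps
    for _ in range(steps):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h / 2 * k1)
        k3 = f(t + h / 2, y + h / 2 * k2)
        k4 = f(t + h, y + h * k3)
        y = y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return y

# Start with energy localized in oscillator 0 and evolve for several forcing periods.
y_final = rk4(duffing_rhs, [1.0, 0.0, 0.0, 0.0, 0.0, 0.0], 0.0, 50.0, 5000)
```

Whether the initial localization persists or spreads depends on the forcing level and coupling strength, which is the competition the bifurcation analysis in the paper characterizes.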

    Different ways of framing event attribution questions: The example of warm and wet winters in the United Kingdom similar to 2015/16

    This is the final version, available from the American Meteorological Society via the DOI in this record. Attribution analyses of extreme events estimate changes in the likelihood of their occurrence due to human climatic influences by comparing simulations with and without anthropogenic forcings. Classes of events are commonly considered that share only one or more key characteristics with the observed event. Here we test the sensitivity of attribution assessments to such event definition differences, using the warm and wet winter of 2015/16 in the United Kingdom as a case study. A large number of simulations from coupled models and an atmospheric model are employed. In the most basic case, warm and wet events are defined relative to climatological temperature and rainfall thresholds. Several other classes of events are investigated that, in addition to threshold exceedance, also account for the effect of observed sea surface temperature (SST) anomalies, the circulation flow, or modes of variability present during the reference event. Human influence is estimated to increase the likelihood of warm winters in the United Kingdom by a factor of 3 or more for events occurring under any atmospheric and oceanic conditions, but also for events with a similar circulation or oceanic state to 2015/16. The likelihood of wet winters is found to increase by at least a factor of 1.5 in the general case, but results from the atmospheric model, conditioned on observed SST anomalies, are more uncertain, indicating that decreases in the likelihood are also possible. The robustness of attribution assessments based on atmospheric models is highly dependent on the representation of SSTs without the effect of human influence. Joint BEIS/Defra Met Office Hadley Centre Climate Programme.
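
The attribution recipe described above, comparing event probabilities with and without anthropogenic forcings, can be sketched in a few lines. The numbers here are synthetic, not Met Office data: two Gaussian ensembles stand in for the "natural only" and "all forcings" simulations, and the risk ratio is the ratio of threshold-exceedance probabilities.

```python
import numpy as np

# Toy event-attribution sketch (synthetic ensembles, illustrative threshold):
# estimate P(event) in a counterfactual "natural forcings" ensemble and a
# factual "all forcings" ensemble, then form the risk ratio RR = p1 / p0.
rng = np.random.default_rng(0)
natural = rng.normal(loc=4.0, scale=1.0, size=10000)   # counterfactual winter index
forced = rng.normal(loc=4.7, scale=1.0, size=10000)    # factual winter index
threshold = 5.5                                        # "warm winter" event definition

p0 = np.mean(natural > threshold)
p1 = np.mean(forced > threshold)
rr = p1 / p0
print(f"p0={p0:.3f}  p1={p1:.3f}  risk ratio={rr:.2f}")
```

The sensitivity the paper studies enters through the event definition: conditioning the ensembles on observed SSTs or circulation states changes `p0`, `p1`, and hence the risk ratio.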

    On unified crack propagation laws

    The anomalous propagation of short cracks shows generally exponential fatigue crack growth, but the dependence on stress range at high stress levels is not compatible with Paris’ law with exponent . Indeed, some authors have shown that the standard uncracked SN curve is obtained mostly from short crack propagation, assuming that the crack size a increases with the number of cycles N as where h is close to the exponent of Basquin’s power-law SN curve. We therefore propose a general equation for crack growth which for short cracks has the latter form, and for long cracks returns to Paris’ law. We show generalized SN curves and generalized Kitagawa–Takahashi diagrams, and discuss the application to some experimental data. The problem of short cracks remains controversial, however, as we discuss with reference to some examples.
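
For reference, the two limiting regimes that a unified crack-growth equation must interpolate between can be stated in standard form (these are the textbook laws, written one common way; the paper's own interpolating equation is not reproduced here):

```latex
% Long cracks: Paris' law, with material constants C and m,
% and stress intensity factor range \Delta K (F a geometry factor)
\frac{\mathrm{d}a}{\mathrm{d}N} = C\,(\Delta K)^{m},
\qquad \Delta K = F\,\Delta\sigma\,\sqrt{\pi a}

% Short cracks: growth rate proportional to crack size,
% i.e. exponential growth in the number of cycles N
\frac{\mathrm{d}a}{\mathrm{d}N} \propto a
\quad\Longrightarrow\quad a(N) = a_{0}\,e^{hN}

% Basquin's power-law SN curve, with exponent h as used in the abstract
\Delta\sigma^{\,h}\,N_{f} = \text{const}
```

The exponential short-crack form is what ties the crack-growth description back to the uncracked SN curve, as the abstract notes.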

    The effect of human land use change in the Hadley Centre attribution system

    This is the final version, available on open access from Wiley via the DOI in this record. Atmospheric Science Letters is published by John Wiley & Sons Ltd on behalf of the Royal Meteorological Society. We have investigated the effects of land use on past climate change by means of a new 15-member ensemble of the HadGEM3-A-N216 model, usually used for event attribution studies. This ensemble runs from 1960 to 2013, and includes natural external climate forcings with the addition of human land use changes. It supports previously existing ensembles, either with only natural forcings or with all forcings (both anthropogenic and natural, including land use changes), in determining the contribution of land use change to the change in risk of extreme events. We found a significant difference in near-surface air temperature trends over land, attributable to the effects of human land use. The main part of the signal derives from a relative cooling in Arctic regions which closely matches that of deforestation. This cooling appears to spread by polar amplification. A similar pattern of change is seen in the latent heat flux trend, but significant rainfall change is almost entirely absent. Department for Business, Energy and Industrial Strategy; Met Office Hadley Centre Climate Programme; Department for Environment, Food and Rural Affairs; European Commission; UK-China Research & Innovation Partnership Fund; Newton Fund.

    State Preparation in the Heisenberg Model through Adiabatic Spiraling

    An adiabatic state preparation technique, called the adiabatic spiral, is proposed for the Heisenberg model. This technique is suitable for implementation on a number of quantum simulation platforms, such as Rydberg atoms, trapped ions, or superconducting qubits. Classical simulations of small systems suggest that it can be successfully implemented in the near future. A comparison to Trotterized time evolution is performed and it is shown that the adiabatic spiral is able to outperform Trotterized adiabatics. Comment: 22 pages, 8 figures, published version.
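
The Trotterized time evolution used as the baseline above can be illustrated on a tiny system. This is a generic first-order Trotter sketch, not the paper's protocol: a 3-site Heisenberg chain (uniform unit couplings, an assumption for illustration) with the Hamiltonian split into its two non-commuting bond terms.

```python
import numpy as np

# First-order Trotterization sketch for a 3-site Heisenberg chain.
X = np.array([[0, 1], [1, 0]], complex)
Y = np.array([[0, -1j], [1j, 0]], complex)
Z = np.array([[1, 0], [0, -1]], complex)
I = np.eye(2, dtype=complex)

def kron3(a, b, c):
    return np.kron(a, np.kron(b, c))

# Heisenberg bond terms on sites (0,1) and (1,2); they do not commute.
H01 = sum(kron3(P, P, I) for P in (X, Y, Z))
H12 = sum(kron3(I, P, P) for P in (X, Y, Z))
H = H01 + H12

def expmh(M, t):
    """exp(-i M t) for Hermitian M via eigendecomposition."""
    w, V = np.linalg.eigh(M)
    return V @ np.diag(np.exp(-1j * w * t)) @ V.conj().T

def trotter(t, n):
    """n first-order Trotter steps: (e^{-i H01 t/n} e^{-i H12 t/n})^n."""
    step = expmh(H01, t / n) @ expmh(H12, t / n)
    return np.linalg.matrix_power(step, n)

t = 1.0
exact = expmh(H, t)
err = [np.linalg.norm(trotter(t, n) - exact, 2) for n in (1, 4, 16)]
print(err)  # spectral-norm error shrinks as the number of Trotter steps grows
```

The adiabatic spiral aims to avoid accumulating exactly this kind of step-discretization error during state preparation.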

    Preparation for Quantum Simulation of the 1+1D O(3) Non-linear σ-Model using Cold Atoms

    The 1+1D O(3) non-linear σ-model is a model system for future quantum lattice simulations of other asymptotically-free theories, such as non-Abelian gauge theories. We find that utilizing dimensional reduction can make efficient use of two-dimensional layouts presently available on cold atom quantum simulators. A new definition of the renormalized coupling is introduced, which is applicable to systems with open boundary conditions and can be measured using analog quantum simulators. Monte Carlo and tensor network calculations are performed to determine the quantum resources required to reproduce perturbative short-distance observables. In particular, we show that a rectangular array of 48 Rydberg atoms with existing quantum hardware capabilities should be able to adiabatically prepare low-energy states of the perturbatively-matched theory. These states can then be used to simulate non-perturbative observables in the continuum limit that lie beyond the reach of classical computers. Comment: 12 pages, 5 figures, 2 tables, published version.

    Scalable Circuits for Preparing Ground States on Digital Quantum Computers: The Schwinger Model Vacuum on 100 Qubits

    The vacuum of the lattice Schwinger model is prepared on up to 100 qubits of IBM's Eagle-processor quantum computers. A new algorithm to prepare the ground state of a gapped translationally-invariant system on a quantum computer is presented, which we call Scalable Circuits ADAPT-VQE (SC-ADAPT-VQE). This algorithm uses the exponential decay of correlations between distant regions of the ground state, together with ADAPT-VQE, to construct quantum circuits for state preparation that can be scaled to arbitrarily large systems. SC-ADAPT-VQE is applied to the Schwinger model and shown to be systematically improvable, with an accuracy that converges exponentially with circuit depth. Both the structure of the circuits and the deviations of prepared wavefunctions are found to become independent of the number of spatial sites, L. This allows for a controlled extrapolation of the circuits, determined using small or modest-sized systems, to arbitrarily large L. The circuits for the Schwinger model are determined on lattices up to L=14 (28 qubits) with the qiskit classical simulator, and subsequently scaled up to prepare the L=50 (100 qubits) vacuum on IBM's 127-qubit superconducting quantum computers ibm_brisbane and ibm_cusco. After applying an improved error-mitigation technique, which we call Operator Decoherence Renormalization, the chiral condensate and charge-charge correlators obtained from the quantum computers are found to be in good agreement with classical Matrix Product State simulations. Comment: 14 pages + appendices, 16 figures, 12 tables.
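
The renormalization idea behind decoherence-based error mitigation can be shown schematically. This toy is not the paper's Operator Decoherence Renormalization procedure, which is more involved; it only illustrates the generic rescaling step, with all numbers invented: under depolarizing-style noise a traceless observable is damped by a factor (1 − ε), and ε is estimated from a calibration circuit whose exact expectation value is known classically.

```python
# Schematic decoherence-renormalization sketch (toy numbers throughout).
def renormalize(obs_measured, calib_measured, calib_exact):
    """Rescale a noisy expectation value by the damping factor estimated
    from a calibration circuit with a classically known exact value."""
    damping = calib_measured / calib_exact   # estimate of (1 - eps)
    return obs_measured / damping

# Toy example: a true value of 0.80 damped by 30% noise; the calibration
# circuit has exact expectation 1.00 but measures 0.70 under the same noise.
mitigated = renormalize(obs_measured=0.56, calib_measured=0.70, calib_exact=1.00)
print(mitigated)  # recovers 0.80
```

The key assumption is that the calibration and physics circuits experience comparable decoherence, so the same damping factor applies to both.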

    Simulating Heisenberg Interactions in the Ising Model with Strong Drive Fields

    The time evolution of an Ising model with large driving fields over discrete time intervals is shown to be reproduced by an effective XXZ-Heisenberg model at leading order in the inverse field strength. For specific orientations of the drive field, the dynamics of the XXX-Heisenberg model is reproduced. These approximate equivalences, valid above a critical driving field strength set by dynamical phase transitions in the Ising model, are expected to enable quantum devices that natively evolve qubits according to the Ising model to simulate more complex systems. Comment: 10 pages, 5 figures, accepted version.

    Longitudinal tear protein changes correlate with ocular chronic GVHD development in allogeneic hematopoietic stem cell transplant patients

    Ocular graft-versus-host disease (oGVHD) is a manifestation of chronic GVHD, frequently occurring in patients after allogeneic hematopoietic stem cell transplant (HSCT). We analyzed tear protein changes before and after allogeneic HSCT and correlated their levels with oGVHD development. This retrospective study included 102 patients, and data were recorded before the conditioning treatment and 3 to 6 months postoperatively. Tear protein analysis was performed with the Agilent 2100 Bioanalyzer on individual tears sampled by aspiration. Total protein (TP), lysozyme-C (LYS-C), lactoferrin (LACTO), lipocalin-1 (LIPOC-1), transferrin (TRANSF), albumin (ALB), and zinc-alpha-2-glycoprotein (ZAG-2) levels were retrieved and statistically analyzed. Following HSCT, forty-three patients developed oGVHD. TP, LACTO, LYS-C, and ZAG-2 levels significantly decreased post-HSCT compared with pre-HSCT levels. In univariate analysis, decreases in TP, LACTO, and ZAG-2 were associated with increased development of oGVHD (OR = 4.49; 95% CI, 1.9 to 10.5; p < 0.001; OR = 3.08; 95% CI, 1.3 to 7.6; p = 0.01; OR = 11.1; 95% CI, 2.7 to 46.6; p < 0.001, respectively). TRANSF post-HSCT levels significantly increased (OR = 15.7; 95% CI, 4.1 to 52.2; p = 0.0001). No pre- to post-HSCT changes were shown in ALB and LIPOC-1 levels. Data suggest that TP content, LACTO, TRANSF, and ZAG-2 pre-post changes might be significant predictors of oGVHD development.
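
The odds ratios with 95% confidence intervals reported in the univariate analysis above follow the standard 2x2-table calculation. This sketch uses hypothetical counts (not the study's data) relating a decreased protein level (exposure) to oGVHD development (outcome):

```python
import math

# Odds ratio and Wald 95% CI from a 2x2 table (illustrative counts only).
def odds_ratio_ci(a, b, c, d, z=1.96):
    """a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(30, 20, 13, 39)   # hypothetical 2x2 table
print(f"OR={or_:.2f}  95% CI ({lo:.2f}, {hi:.2f})")
```

An OR above 1 with a CI excluding 1 indicates a statistically significant association, which is the criterion behind the p-values quoted in the abstract.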