71 research outputs found
Northern Eurasia Future Initiative (NEFI): facing the challenges and pathways of global change in the 21st century
During the past several decades, the Earth system has changed significantly, especially across Northern Eurasia. Changes in the socio-economic conditions of the larger countries in the region have also resulted in a variety of regional environmental changes that can have global consequences. The Northern Eurasia Future Initiative (NEFI) has been designed as an essential continuation of the Northern Eurasia Earth Science Partnership Initiative (NEESPI), which was launched in 2004. NEESPI sought to elucidate all aspects of ongoing environmental change, to inform societies and, thus, to better prepare them for future developments. A key principle of NEFI is that these developments must now be secured through science-based strategies co-designed with regional decision makers to lead their societies to prosperity in the face of environmental and institutional challenges. NEESPI scientific research, data, and models have created a solid knowledge base to support the NEFI program. This paper presents the consensus NEFI research vision based on that knowledge. It provides the reader with samples of recent accomplishments in regional studies and formulates new NEFI science questions. To address these questions, nine research foci are identified and their selection is briefly justified. These foci include: warming of the Arctic; changing frequency, pattern, and intensity of extreme and inclement environmental conditions; retreat of the cryosphere; changes in terrestrial water cycles; changes in the biosphere; pressures on land use; changes in infrastructure; societal actions in response to environmental change; and quantification of Northern Eurasia's role in the global Earth system. Powerful feedbacks between the Earth and human systems in Northern Eurasia (e.g., mega-fires, droughts, depletion of the cryosphere essential for water supply, retreat of sea ice) result from past and current human activities (e.g., large-scale water withdrawals, land-use and governance change) and potentially restrict or provide new opportunities for future human activities. We therefore propose that Integrated Assessment Models are needed as the final stage of global change assessment. The overarching goal of this NEFI modeling effort is to enable evaluation of economic decisions in response to changing environmental conditions and justification of mitigation and adaptation efforts.
Understanding Novel Superconductors with Ab Initio Calculations
This chapter gives an overview of progress in the field of computational superconductivity. Following the discovery of MgB2 (2001), there has been an impressive acceleration in the development of methods based on Density Functional Theory to compute the critical temperature and other physical properties of actual superconductors from first principles. State-of-the-art ab initio methods have reached predictive accuracy for conventional (phonon-mediated) superconductors, and substantial progress is also being made for unconventional superconductors. The aim of this chapter is to give an overview of the existing computational methods for superconductivity and to present selected examples of material discoveries that exemplify the main advancements.
Comment: 38 pages, 10 figures. Contribution to the Springer Handbook of Materials Modelling.
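As a hedged illustration of the kind of quantity such first-principles workflows target: once a DFT calculation supplies the electron-phonon coupling constant, the Coulomb pseudopotential, and a logarithmic average phonon frequency, the classic McMillan/Allen-Dynes formula gives a quick Tc estimate. The sketch below is a minimal stand-alone implementation; the example parameter values are illustrative, not taken from the chapter.

```python
import math

def allen_dynes_tc(lam, mu_star, omega_log_K):
    """McMillan/Allen-Dynes estimate of the critical temperature Tc (kelvin).

    lam         -- electron-phonon coupling constant (lambda)
    mu_star     -- Coulomb pseudopotential, typically ~0.1-0.15
    omega_log_K -- logarithmic average phonon frequency, in kelvin
    """
    denom = lam - mu_star * (1.0 + 0.62 * lam)
    if denom <= 0:
        return 0.0  # formula predicts no phonon-mediated superconductivity
    return (omega_log_K / 1.2) * math.exp(-1.04 * (1.0 + lam) / denom)

# Illustrative MgB2-like parameters (assumed, not from the chapter):
# yields a Tc of roughly 40 K, in the right ballpark for MgB2.
print(allen_dynes_tc(lam=0.9, mu_star=0.12, omega_log_K=800.0))
```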
Phase transitions in random circuit sampling
Undesired coupling to the surrounding environment destroys long-range correlations in quantum processors and hinders coherent evolution in the nominally available computational space. This noise is an outstanding challenge when leveraging the computation power of near-term quantum processors [1]. It has been shown that benchmarking random circuit sampling with cross-entropy benchmarking can provide an estimate of the effective size of the Hilbert space coherently available [2-8]. Nevertheless, quantum algorithms' outputs can be trivialized by noise, making them susceptible to spoofing by classical computation. Here, by implementing an algorithm for random circuit sampling, we demonstrate experimentally that two phase transitions are observable with cross-entropy benchmarking, which we explain theoretically with a statistical model. The first is a dynamical transition as a function of the number of cycles and is the continuation of the anti-concentration point in the noiseless case. The second is a quantum phase transition controlled by the error per cycle; to identify it analytically and experimentally, we create a weak-link model, which allows us to vary the strength of the noise versus coherent evolution. Furthermore, by presenting a random circuit sampling experiment in the weak-noise phase with 67 qubits at 32 cycles, we demonstrate that the computational cost of our experiment is beyond the capabilities of existing classical supercomputers. Our experimental and theoretical work establishes the existence of transitions to a stable, computationally complex phase that is reachable with current quantum processors.
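For readers unfamiliar with the benchmark named above: linear cross-entropy benchmarking compares bitstrings measured on hardware against noiselessly simulated output probabilities via F = 2^n <p_ideal(x)> - 1, which is near 1 for a perfect device and near 0 for fully depolarized output. Below is a minimal numpy sketch of this standard estimator on a toy distribution; it is not the paper's data or code.

```python
import numpy as np

def linear_xeb(samples, ideal_probs, n_qubits):
    """Linear cross-entropy benchmarking fidelity: 2**n * <p_ideal> - 1.

    samples     -- integer-encoded bitstrings measured on the device
    ideal_probs -- length-2**n array of noiseless output probabilities
                   (obtained from a classical simulation of the circuit)
    """
    d = 2 ** n_qubits
    return d * np.mean(ideal_probs[samples]) - 1.0

rng = np.random.default_rng(0)
n = 10
d = 2 ** n
p = rng.exponential(size=d)
p /= p.sum()                               # mock Porter-Thomas-like "ideal" distribution
good = rng.choice(d, size=50_000, p=p)     # samples from a noiseless device
bad = rng.integers(0, d, size=50_000)      # samples from a fully depolarized device
print(linear_xeb(good, p, n))              # ~1.0
print(linear_xeb(bad, p, n))               # ~0.0
```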
Purification-based quantum error mitigation of pair-correlated electron simulations
An important measure of the development of quantum computing platforms has been the simulation of increasingly complex physical systems. Until fault-tolerant quantum computing is available, robust error-mitigation strategies are necessary to continue this growth. Here, we validate recently introduced error-mitigation strategies that exploit the expectation that the ideal output of a quantum algorithm would be a pure state. We consider the task of simulating electron systems in the seniority-zero subspace, where all electrons are paired with their opposite spin. This affords a computational stepping stone to a fully correlated model. We compare the performance of error mitigation based on doubling quantum resources in time or in space on up to 20 qubits of a superconducting qubit quantum processor. We observe a reduction of error by one to two orders of magnitude below less sophisticated techniques such as postselection. We study how the gain from error mitigation scales with the system size and observe a polynomial suppression of error with increased resources. Extrapolation of our results indicates that substantial hardware improvements will be required for classically intractable variational chemistry simulations.
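To make the purification idea concrete: if two copies of the noisy state rho are available, one can estimate Tr(rho^2 O) / Tr(rho^2), which suppresses contributions from the non-dominant eigenvectors of rho. This "virtual distillation" identity is one standard purification-based technique of the kind the abstract describes; the sketch below evaluates it directly on a toy density matrix rather than on hardware, and the noise model and observable are illustrative assumptions, not the paper's.

```python
import numpy as np

def purified_expectation(rho, obs):
    """Second-order purification estimate of <obs>: Tr(rho^2 O) / Tr(rho^2).

    On hardware this is estimated from two copies of the state (doubling
    quantum resources in space) or from time-reversed echoes (doubling in
    time); here we compute it directly from a density matrix.
    """
    rho2 = rho @ rho
    return np.trace(rho2 @ obs).real / np.trace(rho2).real

# Toy example: a |0> state under 20% depolarizing noise, measuring Z.
Z = np.diag([1.0, -1.0])
pure = np.diag([1.0, 0.0])
noisy = 0.8 * pure + 0.2 * np.eye(2) / 2
print(np.trace(noisy @ Z).real)         # raw expectation: 0.8
print(purified_expectation(noisy, Z))   # mitigated: ~0.976, closer to 1
```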
The Space Physics Environment Data Analysis System (SPEDAS)
With the advent of the Heliophysics/Geospace System Observatory (H/GSO), a complement of multi-spacecraft missions and ground-based observatories to study the space environment, the retrieval, analysis, and visualization of space physics data can be daunting. The Space Physics Environment Data Analysis System (SPEDAS), a grass-roots software development platform (www.spedas.org), is now officially supported by NASA Heliophysics as part of its data environment infrastructure. It serves more than a dozen space missions and ground observatories and can integrate the full complement of past and upcoming space physics missions with minimal resources, following clear, simple, and well-proven guidelines. Free, modular, and configurable to the needs of individual missions, it works in both command-line mode (ideal for experienced users) and Graphical User Interface (GUI) mode (reducing the learning curve for first-time users). Both options have “crib-sheets,” user-command sequences in ASCII format that can facilitate record-and-repeat actions, especially for complex operations and plotting. Crib-sheets enhance scientific interactions, as users can move rapidly and accurately from exchanges of technical information on data processing to efficient discussions regarding data interpretation and science. SPEDAS can readily query and ingest all International Solar Terrestrial Physics (ISTP)-compatible products from the Space Physics Data Facility (SPDF), enabling access to a vast collection of historic and current mission data. The planned incorporation of Heliophysics Application Programmer’s Interface (HAPI) standards will facilitate data ingestion from distributed datasets that adhere to these standards. Although SPEDAS is currently Interactive Data Language (IDL)-based (and interfaces to Java-based tools such as Autoplot), efforts are underway to expand it further to work with Python (first as an interface tool, and potentially even as an under-the-hood replacement). We review the SPEDAS development history, goals, and current implementation. We explain its “modes of use” with examples geared for users and outline its technical implementation and requirements with software developers in mind. We also describe SPEDAS personnel and software management, interfaces with other organizations, resources and support structure available to the community, and future development plans.
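Since the abstract highlights both command-line use and the emerging Python interface, here is a minimal sketch of that workflow through the pyspedas package. It assumes pyspedas and pytplot are installed and follows their documented THEMIS example; treat the exact routine and variable names as assumptions rather than part of this paper.

```python
# Minimal pyspedas sketch (assumes pip-installed pyspedas/pytplot;
# routine and variable names follow the pyspedas THEMIS examples).
import pyspedas
from pytplot import tplot

# Load one day of THEMIS-A fluxgate magnetometer data; the load routine
# downloads the CDFs and returns the names of the tplot variables created.
loaded = pyspedas.themis.fgm(trange=['2007-03-23', '2007-03-24'], probe='a')

# Plot the spin-fit magnetic field in GSE coordinates.
tplot('tha_fgs_gse')
```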
Mechanical response of double-network gels with dynamic bonds under multi-cycle deformation
A new approach to improving the efficiency of gas-lift wells under conditions of organic wax deposition in the Dragon field
Computational-experimental studies of fretting corrosion and oscillations of the fuel bundles of the VVER-1000 power reactor