Research and Education in Computational Science and Engineering
Over the past two decades the field of computational science and engineering
(CSE) has penetrated both basic and applied research in academia, industry, and
laboratories to advance discovery, optimize systems, support decision-makers,
and educate the scientific and engineering workforce. Informed by centuries of
theory and experiment, CSE performs computational experiments to answer
questions that neither theory nor experiment alone is equipped to answer. CSE
provides scientists and engineers of all persuasions with algorithmic
inventions and software systems that transcend disciplines and scales. Carried
on a wave of digital technology, CSE brings the power of parallelism to bear on
troves of data. Mathematics-based advanced computing has become a prevalent
means of discovery and innovation in essentially all areas of science,
engineering, technology, and society; and the CSE community is at the core of
this transformation. However, a combination of disruptive
developments---including the architectural complexity of extreme-scale
computing, the data revolution that engulfs the planet, and the specialization
required to follow the applications to new frontiers---is redefining the scope
and reach of the CSE endeavor. This report describes the rapid expansion of CSE
and the challenges to sustaining its bold advances. The report also presents
strategies and directions for CSE research and education for the next decade.
Comment: Major revision, to appear in SIAM Review
Why Simpler Computer Simulation Models Can Be Epistemically Better for Informing Decisions
For computer simulation models to usefully inform climate risk management, uncertainties in model projections must be explored and characterized. Because doing so requires running the model many times…
Earth System Modeling 2.0: A Blueprint for Models That Learn From Observations and Targeted High-Resolution Simulations
Climate projections continue to be marred by large uncertainties, which
originate in processes that need to be parameterized, such as clouds,
convection, and ecosystems. But rapid progress is now within reach. New
computational tools and methods from data assimilation and machine learning
make it possible to integrate global observations and local high-resolution
simulations in an Earth system model (ESM) that systematically learns from
both. Here we propose a blueprint for such an ESM. We outline how
parameterization schemes can learn from global observations and targeted
high-resolution simulations, for example, of clouds and convection, through
matching low-order statistics between ESMs, observations, and high-resolution
simulations. We illustrate learning algorithms for ESMs with a simple dynamical
system that shares characteristics of the climate system; and we discuss the
opportunities the proposed framework presents and the challenges that remain to
realize it.
Comment: 32 pages, 3 figures
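The core learning idea in the abstract above, matching low-order statistics between a model and observations to calibrate parameterizations, can be illustrated with a minimal sketch. The toy model, the choice of mean and variance as target statistics, and the brute-force search are all hypothetical stand-ins for the ensemble and gradient-free learning algorithms the blueprint actually envisions:

```python
import numpy as np

def run_toy_model(theta, n_steps=5000, dt=0.01):
    """Integrate a toy damped, periodically forced scalar system,
    dx/dt = -theta*x + sin(t) -- a hypothetical stand-in for an ESM
    with one uncertain parameterization parameter theta."""
    x = np.zeros(n_steps)
    t = 0.0
    for i in range(1, n_steps):
        t += dt
        x[i] = x[i - 1] + dt * (-theta * x[i - 1] + np.sin(t))
    return x

def low_order_stats(x):
    """Low-order statistics used as the learning target: mean and variance."""
    return np.array([x.mean(), x.var()])

def calibrate(target_stats, candidates):
    """Pick the parameter whose simulated statistics best match the target;
    a brute-force stand-in for the learning schemes the paper proposes."""
    losses = [np.sum((low_order_stats(run_toy_model(th)) - target_stats) ** 2)
              for th in candidates]
    return candidates[int(np.argmin(losses))]

# Synthetic "observations" generated with a hidden true parameter (0.8).
obs_stats = low_order_stats(run_toy_model(0.8))
theta_hat = calibrate(obs_stats, candidates=np.linspace(0.1, 2.0, 39))
```

The key point carried over from the abstract is that the parameterization is never fit to raw trajectories, only to aggregate statistics, which is what makes observations and high-resolution simulations usable as joint learning targets.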
Simulation Intelligence: Towards a New Generation of Scientific Methods
The original "Seven Motifs" set forth a roadmap of essential methods for the
field of scientific computing, where a motif is an algorithmic method that
captures a pattern of computation and data movement. We present the "Nine
Motifs of Simulation Intelligence", a roadmap for the development and
integration of the essential algorithms necessary for a merger of scientific
computing, scientific simulation, and artificial intelligence. We call this
merger simulation intelligence (SI), for short. We argue the motifs of
simulation intelligence are interconnected and interdependent, much like the
components within the layers of an operating system. Using this metaphor, we
explore the nature of each layer of the simulation intelligence operating
system stack (SI-stack) and the motifs therein: (1) Multi-physics and
multi-scale modeling; (2) Surrogate modeling and emulation; (3)
Simulation-based inference; (4) Causal modeling and inference; (5) Agent-based
modeling; (6) Probabilistic programming; (7) Differentiable programming; (8)
Open-ended optimization; (9) Machine programming. We believe coordinated
efforts across these motifs offer an immense opportunity to accelerate scientific
discovery, from solving inverse problems in synthetic biology and climate
science, to directing nuclear energy experiments and predicting emergent
behavior in socioeconomic settings. We elaborate on each layer of the SI-stack,
detailing the state-of-the-art methods, presenting examples to highlight challenges
and opportunities, and advocating for specific ways to advance the motifs and
the synergies from their combinations. Advancing and integrating these
technologies can enable a robust and efficient hypothesis-simulation-analysis
type of scientific method, which we introduce with several use-cases for
human-machine teaming and automated science.
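Motif (2), surrogate modeling and emulation, is the most compact of the nine to demonstrate: replace an expensive simulator with a cheap fitted model and query the surrogate instead. The "simulator" below is a hypothetical smooth response, and the polynomial surrogate is one simple choice among the neural and Gaussian-process emulators the paper surveys:

```python
import numpy as np

def expensive_simulator(x):
    """Hypothetical stand-in for a costly scientific simulation:
    a smooth nonlinear response over one input parameter."""
    return np.sin(3.0 * x) + 0.5 * x ** 2

# Surrogate modeling: fit a cheap emulator on a handful of simulator
# runs, then evaluate the emulator in place of the simulator.
x_train = np.linspace(-1.0, 1.0, 25)
y_train = expensive_simulator(x_train)
surrogate = np.poly1d(np.polyfit(x_train, y_train, deg=6))

# Accuracy check on a dense grid the surrogate was not fitted to.
x_test = np.linspace(-1.0, 1.0, 101)
max_err = np.max(np.abs(surrogate(x_test) - expensive_simulator(x_test)))
```

In an SI-stack setting the same pattern recurs at larger scale: the surrogate is trained once at simulation cost, then amortizes that cost across the many evaluations that inference, optimization, or agent-based motifs demand.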
Advancing measurements and representations of subsurface heterogeneity and dynamic processes: towards 4D hydrogeology
Essentially all hydrogeological processes are strongly influenced by subsurface spatial heterogeneity and by the temporal variation of environmental conditions, hydraulic properties, and solute concentrations. This spatial and temporal variability generally leads to effective behaviors and emergent phenomena that cannot be predicted by conventional approaches based on homogeneity assumptions and models. However, it is not always clear when, why, how, and at what scale the 4D (3D + time) nature of the subsurface needs to be considered in hydrogeological monitoring, modeling, and applications. In this paper, we discuss the value and potential of monitoring and characterizing spatial and temporal variability, including 4D imaging, for a series of hydrogeological processes: (1) groundwater fluxes, (2) solute transport and reaction, (3) vadose zone dynamics, and (4) surface–subsurface water interactions. We first identify the main challenges related to the coupling of spatial and temporal fluctuations for these processes. We then highlight recent innovations that have led to significant breakthroughs in high-resolution space–time imaging and in the characterization, monitoring, and modeling of these spatial and temporal fluctuations. We finally propose a classification of processes and applications at different scales according to their need and potential for high-resolution space–time imaging. We thus advocate a more systematic characterization of the dynamic and 3D nature of the subsurface for a series of critical processes and emerging applications. This calls for the validation of 4D imaging techniques at highly instrumented observatories and for the harmonization of open databases to share hydrogeological data sets in their 4D components.
Deep learning in remote sensing: a review
Standing at the paradigm shift towards data-intensive science, machine
learning techniques are becoming increasingly important. In particular, as a
major breakthrough in the field, deep learning has proven to be an extremely
powerful tool in many fields. Shall we embrace deep learning as the key to
everything? Or should we resist a 'black-box' solution? There are controversial opinions
in the remote sensing community. In this article, we analyze the challenges of
using deep learning for remote sensing data analysis, review the recent
advances, and provide resources to make deep learning in remote sensing
ridiculously simple to start with. More importantly, we encourage remote
sensing scientists to bring their expertise into deep learning and use it as an
implicit general model to tackle unprecedented large-scale influential
challenges, such as climate change and urbanization.
Comment: Accepted for publication in IEEE Geoscience and Remote Sensing Magazine
Accelerating Bayesian microseismic event location with deep learning
We present a series of new open-source deep-learning algorithms to accelerate Bayesian full-waveform point source inversion of microseismic
events. Inferring the joint posterior probability distribution of moment tensor components and source location is key for rigorous uncertainty
quantification. However, the inference process requires forward modelling of microseismic traces for each set of parameters explored by the sampling
algorithm, which makes the inference very computationally intensive. In this paper we focus on accelerating this process by training deep-learning
models to learn the mapping between source location and seismic traces for a given 3D heterogeneous velocity model and a fixed isotropic moment
tensor for the sources. These trained emulators replace the expensive solution of the elastic wave equation in the inference process.
We compare our results with a previous study that used emulators based on Gaussian processes to invert microseismic events. For fairness of
comparison, we train our emulators on the same microseismic traces and using the same geophysical setting. We show that all of our models provide
more accurate predictions, ∼ 100 times faster than the method based on Gaussian processes, and a ∼ 10^5 speed-up
factor over a pseudo-spectral method for waveform generation. For example, a 2 s long synthetic trace can be generated in ∼ 10 ms on a
common laptop processor, instead of ∼ 1 h using a pseudo-spectral method on a high-end graphics processing unit. We also
show that our inference results are in excellent agreement with those obtained from traditional location methods based on travel time estimates. The
speed, accuracy, and scalability of our open-source deep-learning models pave the way for extensions of these emulators to generic source mechanisms
and application to joint Bayesian inversion of moment tensor components and source location using full waveforms.
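The emulator-in-the-likelihood pattern described above can be sketched in miniature. Everything here is a hypothetical stand-in: the "wave equation" is a travel-time-shifted pulse over one source coordinate, and a ridge-regression emulator on polynomial features replaces the deep networks the paper trains, but the structure, train an emulator on forward runs, then use it inside a Bayesian location search, is the same:

```python
import numpy as np

rng = np.random.default_rng(1)

def slow_forward(loc, t):
    """Hypothetical stand-in for the expensive elastic-wave solve:
    a pulse arriving at a location-dependent travel time."""
    travel_time = 0.4 + 0.3 * loc          # toy 1D velocity model
    return np.exp(-20.0 * (t - travel_time) ** 2)

t = np.linspace(0.0, 2.0, 200)

# Train an emulator on a batch of forward runs. Here: ridge regression
# on polynomial features of the source coordinate, standing in for the
# deep-learning emulators used in the paper.
locs_train = np.linspace(0.0, 1.0, 40)
X = np.vander(locs_train, 8)                           # polynomial features
Y = np.array([slow_forward(l, t) for l in locs_train])  # forward runs
W = np.linalg.solve(X.T @ X + 1e-8 * np.eye(8), X.T @ Y)

def emulate(loc):
    """Fast surrogate trace, replacing slow_forward during inference."""
    return (np.vander(np.array([loc]), 8) @ W)[0]

# "Observed" trace from a hidden source location, plus noise.
obs = slow_forward(0.62, t) + 0.01 * rng.standard_normal(t.size)

# Gridded posterior over location with the emulator in the likelihood.
grid = np.linspace(0.0, 1.0, 201)
log_like = np.array([-np.sum((emulate(l) - obs) ** 2) / (2 * 0.01 ** 2)
                     for l in grid])
loc_map = grid[np.argmax(log_like)]
```

Because every likelihood evaluation hits the emulator instead of the solver, the sampler's cost is decoupled from the cost of the physics, which is the source of the speed-ups the abstract reports.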