
    Using Machine Learning for Model Physics: an Overview

    In this overview, a generic mathematical object (mapping) is introduced, and its relation to model physics parameterization is explained. Machine learning (ML) tools that can be used to emulate and/or approximate mappings are introduced. Applications of ML to emulate existing parameterizations, to develop new parameterizations, to enforce physical constraints, and to control the accuracy of the developed applications are described. Some ML approaches that allow developers to go beyond the standard parameterization paradigm are discussed.
    Comment: 50 pages, 3 figures, 1 table
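
    As a concrete illustration of the mapping view (not code from the overview itself), the hedged sketch below fits a small neural network to a synthetic input-output mapping standing in for a parameterization; the toy "parameterization" function, the network size, and all variable names are illustrative assumptions.

```python
# Minimal sketch: treating a parameterization as a mapping y = M(x) and
# emulating it with a small neural network. The "parameterization" here is a
# synthetic stand-in, not any scheme discussed in the overview.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def toy_parameterization(x):
    """Synthetic mapping from a 'column state' x to a scalar 'tendency'."""
    return np.tanh(x @ np.linspace(-1.0, 1.0, x.shape[1])[:, None]) + 0.1 * x[:, :1] ** 2

X = rng.normal(size=(5000, 8))          # inputs: e.g. temperature/humidity profile
Y = toy_parameterization(X)             # outputs: e.g. heating tendency

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.2, random_state=0)

emulator = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=1000, random_state=0)
emulator.fit(X_tr, Y_tr.ravel())        # learn the mapping from input-output pairs

print("emulation R^2 on held-out data:", emulator.score(X_te, Y_te.ravel()))
```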

    Earth System Modeling 2.0: A Blueprint for Models That Learn From Observations and Targeted High-Resolution Simulations

    Climate projections continue to be marred by large uncertainties, which originate in processes that need to be parameterized, such as clouds, convection, and ecosystems. But rapid progress is now within reach. New computational tools and methods from data assimilation and machine learning make it possible to integrate global observations and local high-resolution simulations in an Earth system model (ESM) that systematically learns from both. Here we propose a blueprint for such an ESM. We outline how parameterization schemes can learn from global observations and targeted high-resolution simulations, for example, of clouds and convection, by matching low-order statistics between ESMs, observations, and high-resolution simulations. We illustrate learning algorithms for ESMs with a simple dynamical system that shares characteristics of the climate system, and we discuss the opportunities the proposed framework presents and the challenges that remain to realize it.
    Comment: 32 pages, 3 figures
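
    The learning-from-statistics idea can be illustrated with a Lorenz-96-type system standing in for the paper's simple dynamical system: a model parameter is chosen so that low-order statistics of a long model run match "observed" statistics. The forcing parameter, the particular statistics (time-mean and variance), and the grid search below are assumptions of this hedged sketch, not the paper's learning algorithm.

```python
# Hedged sketch: calibrate a parameter of a Lorenz-96-type model by matching
# low-order statistics (time-mean and variance) of a long run to "observations".
import numpy as np

K = 8           # number of resolved variables (illustrative)
DT = 0.01       # time step

def l96_tendency(x, forcing):
    return np.roll(x, 1) * (np.roll(x, -1) - np.roll(x, 2)) - x + forcing

def run_l96(forcing, n_steps=20000, seed=0):
    rng = np.random.default_rng(seed)
    x = forcing + 0.1 * rng.standard_normal(K)
    traj = np.empty((n_steps, K))
    for n in range(n_steps):
        # classical RK4 step
        k1 = l96_tendency(x, forcing)
        k2 = l96_tendency(x + 0.5 * DT * k1, forcing)
        k3 = l96_tendency(x + 0.5 * DT * k2, forcing)
        k4 = l96_tendency(x + DT * k3, forcing)
        x = x + DT / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
        traj[n] = x
    return traj[n_steps // 2:]              # discard spin-up

def low_order_stats(traj):
    return np.array([traj.mean(), traj.var()])

obs_stats = low_order_stats(run_l96(forcing=8.0, seed=1))   # synthetic "observations"

# Match statistics over a grid of candidate forcings (a crude stand-in for the
# paper's more sophisticated learning algorithms).
candidates = np.linspace(4.0, 12.0, 17)
mismatch = [np.sum((low_order_stats(run_l96(F)) - obs_stats) ** 2) for F in candidates]
print("recovered forcing:", candidates[int(np.argmin(mismatch))])
```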

    Statistical methods and machine learning in weather and climate modeling


    Assessing the Potential of Deep Learning for Emulating Cloud Superparameterization in Climate Models with Real-Geography Boundary Conditions

    We explore the potential of feed-forward deep neural networks (DNNs) for emulating cloud superparameterization in realistic geography, using offline fits to data from the Super-Parameterized Community Atmosphere Model. To identify the network architecture of greatest skill, we formally optimize hyperparameters using ~250 trials. Our DNN explains over 70 percent of the temporal variance at the 15-minute sampling scale throughout the mid-to-upper troposphere. Comparing autocorrelation timescales against DNN skill suggests that the weaker fit in the tropical marine boundary layer is driven by the network's difficulty in emulating fast, stochastic signals in convection. However, spectral analysis in the temporal domain indicates skillful emulation of signals on diurnal to synoptic scales. A close look at the diurnal cycle reveals correct emulation of land-sea contrasts and vertical structure in the heating and moistening fields, but some distortion of precipitation. Sensitivity tests targeting precipitation skill reveal complementary effects of adding positive constraints and of hyperparameter tuning, motivating the use of both in the future. A first attempt to force an offline land model with DNN-emulated atmospheric fields produces reassuring results that further support the viability of neural network emulation in real-geography settings. Overall, the fit skill is competitive with recent attempts using sophisticated residual and convolutional neural network architectures trained on added information, including memory of past states. Our results confirm that superparameterized convection with continents can be parameterized through machine learning, and we highlight the advantages of casting this problem locally in space and time for accurate emulation and, hopefully, quick implementation of hybrid climate models.
    Comment: 32 pages, 13 figures; revised version submitted to Journal of Advances in Modeling Earth Systems, April 202
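
    To make the offline-fit evaluation concrete, the hedged sketch below fits a feed-forward network to synthetic column data and reports the fraction of variance explained at each output "level", loosely mimicking the paper's skill metric. The synthetic data, the tiny random hyperparameter search (the paper reports ~250 trials), and all names are assumptions of this sketch, not the authors' pipeline.

```python
# Hedged sketch: offline DNN emulation with a small random hyperparameter search
# and a per-output-level "fraction of variance explained" diagnostic. Synthetic
# data stands in for superparameterized model output.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n_samples, n_in, n_levels = 4000, 16, 10

X = rng.normal(size=(n_samples, n_in))                                   # column state / forcings
weights = rng.normal(size=(n_in, n_levels))
Y = np.tanh(X @ weights) + 0.2 * rng.normal(size=(n_samples, n_levels))  # "tendencies"

X_tr, X_te = X[:3000], X[3000:]
Y_tr, Y_te = Y[:3000], Y[3000:]

# Tiny random search over architectures (stand-in for a ~250-trial optimization).
best_r2, best_model = -np.inf, None
for trial in range(5):
    width = int(rng.choice([32, 64, 128]))
    depth = int(rng.choice([2, 3]))
    model = MLPRegressor(hidden_layer_sizes=(width,) * depth,
                         learning_rate_init=10 ** rng.uniform(-4, -2),
                         max_iter=300, random_state=trial)
    model.fit(X_tr, Y_tr)
    r2 = model.score(X_te, Y_te)
    if r2 > best_r2:
        best_r2, best_model = r2, model

# Fraction of variance explained at each "vertical level" of the output.
per_level_r2 = r2_score(Y_te, best_model.predict(X_te), multioutput="raw_values")
for lev, r2 in enumerate(per_level_r2):
    print(f"level {lev}: R^2 = {r2:.2f}")
```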

    Efficient Climate Simulation via Machine Learning Method

    Hybrid modeling, which combines data-driven techniques with numerical methods, is an emerging and promising research direction for efficient climate simulation. However, previous works lack practical platforms, making the development of hybrid models a challenging programming problem. Furthermore, the lack of standard datasets and evaluation metrics may prevent researchers from comprehensively comparing various algorithms under uniform conditions. To address these problems, we propose NeuroClim, a framework for hybrid modeling under a real-world scenario, i.e., a basic setting that simulates the real climate we live in. NeuroClim consists of three parts: (1) Platform. We develop a user-friendly platform, NeuroGCM, for efficiently developing hybrid models for climate simulation. (2) Dataset. We provide an open-source dataset for data-driven methods in hybrid modeling and investigate the characteristics of the data, namely heterogeneity and stiffness, which reveal the difficulty of regressing climate simulation data. (3) Metrics. We propose a methodology for quantitatively evaluating hybrid modeling, covering both the approximation ability of the machine learning models and their stability during simulation. We believe that NeuroClim allows researchers to work without a high level of climate-related expertise and to focus on machine learning algorithm design, which will accelerate hybrid modeling research at the intersection of AI and climate science. The code and data are released at https://github.com/x-w19/NeuroClim.
    Comment: Work in progress
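
    The two metric families named in the abstract (offline approximation ability and online stability) can be sketched in a few lines; the specific definitions below (RMSE and R^2 for approximation, a bounded-state horizon for stability) are illustrative assumptions, not necessarily NeuroClim's exact formulas.

```python
# Hedged sketch of two metric families: offline approximation skill of an ML
# component and stability of a hybrid simulation loop. Definitions here are
# illustrative, not NeuroClim's exact ones.
import numpy as np

def approximation_skill(y_true, y_pred):
    """Offline skill: RMSE and coefficient of determination (R^2)."""
    rmse = float(np.sqrt(np.mean((y_true - y_pred) ** 2)))
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return rmse, float(1.0 - ss_res / ss_tot)

def stability_horizon(step_fn, x0, n_steps, bound):
    """Online stability: number of steps before the hybrid state leaves a
    plausible bound or becomes non-finite."""
    x = np.asarray(x0, dtype=float)
    for n in range(n_steps):
        x = step_fn(x)
        if not np.all(np.isfinite(x)) or np.max(np.abs(x)) > bound:
            return n
    return n_steps

# Toy usage with a hypothetical hybrid step (damped dynamics plus ML correction).
rng = np.random.default_rng(0)
truth = rng.normal(size=1000)
pred = truth + 0.1 * rng.normal(size=1000)
print("approximation (RMSE, R^2):", approximation_skill(truth, pred))

toy_step = lambda x: 0.99 * x + 0.01 * np.tanh(x)      # stand-in hybrid model step
print("stable for", stability_horizon(toy_step, np.ones(8), 5000, bound=1e3), "steps")
```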

    Machine Learning for Stochastic Parameterization: Generative Adversarial Networks in the Lorenz '96 Model

    Stochastic parameterizations account for uncertainty in the representation of unresolved sub-grid processes by sampling from the distribution of possible sub-grid forcings. Some existing stochastic parameterizations utilize data-driven approaches to characterize uncertainty, but these approaches require significant structural assumptions that can limit their scalability. Machine learning models, including neural networks, are able to represent a wide range of distributions and build optimized mappings between a large number of inputs and sub-grid forcings. Recent research on machine learning parameterizations has focused only on deterministic parameterizations. In this study, we develop a stochastic parameterization using the generative adversarial network (GAN) machine learning framework. The GAN stochastic parameterization is trained and evaluated on output from the Lorenz '96 model, which is a common baseline model for evaluating both parameterization and data assimilation techniques. We evaluate different ways of characterizing the input noise for the model and perform model runs with the GAN parameterization at weather and climate timescales. Some of the GAN configurations perform better than a baseline bespoke parameterization at both timescales, and the networks closely reproduce the spatio-temporal correlations and regimes of the Lorenz '96 system. We also find that, in general, the models that produce skillful forecasts are also associated with the best climate simulations.
    Comment: Submitted to Journal of Advances in Modeling Earth Systems (JAMES)
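
    The training data for such a stochastic parameterization are pairs of resolved state and sub-grid forcing. The hedged sketch below generates those pairs from the standard two-scale Lorenz '96 model; the parameter values, subsampling choices, and variable names are assumptions, and the GAN itself is omitted.

```python
# Hedged sketch: generate (resolved state, sub-grid forcing) training pairs from
# the two-scale Lorenz '96 model. A GAN (omitted here) would then learn to sample
# the sub-grid forcing conditioned on the resolved state.
import numpy as np

K, J = 8, 32                 # resolved variables, unresolved variables per resolved one
F, h, b, c = 20.0, 1.0, 10.0, 10.0
DT = 0.001

def tendencies(x, y):
    coupling = -(h * c / b) * y.reshape(K, J).sum(axis=1)   # sub-grid forcing on X
    dx = np.roll(x, 1) * (np.roll(x, -1) - np.roll(x, 2)) - x + F + coupling
    dy = (-c * b * np.roll(y, -1) * (np.roll(y, -2) - np.roll(y, 1))
          - c * y + (h * c / b) * np.repeat(x, J))
    return dx, dy, coupling

def rk4_step(x, y):
    k1x, k1y, _ = tendencies(x, y)
    k2x, k2y, _ = tendencies(x + 0.5 * DT * k1x, y + 0.5 * DT * k1y)
    k3x, k3y, _ = tendencies(x + 0.5 * DT * k2x, y + 0.5 * DT * k2y)
    k4x, k4y, _ = tendencies(x + DT * k3x, y + DT * k3y)
    return (x + DT / 6 * (k1x + 2 * k2x + 2 * k3x + k4x),
            y + DT / 6 * (k1y + 2 * k2y + 2 * k3y + k4y))

rng = np.random.default_rng(0)
x, y = F * np.ones(K) + rng.normal(size=K), 0.1 * rng.normal(size=K * J)

states, forcings = [], []
for n in range(20000):
    x, y = rk4_step(x, y)
    if n % 10 == 0:                       # subsample the trajectory
        _, _, u = tendencies(x, y)
        states.append(x.copy())
        forcings.append(u.copy())

# Each row pairs a resolved value X_k with the sub-grid forcing acting on it;
# a conditional generative model would be trained on these (state, forcing) pairs.
states, forcings = np.array(states).ravel(), np.array(forcings).ravel()
print("training pairs:", states.shape[0], "corr(X, forcing) =",
      round(float(np.corrcoef(states, forcings)[0, 1]), 2))
```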