
    Applying machine learning to improve simulations of a chaotic dynamical system using empirical error correction

    Dynamical weather and climate prediction models underpin many studies of the Earth system and hold the promise of making robust projections of future climate change based on physical laws. However, simulations from these models still show many differences compared with observations. Machine learning has been applied to certain prediction problems with great success, and it has recently been proposed that it could replace the role of physically derived dynamical weather and climate models to give better-quality simulations. Here, instead, a framework using machine learning together with physically derived models is tested, in which it is learnt how to correct the errors of the latter from timestep to timestep. This maintains the physical understanding built into the models, whilst allowing performance improvements, and also requires much simpler algorithms and less training data. The approach is tested by simulating the chaotic Lorenz '96 system, and it is shown to yield models that are stable and that give both improved skill in initialised predictions and better long-term climate statistics. Improvements in long-term statistics are smaller than for single time-step tendencies, however, indicating that it would be valuable to develop methods that target improvements on longer time scales. Future strategies for the development of this approach and possible applications to important scientific problems are discussed. Comment: 26 pages, 7 figures. To be published in Journal of Advances in Modeling Earth Systems.
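The timestep-to-timestep correction idea can be illustrated on the single-scale Lorenz '96 system. The sketch below is a minimal illustration, not the paper's method: the "truth" uses forcing F = 8, the imperfect model uses a biased forcing F = 7, and a simple pooled linear regression (slope plus bias) is fit to the imperfect model's one-step errors. All parameter values here are assumptions for the demo.

```python
import numpy as np

def l96_tendency(x, F):
    # Lorenz '96: dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def step_rk4(x, dt, F):
    k1 = l96_tendency(x, F)
    k2 = l96_tendency(x + 0.5 * dt * k1, F)
    k3 = l96_tendency(x + 0.5 * dt * k2, F)
    k4 = l96_tendency(x + dt * k3, F)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

dt, F_true, F_model = 0.05, 8.0, 7.0          # illustrative values
rng = np.random.default_rng(0)
x = F_true + rng.standard_normal(40)
for _ in range(500):                          # spin up onto the attractor
    x = step_rk4(x, dt, F_true)

# Collect one-step errors of the imperfect model along a truth trajectory.
states, errors = [], []
for _ in range(2000):
    x_next = step_rk4(x, dt, F_true)
    states.append(x.copy())
    errors.append(x_next - step_rk4(x, dt, F_model))
    x = x_next
X = np.array(states).reshape(-1, 1)
E = np.array(errors).reshape(-1, 1)

# Learn a state-dependent correction: here a pooled linear fit with bias term.
A = np.hstack([X, np.ones_like(X)])
coef = np.linalg.lstsq(A, E, rcond=None)[0]   # shape (2, 1): slope and bias

def corrected_step(x):
    return step_rk4(x, dt, F_model) + coef[0, 0] * x + coef[1, 0]

# Held-out comparison: corrected one-step forecasts vs. the raw biased model.
err_raw = err_cor = 0.0
for _ in range(200):
    x_next = step_rk4(x, dt, F_true)
    err_raw += np.mean((step_rk4(x, dt, F_model) - x_next) ** 2)
    err_cor += np.mean((corrected_step(x) - x_next) ** 2)
    x = x_next
print(f"one-step MSE raw: {err_raw / 200:.5f}  corrected: {err_cor / 200:.5f}")
```

Because the model error here is dominated by the forcing bias, even the bias term of the regression removes most of the one-step error, which is the "simpler algorithms and less training data" point made in the abstract.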

    The benefits of global high-resolution for climate simulation: process-understanding and the enabling of stakeholder decisions at the regional scale

    A perspective on current and future capabilities in global high-resolution climate simulation for assessing climate risks over the next few decades, including advances in process representation and analysis, justifying the emergence of dedicated, coordinated experimental protocols. The timescales of the Paris Climate Agreement indicate that urgent action is required on climate policies over the next few decades in order to avoid the worst risks posed by climate change. On these relatively short timescales, climate variability and climate change are both key drivers of extreme events, with decadal timescales also important for infrastructure planning. Hence, to assess climate risk on such timescales, we require climate models that can represent key aspects of both internally driven climate variability and the response to changing forcings. In this paper we argue that we now have the modelling capability to address these requirements - specifically with global models having horizontal resolutions considerably enhanced from those typically used in previous IPCC and CMIP exercises. The improved representation of weather and climate processes in such models underpins our enhanced confidence in predictions and projections, as well as providing improved forcing to regional models, which are better able to represent local-scale extremes (such as convective precipitation). We choose the global water cycle as an illustrative example because it is governed by a chain of processes for which there is growing evidence of the benefits of higher resolution. At the same time, it comprises key processes involved in many of the expected future climate extremes (e.g. flooding, drought, tropical and mid-latitude storms).

    Stochastic and Statistical Methods in Climate, Atmosphere, and Ocean Science

    Introduction The behavior of the atmosphere, oceans, and climate is intrinsically uncertain. The basic physical principles that govern atmospheric and oceanic flows are well known, for example, the Navier-Stokes equations for fluid flow, the thermodynamic properties of moist air, and the effects of density stratification and the Coriolis force. Notwithstanding, there are major sources of randomness and uncertainty that prevent perfect prediction and complete understanding of these flows. The climate system involves a wide spectrum of space and time scales, ranging from processes occurring on the order of microns and milliseconds, such as the formation of cloud and rain droplets, to global phenomena involving annual and decadal oscillations, such as the El Niño-Southern Oscillation (ENSO) and the Pacific Decadal Oscillation (PDO) [5]. Moreover, climate records display a spectral variability ranging from 1 cycle per month to 1 cycle per 100,000 years [23]. The complexity of the climate system stems in large part from the inherent nonlinearities of fluid mechanics and the phase changes of water substances. The atmosphere and oceans are turbulent, nonlinear systems that display chaotic behavior (e.g., [39]). The time evolutions of the same chaotic system starting from two slightly different initial states diverge exponentially fast, so that chaotic systems are marked by limited predictability. Beyond the so-called predictability horizon (on the order of 10 days for the atmosphere), initial-state uncertainties (e.g., due to imperfect observations) have grown to the point that straightforward forecasts are no longer useful. Another major source of uncertainty stems from the fact that numerical models for atmospheric and oceanic flows cannot describe all relevant physical processes at once.
These models are in essence discretized partial differential equations (PDEs), and the derivation of suitable PDEs (e.g., the so-called primitive equations) from more general ones that are less convenient for computation (e.g., the full Navier-Stokes equations) involves approximations and simplifications that introduce errors in the equations. Furthermore, as a result of spatial discretization of the PDEs, numerical models have finite resolution, so that small-scale processes with length scales below the model grid scale are not resolved. These limitations are unavoidable, leading to model error and uncertainty. The uncertainties due to chaotic behavior and unresolved processes motivate the use of stochastic and statistical methods for modeling and understanding climate, atmosphere, and oceans. Models can be augmented with random elements in order to represent time-evolving uncertainties, leading to stochastic models. Weather forecasts and climate predictions are increasingly expressed in probabilistic terms, making explicit the margins of uncertainty inherent to any prediction.
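The exponential divergence of nearby initial states described above can be demonstrated in a few lines with the Lorenz (1963) system, a standard minimal example of atmospheric chaos. The sketch below uses Lorenz's classical parameter values; the step size and integration length are choices for the demo.

```python
import numpy as np

def lorenz63(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # Lorenz (1963) equations with the classical chaotic parameters.
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def step_rk4(s, dt=0.01):
    k1 = lorenz63(s)
    k2 = lorenz63(s + 0.5 * dt * k1)
    k3 = lorenz63(s + 0.5 * dt * k2)
    k4 = lorenz63(s + dt * k3)
    return s + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

a = np.array([1.0, 1.0, 1.0])
for _ in range(1000):                  # spin up onto the attractor
    a = step_rk4(a)

b = a + np.array([1e-9, 0.0, 0.0])     # perturb by one part in a billion
d0 = np.linalg.norm(a - b)
for _ in range(3000):                  # integrate both copies for 30 time units
    a, b = step_rk4(a), step_rk4(b)
d1 = np.linalg.norm(a - b)
print(f"initial separation {d0:.1e}, final separation {d1:.1e}")
```

Beyond a finite horizon the two trajectories are as far apart as two randomly chosen states on the attractor, which is the mechanism behind the roughly 10-day atmospheric predictability horizon mentioned above.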

    Machine Learning for Stochastic Parameterization: Generative Adversarial Networks in the Lorenz '96 Model

    Stochastic parameterizations account for uncertainty in the representation of unresolved sub-grid processes by sampling from the distribution of possible sub-grid forcings. Some existing stochastic parameterizations use data-driven approaches to characterize uncertainty, but these approaches require significant structural assumptions that can limit their scalability. Machine learning models, including neural networks, are able to represent a wide range of distributions and build optimized mappings between a large number of inputs and sub-grid forcings. Recent research on machine learning parameterizations has focused only on deterministic parameterizations. In this study, we develop a stochastic parameterization using the generative adversarial network (GAN) machine learning framework. The GAN stochastic parameterization is trained and evaluated on output from the Lorenz '96 model, which is a common baseline model for evaluating both parameterization and data assimilation techniques. We evaluate different ways of characterizing the input noise for the model and perform model runs with the GAN parameterization at weather and climate timescales. Some of the GAN configurations perform better than a baseline bespoke parameterization at both timescales, and the networks closely reproduce the spatio-temporal correlations and regimes of the Lorenz '96 system. We also find that, in general, those models which produce skillful forecasts are also associated with the best climate simulations. Comment: Submitted to Journal of Advances in Modeling Earth Systems (JAMES).
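For intuition, the sketch below shows the simplest data-driven stochastic parameterization of the kind a GAN generalizes: a polynomial fit for the conditional mean of the sub-grid forcing plus Gaussian residual noise, the classic baseline family such studies compare against. The training data here are synthetic and every coefficient is invented for the demo; nothing is taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "training data": resolved state x and diagnosed sub-grid forcing U.
# The cubic mean and noise level are illustrative stand-ins for the coupling
# diagnosed from the two-scale Lorenz '96 model.
x = rng.uniform(-8.0, 12.0, 5000)
U = 0.3 + 1.2 * x - 0.01 * x**2 - 0.005 * x**3 + 0.8 * rng.standard_normal(x.size)

# Deterministic part: cubic polynomial fit of the conditional mean.
coeffs = np.polyfit(x, U, 3)
sigma = (U - np.polyval(coeffs, x)).std()     # residual spread

def sample_subgrid_forcing(x_state, rng):
    """Stochastic parameterization: fitted mean plus Gaussian residual."""
    return np.polyval(coeffs, x_state) + sigma * rng.standard_normal(np.shape(x_state))

samples = sample_subgrid_forcing(x, rng)
print(f"data std {U.std():.3f}  sampled std {samples.std():.3f}")
```

A GAN replaces both the fixed polynomial form and the Gaussian residual with a learned generator, so that state-dependent and non-Gaussian forcing distributions can be captured without hand-chosen structural assumptions.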

    Uncertainty partition challenges the predictability of vital details of climate change

    Decision makers and consultants are particularly interested in “detailed” information on future climate to prepare adaptation strategies and adjust design criteria. Projections of future climate at local spatial scales and fine temporal resolutions are subject to the same uncertainties as those at the global scale, but the partition among uncertainty sources (emission scenarios, climate models, and internal climate variability) remains largely unquantified. At the local scale, the uncertainty of the mean and extremes of precipitation is shown to be irreducible for mid- and end-of-century projections because it is almost entirely caused by internal climate variability (stochasticity). Conversely, projected changes in mean air temperature and other meteorological variables can be largely constrained, even at local scales, if more accurate emission scenarios can be developed. The results were obtained by applying a comprehensive stochastic downscaling technique to climate model outputs for three exemplary locations. In contrast with earlier studies, the three sources of uncertainty are considered as dependent and, therefore, non-additive. The evidence of the predominant role of internal climate variability leaves little room for uncertainty reduction in precipitation projections; however, the inference is not necessarily negative, because the uncertainty of historic observations is almost as large as that for future projections, with direct implications for climate change adaptation measures.
    Key Points: Uncertainties of climate change projections at high spatial and temporal resolution are analyzed. Uncertainty cannot be reduced in precipitation projections and for extremes. Uncertainty in air temperature can be potentially constrained with refined emission scenarios.
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/122428/1/eft2122_am.pdf
    http://deepblue.lib.umich.edu/bitstream/2027.42/122428/2/eft2122.pd
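The paper's key methodological point is that the three uncertainty sources are treated as dependent and non-additive. For intuition only, the toy calculation below uses the simpler additive (ANOVA-style) partition on a synthetic projection ensemble in which internal variability is made to dominate, as it does for local precipitation; all magnitudes are invented for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy ensemble of projected local changes: 3 emission scenarios x 5 climate
# models x 20 internal-variability realizations (all magnitudes invented).
scenario = np.array([0.0, 0.5, 1.0])                 # scenario signal
model = rng.normal(0.0, 0.4, 5)                      # model-to-model spread
proj = (scenario[:, None, None] + model[None, :, None]
        + rng.normal(0.0, 2.0, (3, 5, 20)))          # internal variability

total = proj.var()
v_scenario = proj.mean(axis=(1, 2)).var()            # variance across scenarios
v_model = proj.mean(axis=(0, 2)).var()               # variance across models
v_internal = total - v_scenario - v_model            # additive remainder
print(f"fraction from internal variability: {v_internal / total:.2f}")
```

When this fraction dominates, better models or sharper scenarios cannot shrink the projection spread much, which is the "irreducible uncertainty" conclusion the abstract draws for local precipitation.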

    Earth System Modeling 2.0: A Blueprint for Models That Learn From Observations and Targeted High-Resolution Simulations

    Climate projections continue to be marred by large uncertainties, which originate in processes that need to be parameterized, such as clouds, convection, and ecosystems. But rapid progress is now within reach. New computational tools and methods from data assimilation and machine learning make it possible to integrate global observations and local high-resolution simulations in an Earth system model (ESM) that systematically learns from both. Here we propose a blueprint for such an ESM. We outline how parameterization schemes can learn from global observations and targeted high-resolution simulations, for example, of clouds and convection, through matching low-order statistics between ESMs, observations, and high-resolution simulations. We illustrate learning algorithms for ESMs with a simple dynamical system that shares characteristics of the climate system, and we discuss the opportunities the proposed framework presents and the challenges that remain to realize it. Comment: 32 pages, 3 figures.
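The learning-from-low-order-statistics idea can be sketched on the Lorenz '96 system: pretend the forcing F is an unknown parameterization parameter, take a time-averaged statistic of a long "truth" run as the observation, and select the candidate F whose simulated statistic matches it. This is a toy stand-in for the paper's proposal, with the statistic, grid of candidates, and run lengths all assumed for the demo.

```python
import numpy as np

def l96_tendency(x, F):
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def climate_statistic(F, steps=4000, dt=0.05, seed=0):
    """Low-order 'climate' statistic: the time mean of x_i^2 over a long run."""
    x = 8.0 + np.random.default_rng(seed).standard_normal(40)
    acc = []
    for i in range(steps):
        k1 = l96_tendency(x, F)
        k2 = l96_tendency(x + 0.5 * dt * k1, F)
        k3 = l96_tendency(x + 0.5 * dt * k2, F)
        k4 = l96_tendency(x + dt * k3, F)
        x = x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
        if i > steps // 2:                     # discard spin-up
            acc.append(np.mean(x**2))
    return np.mean(acc)

target = climate_statistic(F=8.0, seed=1)      # "observed" statistic from truth
candidates = np.linspace(5.0, 11.0, 13)        # parameter values to test
losses = [(climate_statistic(F) - target) ** 2 for F in candidates]
F_hat = candidates[int(np.argmin(losses))]
print(f"recovered forcing: {F_hat}")
```

The blueprint envisions doing this matching with data-assimilation and machine-learning machinery over many parameters and statistics at once, rather than a brute-force scan over a single parameter as here.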

    Assessment of a stochastic downscaling methodology in generating an ensemble of hourly future climate time series

    This study extends a stochastic downscaling methodology to the generation of an ensemble of hourly time series of meteorological variables that express possible future climate conditions at a point scale. The stochastic downscaling uses general circulation model (GCM) realizations and an hourly weather generator, the Advanced WEather GENerator (AWE-GEN). Marginal distributions of factors of change are computed for several climate statistics using a Bayesian methodology that can weight GCM realizations based on each model's relative performance with respect to the historical climate and the degree of disagreement in projecting future conditions. A Monte Carlo technique is used to sample the factors of change from their respective marginal distributions. As a comparison with traditional approaches, factors of change are also estimated by averaging GCM realizations. With either approach, the derived factors of change are applied to the climate statistics inferred from historical observations to re-evaluate the parameters of the weather generator. The re-parameterized generator yields hourly time series of meteorological variables that can be considered representative of future climate conditions. In this study, the time series are generated in an ensemble mode to fully reflect the uncertainty of GCM projections, climate stochasticity, and uncertainties of the downscaling procedure. Applications of the methodology in reproducing future climate conditions for the periods 2000-2009, 2046-2065 and 2081-2100, using the period 1962-1992 as the historical baseline, are discussed for the location of Firenze (Italy). The inferences of the methodology for the period 2000-2009 are tested against observations to assess the reliability of the stochastic downscaling procedure in reproducing statistics of meteorological variables at different time scales.
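The factor-of-change step reduces to a few lines. In the sketch below the historical statistic, the GCM-derived factors, and their Bayesian weights are all invented placeholder numbers; only the mechanics (a single weighted-average factor versus Monte Carlo sampling of factors) mirror the methodology described above.

```python
import numpy as np

rng = np.random.default_rng(0)

hist_mean = 0.12                                     # historical statistic, e.g. mm/h
factors = np.array([1.05, 0.90, 1.20, 1.10, 0.95])   # future/historical ratios
weights = np.array([0.30, 0.10, 0.20, 0.25, 0.15])   # Bayesian model weights

# Traditional approach: collapse the GCMs into one weighted-average factor.
fc_avg = np.average(factors, weights=weights)

# Ensemble approach: Monte Carlo sampling keeps the GCM uncertainty alive;
# each sampled factor would re-parameterize one weather-generator member.
sampled = rng.choice(factors, size=1000, p=weights)
future_means = hist_mean * sampled
print(f"single factor: {hist_mean * fc_avg:.4f}  "
      f"ensemble: {future_means.mean():.4f} +/- {future_means.std():.4f}")
```

The ensemble of re-parameterized generators then produces the spread of hourly time series from which future-climate statistics and their uncertainties are inferred.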

    Examining chaotic convection with super-parameterization ensembles

    Spring 2017. Includes bibliographical references. This study investigates a variety of features present in a new configuration of the Community Atmosphere Model (CAM) variant, SP-CAM 2.0. The new configuration (multiple-parameterization CAM, MP-CAM) changes the manner in which the super-parameterization (SP) concept represents physical tendency feedbacks to the large scale by using the mean of 10 independent two-dimensional cloud-permitting model (CPM) curtains in each global model column instead of the conventional single CPM curtain. The climates of the SP and MP configurations are examined to investigate any significant differences caused by the application of convective physical tendencies that are more deterministic in nature, paying particular attention to extreme precipitation events and large-scale weather systems such as the Madden-Julian Oscillation (MJO). A number of small but significant changes in the mean-state climate are uncovered, and it is found that the new formulation degrades MJO performance. Despite these deficiencies, the ensemble of possible realizations of convective states in the MP configuration allows for analysis of uncertainty in the small-scale solution, enabling examination of the weather regimes and physical mechanisms associated with strong, chaotic convection. Methods of quantifying precipitation predictability are explored, and use of the most reliable of these leads to the conclusion that poor precipitation predictability is most directly related to the proximity of the global climate model column state to atmospheric critical points. Secondarily, the predictability is tied to the availability of potential convective energy, the presence of mesoscale convective organization on the CPM grid, and the directive power of the large scale.
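The statistical effect of replacing one CPM curtain by the mean of 10 can be illustrated with a toy tendency model. The signal and noise magnitudes below are invented; the point is only the roughly sqrt(10) reduction in tendency noise that makes the MP feedback "more deterministic":

```python
import numpy as np

rng = np.random.default_rng(0)

def cpm_tendency(n, rng):
    """Stand-in for n CPM curtains' convective tendencies in one column:
    a shared large-scale signal plus independent convective noise."""
    return 1.5 + 0.8 * rng.standard_normal(n)

# SP: feed back a single curtain; MP: feed back the mean of 10 curtains.
sp = np.array([cpm_tendency(1, rng)[0] for _ in range(5000)])
mp = np.array([cpm_tendency(10, rng).mean() for _ in range(5000)])
print(f"tendency noise ratio SP/MP: {sp.std() / mp.std():.2f}")
```

The 10 curtains also provide, at no extra cost, the ensemble of convective realizations that the study exploits to quantify small-scale uncertainty and precipitation predictability.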