Using Machine Learning for Model Physics: an Overview
In this overview, a generic mathematical object (a mapping) is introduced,
and its relation to model physics parameterization is explained. Machine
learning (ML) tools that can be used to emulate and/or approximate mappings
are introduced. Applications of ML to emulate existing parameterizations, to
develop new parameterizations, to enforce physical constraints, and to
control the accuracy of the developed applications are described. Some ML
approaches that allow developers to go beyond the standard parameterization
paradigm are discussed.
Comment: 50 pages, 3 figures, 1 table
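To make the idea of emulating a mapping concrete, here is a minimal sketch (not from the overview) in which a random-Fourier-feature regression emulates an invented smooth one-dimensional "parameterization"; the target function and all hyperparameters are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented target "parameterization": a smooth nonlinear mapping to emulate.
def target(x):
    return np.sin(2 * np.pi * x) + 0.5 * np.cos(4 * np.pi * x)

# Training set sampled from the mapping's input domain.
x_train = rng.uniform(0.0, 1.0, size=500)
y_train = target(x_train)

# Random Fourier features: a simple statistical-learning emulator.
n_feat = 200
W = rng.normal(0.0, 10.0, size=n_feat)
b = rng.uniform(0.0, 2 * np.pi, size=n_feat)

def features(x):
    return np.cos(np.outer(x, W) + b)

# Fit linear weights on the features by least squares.
coef, *_ = np.linalg.lstsq(features(x_train), y_train, rcond=None)

def emulator(x):
    return features(x) @ coef

# Evaluate emulation accuracy on held-out points.
x_test = rng.uniform(0.0, 1.0, size=200)
rmse = np.sqrt(np.mean((emulator(x_test) - target(x_test)) ** 2))
print(f"emulation RMSE: {rmse:.4f}")
```

Once fitted, the emulator is a fixed feature transform plus a matrix product, which is the source of the speed-ups such surrogates offer over the original physics code.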
Operational solar forecasting for the real-time market
Despite the significant progress made in solar forecasting over the last decade, most of the proposed models cannot be readily used by independent system operators (ISOs). This article proposes an operational solar forecasting algorithm that is closely aligned with the real-time market (RTM) forecasting requirements of the California ISO (CAISO). The algorithm first uses the North American Mesoscale (NAM) forecast system to generate hourly forecasts for a 5-h period, issued 12 h before the actual operating hour, satisfying the lead-time requirement. Subsequently, the world's fastest similarity search algorithm is adopted to downscale the hourly forecasts generated by NAM to a 15-min resolution, satisfying the forecast-resolution requirement. The 5-h-ahead forecasts are repeated every hour, following the actual rolling update rate of CAISO. Both deterministic and probabilistic forecasts generated using the proposed algorithm are empirically evaluated over a period of 2 years at 7 locations in 5 climate zones.
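The similarity-search downscaling step can be illustrated with an analog-search toy; the abstract does not name the search library, so this sketch uses a brute-force nearest-neighbor lookup over a synthetic archive, with all data and the `k = 5` choice invented for the example:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical archive: for each past hour, the NWP hourly forecast (feature)
# and the observed 15-min irradiance profile within that hour (4 values).
n_hist = 1000
hist_hourly = rng.uniform(0.0, 1000.0, size=n_hist)        # W/m^2
ramp = rng.normal(0.0, 20.0, size=(n_hist, 4))
hist_profiles = hist_hourly[:, None] + ramp                # 15-min obs

def downscale(hourly_forecast, k=5):
    """Analog (similarity-search) downscaling: average the 15-min profiles
    of the k most similar historical hourly forecasts."""
    dist = np.abs(hist_hourly - hourly_forecast)
    idx = np.argsort(dist)[:k]                             # brute-force search
    return hist_profiles[idx].mean(axis=0)

profile = downscale(600.0)
print(profile.shape)  # one 15-min profile (4 sub-hourly values) per hour
```

An optimized similarity-search index would replace the `argsort` line; the analog principle (reuse observed sub-hourly structure from similar past forecasts) is the same.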
On the role of pre- and post-processing in environmental data mining
The quality of discovered knowledge is highly dependent on data quality. Unfortunately, real data tend to contain noise, uncertainty, errors, redundancies, or even irrelevant information. The more complex the reality to be analyzed, the higher the risk of getting low-quality data. Knowledge Discovery from Databases (KDD) offers a global framework for preparing data in the right form to perform correct analyses. On the other hand, the quality of decisions taken upon KDD results depends not only on the quality of the results themselves, but also on the capacity of the system to communicate those results in an understandable form. Environmental systems are particularly complex, and environmental users particularly require clarity in their results. In this paper, some details about how this can be achieved are provided. The role of pre- and post-processing in the whole process of Knowledge Discovery in environmental systems is discussed.
APPLICATION OF NEURAL NETWORKS TO EMULATION OF RADIATION PARAMETERIZATIONS IN GENERAL CIRCULATION MODELS
A novel approach based on using neural network (NN) techniques for the approximation of physical components of complex environmental systems has been applied and further developed in this dissertation. A new type of numerical model, a complex hybrid environmental model based on a combination of deterministic and statistical-learning model components, has been explored. Conceptual and practical aspects of developing hybrid models have been formalized as a methodology for applications to climate modeling and numerical weather prediction. The approach uses NNs as a machine (statistical) learning technique to develop highly accurate and fast emulations of model physics components/parameterizations. The NN emulations of the most time-consuming model physics components, the long- and short-wave radiation (LWR and SWR) parameterizations, have been combined with the remaining deterministic components of a general circulation model (GCM) to constitute a hybrid GCM (HGCM). The parallel GCM and HGCM simulations produce very similar results, but the HGCM is significantly faster. The high accuracy, which is of paramount importance for the approach, and the speed-up of model calculations when using NN emulations open an opportunity for model improvement. This includes using extended NN ensembles and/or more frequent calculations of the full model radiation, resulting in an improved radiation-cloud interaction and better consistency with model dynamics and other model physics components.
First, the approach was successfully applied to a moderate-resolution (T42L26) uncoupled NCAR Community Atmospheric Model driven by climatological SST in a decadal climate simulation mode. It was then further developed and subsequently implemented into a coupled GCM, the NCEP Climate Forecast System, with significantly higher resolution (T126L64) and time-dependent CO2, and tested for decadal climate simulations, seasonal prediction, and short- to medium-term forecasts.
The developed highly accurate NN emulations of radiation parameterizations are on average one to two orders of magnitude faster than the original radiation parameterizations. The NN approach was extended by the introduction of NN ensembles and of a compound parameterization with quality control of larger errors.
The applicability of other statistical learning techniques, such as approximate nearest-neighbor techniques and random trees, to the emulation of model physics has also been explored.
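The compound parameterization with quality control can be illustrated with a toy sketch; the `tanh` stand-in for the radiation code, the polynomial stand-in for the NN emulation, and the physical-bounds check are all invented for the example:

```python
import numpy as np

# Expensive "original parameterization" (stand-in for the radiation code).
def radiation_original(x):
    return np.tanh(3.0 * x)

# Cheap emulator: a polynomial fit over the training range [-1, 1]
# (stand-in for the trained NN emulation).
x_train = np.linspace(-1.0, 1.0, 200)
coeffs = np.polyfit(x_train, radiation_original(x_train), deg=9)

def radiation_emulated(x):
    return np.polyval(coeffs, x)

def radiation_compound(x, lo=-1.0, hi=1.0):
    """Compound parameterization with quality control: use the fast
    emulator, but fall back to the original wherever the emulator's
    output violates the known physical bounds of the quantity."""
    y = radiation_emulated(x)
    bad = (y < lo) | (y > hi)              # QC flag for out-of-bounds output
    y = np.where(bad, radiation_original(x), y)
    return y, bad

x = np.array([-1.5, -0.3, 0.0, 0.4, 2.0])  # includes out-of-range inputs
y, flagged = radiation_compound(x)
print(flagged)  # True wherever QC rejected the emulator
```

Inside the training range the cheap emulator is used almost always; the quality control catches the rare large errors (here, extrapolation) at the cost of occasionally calling the slow original code.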
Applying machine learning to improve simulations of a chaotic dynamical system using empirical error correction
Dynamical weather and climate prediction models underpin many studies of the
Earth system and hold the promise of being able to make robust projections of
future climate change based on physical laws. However, simulations from these
models still show many differences compared with observations. Machine learning
has been applied to solve certain prediction problems with great success, and
recently it has been proposed that this could replace the role of
physically-derived dynamical weather and climate models to give better quality
simulations. Here, instead, a framework using machine learning together with
physically-derived models is tested, in which it is learnt how to correct the
errors of the latter from timestep to timestep. This maintains the physical
understanding built into the models, whilst allowing performance improvements,
and also requires much simpler algorithms and less training data. This is
tested in the context of simulating the chaotic Lorenz '96 system, and it is
shown that the approach yields models that are stable and that give both
improved skill in initialised predictions and better long-term climate
statistics. Improvements in long-term statistics are smaller than for single
time-step tendencies, however, indicating that it would be valuable to develop
methods that target improvements on longer time scales. Future strategies for
the development of this approach and possible applications to making progress
on important scientific problems are discussed.
Comment: 26 pages, 7 figures. To be published in Journal of Advances in Modeling Earth Systems
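The timestep-to-timestep correction framework can be sketched on the Lorenz '96 system itself. Everything below is an illustrative assumption rather than the paper's setup: the model error is a wrong forcing value, and the learned correction is a simple affine least-squares fit of one-step errors on the state:

```python
import numpy as np

N, F_TRUE, F_MODEL, DT = 8, 8.0, 6.0, 0.05  # illustrative choices

def l96(x, forcing):
    # Lorenz '96 tendency: dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

def step(x, forcing):
    # One RK4 step.
    k1 = l96(x, forcing)
    k2 = l96(x + 0.5 * DT * k1, forcing)
    k3 = l96(x + 0.5 * DT * k2, forcing)
    k4 = l96(x + DT * k3, forcing)
    return x + DT / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

rng = np.random.default_rng(3)
x = F_TRUE + 0.1 * rng.standard_normal(N)
for _ in range(500):                 # spin up the "true" system
    x = step(x, F_TRUE)

# Collect (state, one-step error of the imperfect model) training pairs.
states, errors = [], []
for _ in range(2000):
    x_next = step(x, F_TRUE)
    errors.append(x_next - step(x, F_MODEL))
    states.append(x)
    x = x_next
S = np.hstack([np.array(states), np.ones((len(states), 1))])  # affine features
E = np.array(errors)

# Learn a state-dependent correction by least squares.
coef, *_ = np.linalg.lstsq(S, E, rcond=None)

def corrected_step(x):
    # Imperfect physical model plus learned empirical error correction.
    return step(x, F_MODEL) + np.append(x, 1.0) @ coef

# Compare one-step errors on fresh states.
err_raw, err_cor = [], []
for _ in range(200):
    x_next = step(x, F_TRUE)
    err_raw.append(np.linalg.norm(x_next - step(x, F_MODEL)))
    err_cor.append(np.linalg.norm(x_next - corrected_step(x)))
    x = x_next
print(np.mean(err_raw), np.mean(err_cor))
```

Because the correction is applied each timestep on top of the dynamical model, the physical core is kept intact, which is the point the abstract makes against replacing the model outright.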