Earth System Modeling 2.0: A Blueprint for Models That Learn From Observations and Targeted High-Resolution Simulations
Climate projections continue to be marred by large uncertainties, which
originate in processes that need to be parameterized, such as clouds,
convection, and ecosystems. But rapid progress is now within reach. New
computational tools and methods from data assimilation and machine learning
make it possible to integrate global observations and local high-resolution
simulations in an Earth system model (ESM) that systematically learns from
both. Here we propose a blueprint for such an ESM. We outline how
parameterization schemes can learn from global observations and targeted
high-resolution simulations, for example, of clouds and convection, through
matching low-order statistics between ESMs, observations, and high-resolution
simulations. We illustrate learning algorithms for ESMs with a simple dynamical
system that shares characteristics with the climate system, and we discuss the
opportunities the proposed framework presents and the challenges that remain to
realize it.
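The statistics-matching idea can be sketched on a toy surrogate. The AR(1) process below stands in for the "simple dynamical system" of the abstract; the parameter values, the coarse grid search, and the choice of variance and lag-1 autocorrelation as the low-order statistics are illustrative assumptions, not the paper's actual learning scheme.

```python
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(0)

def simulate(a, sigma, noise):
    # AR(1) surrogate "climate": x[t] = a * x[t-1] + sigma * noise[t]
    return lfilter([1.0], [1.0, -a], sigma * noise)

# synthetic "observations" from the true (here hidden) system
a_true, sigma_true = 0.8, 0.5
obs = simulate(a_true, sigma_true, rng.standard_normal(50_000))

def low_order_stats(x):
    # variance and lag-1 autocorrelation: the low-order statistics to match
    return np.array([x.var(), np.corrcoef(x[:-1], x[1:])[0, 1]])

target = low_order_stats(obs)

# common random numbers keep the mismatch surface smooth across the grid
model_noise = rng.standard_normal(20_000)

def mismatch(a, sigma):
    return np.sum((low_order_stats(simulate(a, sigma, model_noise)) - target) ** 2)

# coarse grid search over the parameters (a stand-in for a real learning step)
grid_a = np.linspace(0.5, 0.95, 10)
grid_s = np.linspace(0.2, 0.8, 7)
_, a_hat, s_hat = min((mismatch(a, s), a, s) for a in grid_a for s in grid_s)
```

In a real ESM the grid search would be replaced by a gradient-based or ensemble optimizer, but the structure is the same: parameters are scored only through mismatches in time-averaged statistics, never through instantaneous trajectories.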
Estimation of the caesium-137 source term from the Fukushima Daiichi nuclear power plant using a consistent joint assimilation of air concentration and deposition observations
Inverse modelling techniques can be used to estimate the amount of radionuclides released into the atmosphere during the March 2011 accident at the Fukushima Daiichi nuclear power plant, as well as the temporal profile of the source term. In Winiarek et al. (2012b), lower bounds of the caesium-137 and iodine-131 source terms were estimated with such techniques using activity concentration measurements, and the importance of an objective assessment of the prior errors (the observation errors and the background errors) was emphasised for a reliable inversion. In such a critical context, where the meteorological conditions can render the source term partly unobservable and where only a few observations are available, such prior estimation techniques are mandatory, since the retrieved source term is very sensitive to this estimation. We propose to extend these techniques to the estimation of prior errors when assimilating observations from several data sets. The aim is to estimate the caesium-137 source term jointly from all available data on this radionuclide: activity concentrations in the air, daily fallout measurements, and total cumulated fallout measurements. It is crucial to estimate, properly and simultaneously, the background errors and the prior errors relative to each data set. A proper estimation of prior errors is also a necessary condition for a reliable estimate of the a posteriori uncertainty of the source term. Using these techniques, we retrieve a total released quantity of caesium-137 in the interval 11.6–19.3 PBq, with an estimated standard deviation of 15–20% depending on the method and the data sets. The "blind" time intervals of the source term have also been strongly mitigated compared with the first estimations based only on activity concentration data.
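The joint estimation idea can be sketched in a heavily simplified linear Gaussian setting. The random observation operator, the single scalar variance per error source, and all numerical values below are illustrative assumptions; the real problem uses an atmospheric transport model and distinct error statistics per data set.

```python
import numpy as np

rng = np.random.default_rng(1)

# synthetic linear problem y = H x + noise (a stand-in for a source-receptor model)
n_obs, n_src = 40, 10
H = rng.standard_normal((n_obs, n_src))
x_true = rng.random(n_src)            # "true" release rates, arbitrary units
y = H @ x_true + 0.1 * rng.standard_normal(n_obs)
xb = np.zeros(n_src)                  # background (first-guess) source

def innovation_loglik(b2, r2):
    # innovations d = y - H xb are N(0, H B H^T + R) with B = b2 I, R = r2 I
    S = b2 * H @ H.T + r2 * np.eye(n_obs)
    d = y - H @ xb
    _, logdet = np.linalg.slogdet(S)
    return -0.5 * (logdet + d @ np.linalg.solve(S, d))

# maximum-likelihood grid search for the prior variances: the "objective
# assessment of prior errors"; with several data sets there is one r2 per set
grid = np.logspace(-3.0, 1.0, 25)
_, b2, r2 = max((innovation_loglik(b, r), b, r) for b in grid for r in grid)

# Gaussian (BLUE) posterior mean of the source with the estimated variances
S = b2 * H @ H.T + r2 * np.eye(n_obs)
x_hat = xb + b2 * H.T @ np.linalg.solve(S, y - H @ xb)
```

The key point mirrored from the abstract: the error variances are estimated from the data themselves (here by maximizing the innovation likelihood) before the inversion, rather than being fixed by hand.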
A variational framework for flow optimization using semi-norm constraints
When considering a general system of equations describing the space-time
evolution (flow) of one or several variables, optimizing over a finite period
of time a measure of the state variable at the final time is of great interest
in many fields. Methods already exist to solve this kind of optimization
problem, but they can fail when the constraint bounding the state vector at the
initial time is not a norm, meaning that some part of the state vector remains
unbounded and might cause the optimization procedure to diverge. To regularize
this problem, we propose a general method that extends the existing
optimization framework in a self-consistent manner. We first derive this
framework extension and then apply it to a problem of interest: the transient
stability properties of a one-dimensional (in space) averaged turbulent model
with a space- and time-dependent model "turbulent viscosity". We believe this
work has many potential applications in fluid dynamics for problems in which
one wants to control the influence of separate components of the state vector
in the optimization process.
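The failure mode and the effect of regularization can be illustrated on a linear toy problem. The matrices, the diagonal semi-norm, and the regularized generalized eigenvalue formulation below are illustrative assumptions, not the paper's actual framework extension.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(2)
n = 6
A = rng.standard_normal((n, n)) / np.sqrt(n)   # linear propagator: x(T) = A x(0)

# semi-norm on x(0): only the first k components are constrained
k = 4
W = np.zeros((n, n))
W[:k, :k] = np.eye(k)        # x^T W x vanishes on the last n - k components

def max_gain(A, eps):
    # largest generalized eigenvalue of A^T A v = lambda (W + eps I) v:
    # the optimal final-time energy per unit of (regularized) semi-norm
    vals, _ = eigh(A.T @ A, W + eps * np.eye(n))
    return vals[-1]

# generic propagator: the unconstrained components feed the objective,
# so the gain blows up as the regularization is removed (the divergence
# the abstract describes)
g1, g2 = max_gain(A, 1e-2), max_gain(A, 1e-4)

# if the propagator annihilates the unconstrained components, the
# semi-norm problem is well posed and the gain is insensitive to eps
A0 = A.copy()
A0[:, k:] = 0.0
h1, h2 = max_gain(A0, 1e-2), max_gain(A0, 1e-4)
```

The contrast between the two cases is the whole point: a semi-norm constraint is only meaningful once the influence of the unbounded directions on the objective is controlled, which is what a self-consistent extension of the framework must guarantee.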
The Value of Seasonal Productivity Forecasting in Biodiesel Plans
Crop productivity is commonly treated as a deterministic function when developing
agricultural plans. Actual data show, however, that even for the same soil at the same location, crop
productivity is better interpreted as a random variable, owing to the meteorological conditions of the
specific year. For the production of biodiesel, crops are easily substitutable and the farmer can choose
every year among various alternatives. Without information on the seasonal meteorology,
farmers select the crop to cultivate mainly on the basis of its expected productivity. However,
changes in the meteorological situation may reduce crop profitability, so a crop that is on average
less attractive may become the best choice in a specific year. Given that
seasonal forecasts based on long-range climatic variables, such as ENSO, are becoming available,
this paper examines their effectiveness in biodiesel production plans, with reference to an area in Mato Grosso, Brazil. We formulate and solve a mathematical programming problem to determine the most efficient crop plan under different scenarios: (i) no information about the seasonal meteorology, (ii) perfect information, and (iii) meteorological forecasts of varying precision. This allows us to
quantitatively evaluate how valuable seasonal productivity forecasting might be, and
shows that even a rough seasonal forecast, if systematically applied, may improve average
production and reduce the risk of traditional agricultural decisions.
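The three information scenarios can be made concrete with a deliberately tiny decision problem. The crops, weather states, probabilities, and profit numbers below are invented for illustration; the paper's model is a full mathematical program over real agronomic data.

```python
import numpy as np

# profit (per hectare, arbitrary units) of each crop in each weather state
#                 state:  wet     dry      (hypothetical numbers)
profit = np.array([[120.0,  60.0],        # crop A
                   [ 80.0, 100.0]])       # crop B
p_state = np.array([0.5, 0.5])            # climatological state probabilities

# (i) no information: plant the crop with the best climatological mean profit
v_noinfo = (profit @ p_state).max()

# (ii) perfect information: pick the best crop in each state, average over states
v_perfect = (profit.max(axis=0) * p_state).sum()

# (iii) a forecast that is right with probability q: Bayes decision per forecast
def v_forecast(q):
    v = 0.0
    for f in range(2):                         # forecasted state
        like = np.array([q if s == f else 1 - q for s in range(2)])
        p_f = (like * p_state).sum()           # P(forecast = f)
        post = like * p_state / p_f            # P(state | forecast)
        v += p_f * (profit @ post).max()       # plant the Bayes-optimal crop
    return v
```

With these numbers, v_noinfo = 90, v_perfect = 110, and an 80%-accurate forecast yields 102: even an imperfect forecast, applied systematically, recovers most of the gap between no information and perfect information, which is exactly the abstract's conclusion.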
Beyond Gaussian Statistical Modeling in Geophysical Data Assimilation
This review discusses recent advances in geophysical data assimilation beyond Gaussian statistical modeling, in the fields of meteorology, oceanography, and atmospheric chemistry. The non-Gaussian features are stressed rather than the nonlinearity of the dynamical models, although both aspects are entangled. Ideas recently proposed to deal with these non-Gaussian issues, in order to improve the state or parameter estimation, are emphasized. The general Bayesian solution to the estimation problem and the techniques to solve it are first presented, as well as the obstacles that hinder their use in high-dimensional and complex systems. Approximations to the Bayesian solution relying on Gaussian assumptions, or on second-order moment closure, have been widely adopted in geophysical data assimilation (e.g., Kalman filters and quadratic variational solutions). Yet nonlinear and non-Gaussian effects remain; they originate essentially in the nonlinear models and in the non-Gaussian priors. How these effects are handled within algorithms based on Gaussian assumptions is then described, and statistical tools that can diagnose them and measure deviations from Gaussianity are recalled. The following advanced techniques that seek to handle the estimation problem beyond Gaussianity are reviewed: the maximum entropy filter, Gaussian anamorphosis, non-Gaussian priors, the particle filter with an ensemble Kalman filter as a proposal distribution, maximum entropy on the mean, and strictly Bayesian inferences for large linear models. Several of these ideas are illustrated with recent or original examples that possess some features of high-dimensional systems. Many of the new approaches are well understood only in special cases and have difficulties that remain to be circumvented; some are quite promising, and already successful for moderately large, though specific, geophysical applications. Hints are given as to where progress might come from.
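One of the reviewed techniques, Gaussian anamorphosis, is easy to sketch: map a non-Gaussian variable to a Gaussian one through its (empirical) CDF, assimilate in the transformed space under Gaussian assumptions, then map back. Below is a minimal rank-based version; the lognormal sample and the rank construction are illustrative choices, not a specific scheme from the review.

```python
import numpy as np
from scipy.stats import norm, skew

rng = np.random.default_rng(3)

# a positively skewed, non-Gaussian variable (e.g. a tracer concentration)
x = rng.lognormal(mean=0.0, sigma=1.0, size=5000)

def anamorphosis(x):
    # empirical Gaussian anamorphosis: push ranks through the inverse normal CDF
    ranks = x.argsort().argsort() + 1        # 1..n
    u = ranks / (len(x) + 1.0)               # empirical CDF values in (0, 1)
    return norm.ppf(u)

z = anamorphosis(x)                          # approximately standard normal
```

The transform is monotone, so it preserves the ordering of the original values; an analysis increment computed on z with Gaussian machinery is mapped back with the inverse empirical transform.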