Towards Handling Uncertainty in Prognostic Scenarios: Advanced Learning from the Past
In this report we introduce the paradigm of learning from the past, realized in a controlled prognostic context. It is a data-driven exploratory approach to assessing the limits to the credibility of any expectations about the system's future behavior that are based on a time series of historical observations of the analyzed system. This horizon of credible expectations is derived as the length of the explainable outreach of the data, that is, the spatio-temporal extent which, in light of the knowledge contained in the historical observations, we are justified in believing contains the system's future observations. The explainable outreach is of practical interest to stakeholders because it allows them to assess the credibility of scenarios produced by models of the analyzed system. It also indicates the scale of measures required to overcome the system's inertia. In this report we propose a method of learning in a controlled prognostic context that is based on a polynomial regression technique. A polynomial regression model is used to capture the system's dynamics, as revealed by the sample of historical observations, while the explainable outreach is constructed around the extrapolated regression function. The proposed learning method was tested on various sets of synthetic data in order to identify its strengths and weaknesses and to formulate guidelines for its practical application. We also demonstrate how it can be used in the context of earth system sciences by deriving the explainable outreach of historical anthropogenic CO2 emissions and atmospheric CO2 concentrations. We conclude that the most robust method of building the explainable outreach is based on linear regression. However, the explainable outreach of the analyzed datasets (representing credible expectations based on extrapolation of the linear trend) is rather short.
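As a concrete illustration, here is a minimal sketch of how such an outreach could be built, assuming an ordinary least-squares polynomial fit and a classical prediction band around its extrapolation; the function name, the fixed horizon, and the band level are illustrative assumptions, not the report's exact procedure:

```python
# Sketch only: fit a polynomial trend and extrapolate it with a
# (1 - alpha) prediction band; the band over the forecast horizon
# stands in for the "explainable outreach".
import numpy as np
from scipy import stats

def polynomial_outreach(t, y, degree=1, horizon=20, alpha=0.05):
    """t, y: 1-D arrays (e.g. years and CO2 concentrations)."""
    X = np.vander(t, degree + 1)                    # design matrix
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)    # OLS fit
    resid = y - X @ coef
    dof = len(t) - (degree + 1)
    s2 = resid @ resid / dof                        # residual variance
    XtX_inv = np.linalg.inv(X.T @ X)

    t_new = np.arange(t[-1] + 1, t[-1] + 1 + horizon)
    X_new = np.vander(t_new, degree + 1)
    y_hat = X_new @ coef                            # extrapolated trend
    # prediction variance: s2 * (1 + x0' (X'X)^{-1} x0) for each new point
    pred_var = s2 * (1.0 + np.einsum('ij,jk,ik->i', X_new, XtX_inv, X_new))
    half = stats.t.ppf(1 - alpha / 2, dof) * np.sqrt(pred_var)
    return t_new, y_hat, y_hat - half, y_hat + half
```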
Towards Handling Uncertainty in Prognostic Scenarios: Advanced Learning from the Past
The research programme "Earth System Sciences (ESS)", a programme of the Federal Ministry of Science, Research and Economy (BMWFW) carried out by the ÖAW (Austrian Academy of Sciences), aims at investigating the Earth system. Through calls for proposals, it funds scientific research projects that correspond to the current state of the art. The ESS programme sees it as its task to close gaps in the Austrian funding landscape. This refers, for instance, to interdisciplinary projects, projects on long-term research, and projects that focus on areas that are as yet little researched and to which scientific …
Learning in greenhouse gas emission inventories in terms of uncertainty improvement over time
This paper addresses the problem of learning in greenhouse gas (GHG) emission inventories, understood as reductions in uncertainty, i.e., inaccuracy and/or imprecision, over time. We analyze the National Inventory Reports (NIRs) submitted annually to the United Nations Framework Convention on Climate Change. Each NIR contains data on the GHG emissions in a given country for a given year as well as revisions of past years' estimates. We arrange the revisions, i.e., the estimates of historical emissions published in consecutive NIRs, into a table, so that each column contains revised estimates of emissions for the same year, reflecting different realizations of uncertainty. We propose two variants of a two-step procedure to investigate the changes in uncertainty over time. In step 1, we assess changes in inaccuracy, which we consider constant within each revision, either by detrending the revisions using a smoothing spline fitted to the most recent revision (method 1) or by taking differences between the most recent revision and the previous ones (method 2). Step 2 estimates the imprecision by analyzing the columns of the data table. We assess learning by detecting and modeling a decreasing trend in inaccuracy and/or imprecision. We analyze carbon dioxide (CO2) emission inventories for the European Union (EU-15) as a whole and for its individual member countries. Our findings indicate that although there is still room for improvement, continued efforts to improve accounting methodology lead to a reduction of the uncertainty of emission estimates reported in NIRs, which is of key importance for monitoring the realization of countries' emission reduction commitments.
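A minimal sketch of method 2 of the two-step procedure, under simplifying assumptions (the array layout and the use of a per-revision mean as the inaccuracy level are illustrative, not the paper's exact estimators):

```python
# Sketch only: `revisions` is a 2-D array with one row per NIR submission
# and one column per inventory year (NaN where a year is not reported).
import numpy as np

def uncertainty_components(revisions):
    latest = revisions[-1]                     # most recent revision as reference
    diffs = revisions - latest                 # step 1: differences (method 2);
                                               # the last row is zero by construction
    inaccuracy = np.nanmean(diffs, axis=1)     # one inaccuracy level per revision
    centered = diffs - inaccuracy[:, None]
    imprecision = np.nanstd(centered, axis=0)  # step 2: column-wise spread per year
    return inaccuracy, imprecision
```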
Taking advantage of the UNFCCC Kyoto Policy Process: What can we learn about learning?
Learning is difficult to anticipate when it happens instantaneously, e.g. in the context of innovations [2]. However, even if learning is anticipated to happen continuously, it is difficult to grasp, e.g. when it occurs outside well-defined lab conditions because adequate monitoring has not been put in place.
Our study is retrospective. It focuses on the emissions of greenhouse gases (GHGs) that have been reported by countries (Parties) under the Kyoto Protocol (KP) to the United Nations Framework Convention on Climate Change (UNFCCC). Discussions range widely on (i) whether the KP is considered a failure [6] or a success [5]; and (ii) whether international climate policy should transition from a centralized model of governance to a 'hybrid' decentralized approach that combines country-level mitigation pledges with common principles for accounting and monitoring [1].
Emissions of GHGs - in the following we refer to CO2 emissions from burning fossil fuels at the country level, particularly in the case of Austria - provide a perfect means to study learning in a globally relevant context. We are not aware of a comparable treasure of data of global relevance. Our mode of grasping learning is novel: it may have been referred to in general terms but, to the best of our knowledge, it has not been quantified so far. (That is, we consider the KP potentially a success story and advocate the hybrid decentralized approach.)
Learning requires 'measuring' differences or deviations. Here we follow Marland et al. [3], who discuss this issue in the context of emissions accounting:
'Many of the countries and organizations that make estimates of CO2 emissions provide annual updates in which they add another year of data to the time series and revise the estimates for earlier years. Revisions may reflect revised or more complete energy data and ... more complete and detailed understanding of the emissions processes and emissions coefficients. In short, we expect revisions to reflect learning and a convergence toward more complete and accurate estimates.'
The United Nations Framework Convention on Climate Change (UNFCCC) requires exactly this to be done. Each year, UNFCCC signatory countries are obliged to provide an annual inventory of emissions (and removals) of specified GHGs from five sectors (energy; industrial processes and product use; agriculture; land use, land-use change and forestry; and waste) and to revisit the emissions (and removals) for all previous years, back to the country-specified base years (or periods). These data are made available by means of a database [4].
The time series of revised emission estimates reflect learning, but they are 'contaminated' by (i) structural change (e.g., when a coal-fired power plant is substituted by a gas-fired power plant); (ii) changes in consumption; and, rarely but possibly, (iii) methodological changes in surveying emission-related activities. De-trending the time series of revised emission estimates allows this contamination to be removed country by country, for which we provide three approaches (sketched in code below): (I) a parametric approach employing a polynomial trend; (II) a non-parametric approach employing smoothing splines; and (III) an approach in which the most recent estimate is used as the trend. That is, after de-trending, for each year we are left with a set of revisions that reflect 'pure' (uncontaminated) learning, which is expected to be independent of the year under consideration (i.e., identical from year to year).
However, we are confronted with two non-negligible problems (P): (P.1) the problem of small numbers - the remaining differences in emissions are small (before and after de-trending); and (P.2) the problem of non-monotonic learning - our knowledge of emission-generating activities and emission factors may not become more accurate from revision to revision.
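The three de-trending approaches might be sketched as follows (function names and the spline's smoothing parameter are assumptions; the study's actual settings are not specified here):

```python
# Sketch only: three ways to de-trend a time series of emission estimates.
import numpy as np
from scipy.interpolate import UnivariateSpline

def detrend_polynomial(years, estimates, degree=2):     # approach (I)
    coef = np.polyfit(years, estimates, degree)
    return estimates - np.polyval(coef, years)

def detrend_spline(years, estimates, smoothing=None):   # approach (II)
    spline = UnivariateSpline(years, estimates, s=smoothing)
    return estimates - spline(years)

def detrend_latest(revision, latest_revision):          # approach (III)
    return revision - latest_revision                   # most recent estimate as trend
```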
Multivariate kernel density estimation with a parametric support
We consider kernel density estimation in the multivariate case, focusing on the use of some elements of parametric estimation. We present a two-step method, based on a modification of the EM algorithm and the generalized kernel density estimator, and compare this method with several well-known multivariate kernel density estimation methods.
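The abstract does not spell out the construction; one plausible reading, sketched below under that assumption, is a parametric-start estimator in the spirit of Hjort and Glad: step 1 fits a parametric model (here a Gaussian mixture) by EM, step 2 applies a kernel-smoothed multiplicative correction.

```python
# Sketch only, assuming a Hjort-Glad-style parametric start; this is a
# guess at the construction, not the paper's exact estimator.
import numpy as np
from scipy.stats import gaussian_kde
from sklearn.mixture import GaussianMixture

def parametric_start_kde(X, x_eval, n_components=2):
    """X: (n, d) sample; x_eval: (m, d) evaluation points."""
    gm = GaussianMixture(n_components=n_components).fit(X)  # step 1: EM fit
    f_par_data = np.exp(gm.score_samples(X))                # f_theta at the data
    f_par_eval = np.exp(gm.score_samples(x_eval))           # f_theta at x_eval
    # step 2: kernel-smoothed correction r(x) ~ mean of K_h(x - X_i) / f_theta(X_i);
    # scipy renormalizes the weights, so the result matches the Hjort-Glad
    # form up to the sample mean of 1 / f_theta.
    kde = gaussian_kde(X.T, weights=1.0 / f_par_data)
    return f_par_eval * kde(x_eval.T)
```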
Estimation of Means in a Bivariate Discrete-Time Process
We consider a discrete-time non-stationary stochastic process that is a sum of two other processes. Given a data matrix of its realizations, we aim to estimate and then analyze the mean values of the component processes as functions of time. Both the existence and the uniqueness of a solution to this problem are investigated. An algorithm for estimating the mean values is proposed. The method is applied to analyze the uncertainty in the National Inventory Reports (NIRs) on greenhouse gas (GHG) emissions, provided annually by signatories to the UNFCCC and its Kyoto Protocol. Each report contains data on GHG emissions for a given year and revisions of past data, recalculated owing to improved knowledge and methodology. However, reporting also has to deal with the uncertainty that is present whenever GHG emissions are quantified. The proposed method can be used as an attempt to track and reduce inaccuracy and imprecision in processing the raw data over time. The results are presented for Poland and a few selected EU-15 countries.
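As a toy illustration of the estimation task (not the paper's algorithm, whose model and identifiability conditions are more involved), one can read each table entry as a year-specific mean plus a revision-specific offset and solve the balanced two-way layout by least squares:

```python
# Sketch only: additive model e[r, t] = mu[t] + beta[r] + noise for a
# matrix of revised estimates, solved by the classical two-way means.
import numpy as np

def two_way_means(E):
    """E: (n_revisions, n_years) matrix of revised estimates."""
    beta = E.mean(axis=1) - E.mean()   # revision effects, centered to sum to 0
    mu = E.mean(axis=0)                # per-year means under that constraint
    return mu, beta                    # fitted value: mu[t] + beta[r]
```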