Sequential statistical decision making in data flow analysis problems
In data flow analysis, problems of statistical decision making about the parameters of observed data flows are important. To solve them, the use of sequential statistical decision rules is proposed. The rules are constructed for three models of observation flows: a sequence of independent homogeneous observations; a sequence of observations forming a time series with a trend; and a sequence of dependent observations forming a homogeneous Markov chain. For each case, the situation where the model describes the observed stochastic data with distortion is also considered. "Outliers" ("contamination") are used as the admissible distortions, as they adequately describe most situations arising in practice. For such situations, families of sequential decision rules are proposed, and robust decision rules are constructed that reduce the influence of the distortions on the efficiency characteristics. Results of computer experiments are given to illustrate the constructed decision rules.
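The abstract does not spell out the test statistics, but sequential decision rules of this kind are typically built around Wald's sequential probability ratio test (SPRT). The following minimal Python sketch illustrates the idea for the simplest case of independent homogeneous Gaussian observations; the hypotheses, error levels, and data are illustrative assumptions rather than the authors' construction.

```python
import numpy as np
from scipy.stats import norm

def sprt_gaussian(observations, mu0, mu1, sigma, alpha=0.05, beta=0.05):
    """Wald's SPRT for H0: mu = mu0 versus H1: mu = mu1.

    Returns the decision ('H0', 'H1' or 'continue') and the number of
    observations consumed.  Stopping boundaries use Wald's approximations.
    """
    a = np.log(beta / (1.0 - alpha))        # lower stopping boundary
    b = np.log((1.0 - beta) / alpha)        # upper stopping boundary
    llr = 0.0                               # cumulative log-likelihood ratio
    for n, x in enumerate(observations, start=1):
        llr += norm.logpdf(x, mu1, sigma) - norm.logpdf(x, mu0, sigma)
        if llr <= a:
            return "H0", n
        if llr >= b:
            return "H1", n
    return "continue", len(observations)

# Illustrative data stream actually generated under H1.
rng = np.random.default_rng(0)
data = rng.normal(loc=0.5, scale=1.0, size=1000)
print(sprt_gaussian(data, mu0=0.0, mu1=0.5, sigma=1.0))
```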
Novel biomaterials: plasma-enabled nanostructures and functions
Material processing techniques utilizing low-temperature plasmas as the main process tool feature many unique capabilities for the fabrication of various nanostructured materials. Compared with neutral-gas-based techniques and methods, plasma-based approaches offer higher levels of energy and flux controllability, often leading to higher quality of the fabricated nanomaterials and sometimes to the synthesis of hierarchical materials with interesting properties. Among these, nanoscale biomaterials attract significant attention due to their special properties with respect to biological materials (proteins, enzymes), living cells and tissues. This review briefly examines various approaches based on the use of low-temperature plasma environments to fabricate nanoscale biomaterials exhibiting high biological activity, biological inertness for drug delivery systems, and other features that make these biomaterials highly attractive. In particular, we briefly discuss the plasma-assisted fabrication of gold and silicon nanoparticles for bio-applications; carbon nanoparticles for bioimaging and cancer therapy; carbon nanotube-based platforms for enzyme production and bacteria growth control; and other applications of low-temperature plasmas in the production of biologically active materials.
Universal behavior of extreme value statistics for selected observables of dynamical systems
The main results of the extreme value theory developed for the investigation of the observables of dynamical systems rely, up to now, on the Gnedenko approach. In this framework, extremes are basically identified with the block maxima of the time series of the chosen observable, in the limit of infinitely long blocks. It has been proved that, assuming suitable mixing conditions for the underlying dynamical systems, the extremes of a specific class of observables are distributed according to the so-called Generalized Extreme Value (GEV) distribution. Direct calculations show that in the case of quasi-periodic dynamics the block maxima are not distributed according to the GEV distribution. In this paper we show that, in order to obtain a universal behaviour of the extremes, the requirement of mixing dynamics can be relaxed if the Pareto approach is used, based upon considering the exceedances over a given threshold. Requiring that the invariant measure locally scales with a well-defined exponent (the local dimension), we show that the limiting distribution for the exceedances of the observables previously studied with the Gnedenko approach is a Generalized Pareto distribution whose parameters depend only on the local dimension and the value of the threshold. This result allows us to extend the extreme value theory for dynamical systems to the case of regular motions. We also provide connections with the results obtained with the Gnedenko approach. In order to provide further support to our findings, we present the results of numerical experiments carried out considering the well-known Chirikov standard map. Comment: 7 pages, 1 figure
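As a rough illustration of the Pareto (peaks-over-threshold) approach described above, the sketch below iterates the Chirikov standard map, evaluates a distance-based observable, and fits a Generalized Pareto distribution to its exceedances over a high threshold. The observable g = -log(dist), the parameter K, the reference point, and the 99th-percentile threshold are illustrative assumptions; for this logarithmic observable the local-scaling argument predicts an approximately exponential tail, i.e. a GPD shape parameter close to zero.

```python
import numpy as np
from scipy.stats import genpareto

def standard_map_orbit(theta0, p0, K, n):
    """Iterate the Chirikov standard map and return the orbit (theta, p)."""
    thetas, ps = np.empty(n), np.empty(n)
    theta, p = theta0, p0
    for i in range(n):
        p = (p + K * np.sin(theta)) % (2 * np.pi)
        theta = (theta + p) % (2 * np.pi)
        thetas[i], ps[i] = theta, p
    return thetas, ps

# Orbit and a distance-based observable g = -log(dist to a reference point).
thetas, ps = standard_map_orbit(0.1, 0.2, K=6.5, n=100_000)
ref_theta, ref_p = 3.0, 3.0
dist = np.hypot(thetas - ref_theta, ps - ref_p)
observable = -np.log(dist)

# Peaks over threshold: keep exceedances above a high empirical quantile
# and fit a Generalized Pareto distribution to the excesses.
threshold = np.quantile(observable, 0.99)
excesses = observable[observable > threshold] - threshold
shape, loc, scale = genpareto.fit(excesses, floc=0.0)
print(f"GPD shape={shape:.3f}, scale={scale:.3f}, threshold={threshold:.3f}")
```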
Development of New Ensemble Methods Based on the Performance Skills of Regional Climate Models over South Korea
In this paper, the prediction skills of five ensemble methods for temperature and precipitation are discussed by considering 20 yr of simulation results (from 1989 to 2008) for four regional climate models (RCMs) driven by NCEP-Department of Energy and ECMWF Interim Re-Analysis (ERA-Interim) boundary conditions. The simulation domain is the Coordinated Regional Downscaling Experiment (CORDEX) domain for East Asia, and the number of grid points is 197 x 233 with a 50-km horizontal resolution. Three new performance-based ensemble averaging (PEA) methods are developed in this study using 1) bias, root-mean-square error (RMSE), and absolute correlation (PEA_BRC); 2) RMSE and absolute correlation (PEA_RAC); and 3) RMSE and original correlation (PEA_ROC). The other two ensemble methods are equal-weighted averaging (EWA) and multivariate linear regression (Mul_Reg). To derive the weighting coefficients and cross validate the prediction skills of the five ensemble methods, the authors considered 15-yr and 5-yr data, respectively, from the 20-yr simulation data. Among the five ensemble methods, the Mul_Reg (EWA) method shows the best (worst) skill during the training period. The PEA_RAC and PEA_ROC methods show skills that are similar to those of Mul_Reg during the training period. However, the skill and stability of Mul_Reg were drastically reduced when this method was applied to the prediction period, whereas those of PEA_RAC were only slightly reduced. As a result, PEA_RAC shows the best skill, irrespective of season and variable, during the prediction period. This result confirms that the new ensemble method developed in this study, PEA_RAC, can be used for the prediction of regional climate.
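The abstract does not give the exact weighting formula, so the sketch below only illustrates the general idea of performance-based ensemble averaging in the spirit of PEA_RAC: each model receives a weight that grows with its absolute correlation and shrinks with its RMSE over the training period. The specific formula |r|/RMSE (normalized to sum to one), the toy data, and the function name are assumptions for illustration, not the paper's definition.

```python
import numpy as np

def pea_rac_like_weights(simulations, observations):
    """Illustrative performance-based weights from RMSE and |correlation|.

    simulations: array of shape (n_models, n_times)
    observations: array of shape (n_times,)
    The weighting formula (|r| / RMSE, normalized) is an assumption.
    """
    rmse = np.sqrt(np.mean((simulations - observations) ** 2, axis=1))
    corr = np.array([np.corrcoef(sim, observations)[0, 1] for sim in simulations])
    raw = np.abs(corr) / rmse
    return raw / raw.sum()

# Toy example: three "models" of a common signal during a training period.
rng = np.random.default_rng(1)
obs = np.sin(np.linspace(0, 4 * np.pi, 180))
sims = np.stack([obs + rng.normal(0, s, obs.size) for s in (0.2, 0.5, 1.0)])
w = pea_rac_like_weights(sims, obs)
ensemble_mean = w @ sims          # weighted ensemble average
print("weights:", np.round(w, 3))
```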
Very extreme seasonal precipitation in the NARCCAP ensemble: model performance and projections
Seasonal extreme daily precipitation is analyzed in the ensemble of NARCCAP regional climate models. Significant variation in these models' abilities to reproduce observed precipitation extremes over the contiguous United States is found. Model performance metrics are introduced to characterize overall biases, seasonality, spatial extent, and the shape of the precipitation distribution. Comparison of the models to gridded observations that include an elevation correction is found to be better than to gridded observations without this correction. A complicated model weighting scheme based on model performance in simulating observations is found to cause significant improvements in ensemble mean skill only if some of the models are poorly performing outliers. The effect of lateral boundary conditions is explored by comparing the integrations driven by reanalysis to those driven by global climate models. Projected mid-century changes in seasonal precipitation means and extremes are presented, along with discussions of the sources of uncertainty and the mechanisms causing these changes. © 2012 The Author(s)
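As a small, self-contained illustration of how seasonal daily precipitation extremes are typically characterized, the sketch below fits a Generalized Extreme Value distribution to seasonal block maxima and derives a return level. The synthetic gamma-distributed daily precipitation and the 20-year return period are assumptions for illustration, not NARCCAP output.

```python
import numpy as np
from scipy.stats import genextreme

# Synthetic stand-in for daily precipitation at one grid point (mm/day);
# the gamma parameters are illustrative only.
rng = np.random.default_rng(2)
years, season_length = 30, 90
daily_precip = rng.gamma(shape=0.8, scale=6.0, size=(years, season_length))

# Seasonal block maxima: the wettest day of each season, one value per year.
block_maxima = daily_precip.max(axis=1)

# Fit a GEV distribution and report a 20-yr return level.
# Note: SciPy's shape parameter c has the opposite sign of the usual GEV xi.
c, loc, scale = genextreme.fit(block_maxima)
return_level_20yr = genextreme.ppf(1.0 - 1.0 / 20.0, c, loc=loc, scale=scale)
print(f"GEV shape={c:.3f}, 20-yr return level={return_level_20yr:.1f} mm/day")
```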
A verification framework for interannual-to-decadal predictions experiments
Decadal predictions have a high profile in the climate science community and beyond, yet very little is known about their skill. Nor is there any agreed protocol for estimating their skill. This paper proposes a sound and coordinated framework for verification of decadal hindcast experiments. The framework is illustrated for decadal hindcasts tailored to meet the requirements and specifications of CMIP5 (Coupled Model Intercomparison Project phase 5). The chosen metrics address key questions about the information content in initialized decadal hindcasts. These questions are: (1) Do the initial conditions in the hindcasts lead to more accurate predictions of the climate, compared to un-initialized climate change projections? and (2) Is the prediction model’s ensemble spread an appropriate representation of forecast uncertainty on average? The first question is addressed through deterministic metrics that compare the initialized and uninitialized hindcasts. The second question is addressed through a probabilistic metric applied to the initialized hindcasts and comparing different ways to ascribe forecast uncertainty. Verification is advocated at smoothed regional scales that can illuminate broad areas of predictability, as well as at the grid scale, since many users of the decadal prediction experiments who feed the climate data into applications or decision models will use the data at grid scale, or downscale it to even higher resolution. An overall statement on skill of CMIP5 decadal hindcasts is not the aim of this paper. The results presented are only illustrative of the framework, which would enable such studies. However, broad conclusions that are beginning to emerge from the CMIP5 results include (1) Most predictability at the interannual-to-decadal scale, relative to climatological averages, comes from external forcing, particularly for temperature; (2) though moderate, additional skill is added by the initial conditions over what is imparted by external forcing alone; however, the impact of initialization may result in overall worse predictions in some regions than provided by uninitialized climate change projections; (3) limited hindcast records and the dearth of climate-quality observational data impede our ability to quantify expected skill as well as model biases; and (4) as is common to seasonal-to-interannual model predictions, the spread of the ensemble members is not necessarily a good representation of forecast uncertainty. The authors recommend that this framework be adopted to serve as a starting point to compare prediction quality across prediction systems. The framework can provide a baseline against which future improvements can be quantified. The framework also provides guidance on the use of these model predictions, which differ in fundamental ways from the climate change projections that much of the community has become familiar with, including adjustment of mean and conditional biases, and consideration of how to best approach forecast uncertainty
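A common deterministic metric for question (1), comparing initialized hindcasts against uninitialized projections, is a mean squared skill score with the uninitialized run as reference. The sketch below shows that computation on synthetic series; the data, array shapes, and function name are illustrative assumptions rather than the paper's exact protocol.

```python
import numpy as np

def msss(forecast, reference, observations):
    """Mean squared skill score of `forecast` relative to `reference`.

    Positive values mean the initialized hindcast has lower mean squared
    error than the uninitialized reference against the same observations.
    """
    mse_f = np.mean((forecast - observations) ** 2)
    mse_r = np.mean((reference - observations) ** 2)
    return 1.0 - mse_f / mse_r

# Synthetic example: 10 start dates, initialized hindcast tracks obs better.
rng = np.random.default_rng(3)
obs = rng.normal(size=10)
initialized = obs + rng.normal(scale=0.3, size=10)
uninitialized = rng.normal(scale=1.0, size=10)
print(f"MSSS = {msss(initialized, uninitialized, obs):.2f}")
```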
How to create an operational multi-model of seasonal forecasts?
Seasonal forecasts of variables like near-surface temperature or precipitation are becoming increasingly important for a wide range of stakeholders. Due to the many possibilities for recalibrating, combining, and verifying ensemble forecasts, there is ambiguity about which methods are most suitable. To address this, we compare approaches for processing and verifying multi-model seasonal forecasts based on a scientific assessment performed within the framework of the EU Copernicus Climate Change Service (C3S) Quality Assurance for Multi-model Seasonal Forecast Products (QA4Seas) contract C3S 51 lot 3. Our results underpin the importance of processing raw ensemble forecasts differently depending on the final forecast product needed. While ensemble forecasts benefit substantially from bias correction using climate conserving recalibration, this is not the case for the intrinsically bias-adjusted multi-category probability forecasts. The same applies to multi-model combination. In this paper, we apply simple but effective approaches for multi-model combination of both forecast formats. Further, based on existing literature, we recommend using proper scoring rules, such as a sample version of the continuous ranked probability score and the ranked probability score, for the verification of ensemble forecasts and multi-category probability forecasts, respectively. For a detailed global visualization of calibration as well as bias and dispersion errors, the Chi-square decomposition of rank histograms proved to be appropriate for the analysis performed within QA4Seas. The research leading to these results is part of the Copernicus Climate Change Service (C3S) (Framework Agreement number C3S_51_Lot3_BSC), a program being implemented by the European Centre for Medium-Range Weather Forecasts (ECMWF) on behalf of the European Commission. Francisco Doblas-Reyes acknowledges the support of the H2020 EUCP project (GA 776613) and the MINECO-funded CLINSA project (CGL2017-85791-R).
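For reference, the sample version of the continuous ranked probability score mentioned above can be computed directly from the ensemble members using the kernel form CRPS = E|X - y| - 0.5 E|X - X'|. The sketch below is a minimal illustration for a single scalar observation; the toy ensemble values are assumptions.

```python
import numpy as np

def sample_crps(ensemble, obs):
    """Sample CRPS of an ensemble forecast for one scalar observation.

    Uses the kernel form CRPS = E|X - y| - 0.5 * E|X - X'|,
    estimated from the ensemble members.
    """
    ensemble = np.asarray(ensemble, dtype=float)
    term1 = np.mean(np.abs(ensemble - obs))
    term2 = 0.5 * np.mean(np.abs(ensemble[:, None] - ensemble[None, :]))
    return term1 - term2

# Toy multi-model ensemble of a seasonal temperature anomaly (values assumed).
members = np.array([0.1, 0.4, -0.2, 0.3, 0.0, 0.5])
print(f"CRPS = {sample_crps(members, obs=0.2):.3f}")
```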
Decadal predictions with the HiGEM high resolution global coupled climate model: description and basic evaluation
This paper describes the development and basic evaluation of decadal predictions produced using the HiGEM coupled climate model. HiGEM is a higher resolution version of the HadGEM1 Met Office Unified Model. The horizontal resolution in HiGEM has been increased to 1.25° × 0.83° in longitude and latitude for the atmosphere, and 1/3° × 1/3° globally for the ocean. The HiGEM decadal predictions are initialised using an anomaly assimilation scheme that relaxes anomalies of ocean temperature and salinity to observed anomalies. Ten-year hindcasts are produced for 10 start dates (1960, 1965,..., 2000, 2005).
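A minimal sketch of the relaxation step behind such an anomaly assimilation scheme is given below: the model state is nudged toward its own climatology plus the observed anomaly, so only the anomaly is constrained and the model's mean state is left alone. The time step, relaxation time scale, and toy temperature profile are illustrative assumptions, not the HiGEM configuration.

```python
import numpy as np

def anomaly_relaxation_step(model_field, model_clim, obs_anom, dt, tau):
    """One relaxation (nudging) step of an anomaly assimilation scheme.

    The model field is pulled toward the model climatology plus the
    observed anomaly, so the model's mean state (and its biases) is
    left untouched while the anomaly is constrained.
    """
    target = model_clim + obs_anom
    return model_field + (dt / tau) * (target - model_field)

# Toy 1-D "ocean temperature" profile (assumed numbers, degrees C).
model_clim = np.array([18.0, 12.0, 8.0, 5.0])
obs_anom = np.array([0.6, 0.4, 0.1, 0.0])
field = model_clim + np.array([-0.5, 0.2, 0.3, -0.1])   # current model state
for _ in range(10):
    field = anomaly_relaxation_step(field, model_clim, obs_anom,
                                    dt=86_400.0, tau=10 * 86_400.0)
print(np.round(field - model_clim, 3))    # anomalies drift toward obs_anom
```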
To determine the relative contributions to prediction skill from initial conditions and external forcing, the HiGEM decadal predictions are compared to uninitialised HiGEM transient experiments. The HiGEM decadal predictions have substantial skill for predictions of annual mean surface air temperature and 100 m upper ocean temperature.
For lead times up to 10 years, anomaly correlations (ACC) over large areas of the North Atlantic Ocean, the Western Pacific Ocean and the Indian Ocean exceed values of 0.6. Initialisation of the HiGEM decadal predictions significantly increases skill over regions of the Atlantic Ocean, the Maritime Continent and regions of the subtropical North and South Pacific Ocean. In particular, HiGEM produces skilful predictions of the North Atlantic subpolar gyre for up to 4 years lead time (with ACC > 0.7), with skill significantly higher than that of the uninitialised HiGEM transient experiments.
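For completeness, the anomaly correlation coefficient used above is simply the Pearson correlation between forecast and observed anomalies pooled over the verification sample; a minimal sketch, with toy hindcast anomalies as assumptions, is given below.

```python
import numpy as np

def anomaly_correlation(forecast_anom, obs_anom):
    """Anomaly correlation coefficient (ACC) between forecast and observed
    anomalies, pooled over start dates (and, in practice, grid points)."""
    f = forecast_anom - forecast_anom.mean()
    o = obs_anom - obs_anom.mean()
    return float(np.sum(f * o) / np.sqrt(np.sum(f ** 2) * np.sum(o ** 2)))

# Toy example: anomalies at a fixed lead time for 10 hindcast start dates.
rng = np.random.default_rng(4)
obs_anom = rng.normal(size=10)
fcst_anom = 0.8 * obs_anom + rng.normal(scale=0.4, size=10)
print(f"ACC = {anomaly_correlation(fcst_anom, obs_anom):.2f}")
```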