
    Second-order refined peaks-over-threshold modelling for heavy-tailed distributions

    Modelling excesses over a high threshold using the Pareto or generalized Pareto distribution (PD/GPD) is the most popular approach in extreme value statistics. This method typically requires a high threshold for the (G)PD to fit well, and in that case it applies only to a small upper fraction of the data. The extension of the (G)PD proposed in this paper can describe the excess distribution at lower thresholds in the case of heavy-tailed distributions. This yields a statistical model that can be fitted to a larger portion of the data. Moreover, estimates of the tail parameters remain stable over a larger range of thresholds. Our findings are supported by asymptotic results, simulations and a case study. Comment: to appear in the Journal of Statistical Planning and Inference.

    An overview of Portfolio Insurances: CPPI and CPDO

    Derivative instruments attempt to protect a portfolio against failure events. Constant proportion portfolio insurance (CPPI) and constant proportion debt obligation (CPDO) strategies are recent innovations and have only been adopted in the credit market in the last couple of years. Since their introduction, CPPI strategies have been popular because they provide protection while at the same time offering high yields. CPDOs were only introduced to the market in 2006 and can be considered a variation of the CPPI, the main difference being that CPDOs do not provide principal protection. Both CPPI and CPDO strategies take investment positions in a risk-free bond and a risky portfolio (often one or more credit default swaps). At each step the portfolio is rebalanced, and the level of risk taken depends on the distance between the current value of the portfolio and the amount needed to fulfil all future obligations. We first study the functioning of both products in depth and then draw some conclusions on their riskiness. JRC.G.9-Econometrics and statistical support to antifraud
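    The constant-proportion rebalancing rule described above (allocate a multiple of the cushion, i.e. the distance between the portfolio value and the amount needed for future obligations, to the risky asset) can be sketched as follows. This is a minimal illustration, not the report's implementation; all function names and parameter values are our own assumptions.

```python
import numpy as np

def simulate_cppi(risky_returns, v0=100.0, floor_frac=0.8,
                  multiplier=3.0, rf_rate=0.02, dt=1.0 / 252):
    """Minimal CPPI sketch: at each step, allocate multiplier * cushion
    to the risky asset and the rest to a risk-free bond.
    All parameter values here are illustrative, not from the report."""
    value = v0
    floor = floor_frac * v0              # amount protecting future obligations
    path = [value]
    for r in risky_returns:
        floor *= 1.0 + rf_rate * dt      # the floor accrues at the risk-free rate
        cushion = max(value - floor, 0.0)
        risky_alloc = min(multiplier * cushion, value)  # no leverage in this sketch
        safe_alloc = value - risky_alloc
        value = risky_alloc * (1.0 + r) + safe_alloc * (1.0 + rf_rate * dt)
        path.append(value)
    return np.array(path)

rng = np.random.default_rng(0)
daily_returns = rng.normal(0.0003, 0.01, size=252)  # hypothetical risky returns
path = simulate_cppi(daily_returns)
```

    A CPDO differs mainly in having no floor guaranteeing the principal; in this sketch, once the cushion is exhausted the portfolio simply stays fully in the bond.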

    Unbiased Tail Estimation by an Extension of the Generalized Pareto Distribution

    The generalized Pareto distribution (GPD) is probably the most popular model for inference on the tail of a distribution. The peaks-over-threshold methodology postulates the GPD as the natural model for excesses over a high threshold. However, for the GPD to fit such excesses well, the threshold should often be rather large, thereby restricting the model to only a small upper fraction of the data. For heavy-tailed distributions, we propose an extension of the GPD with a single additional parameter, motivated by a second-order refinement of the underlying Pareto-type model. Not only can the extended model be fitted to a larger fraction of the data, but the resulting maximum likelihood estimator of the tail index is also asymptotically unbiased. In practice, sample paths of the new tail index estimator as a function of the chosen threshold exhibit much larger regions of stability around the true value. We apply the method to daily log-returns of the euro-UK pound exchange rate. Some simulation results are presented as well. JRC.G.9-Econometrics and statistical support to antifraud
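    The standard peaks-over-threshold fit that the paper extends can be sketched with scipy. Note that this fits the plain GPD, not the extended model proposed in the paper, and the simulated heavy-tailed data are our own.

```python
import numpy as np
from scipy.stats import genpareto

# Simulate Pareto-type (heavy-tailed) data; the true tail index here is 0.5
rng = np.random.default_rng(42)
data = rng.pareto(a=2.0, size=5000) + 1.0

# Peaks-over-threshold: keep only the excesses over a high threshold
threshold = np.quantile(data, 0.95)
excesses = data[data > threshold] - threshold

# Fit the GPD to the excesses; the shape parameter c estimates the tail index
c, loc, scale = genpareto.fit(excesses, floc=0.0)
```

    In practice one would repeat this fit over a grid of thresholds and look for a region where the estimate of c is stable; the extended model of the paper is designed to widen that stable region.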

    Change in the DGS Level of Coverage Due to the 2008 Financial Crisis: First Basic Impact Evaluation

    The scope of this report is to assess the impact of increasing the DGS (deposit guarantee scheme) level of coverage for banks' deposits (from current levels up to €50K or possibly €100K) as a consequence of the 2008 financial crisis. JRC.G.9-Econometrics and statistical support to antifraud

    How resilient are the European regions? Evidence from the societal response to the 2008 financial crisis

    This report proposes a new approach for measuring regional resilience that goes beyond the assessment of traditional economic dimensions. It defines resilience as the societal ability to preserve and generate well-being in the presence of shocks and persistent structural changes in a sustainable manner, without hindering the well-being of future generations. The empirical exercise concentrates on the 2008 financial and economic crisis and the subsequent overall response of EU regions to the economic shock. We implement a three-step methodology: (i) select an extensive list of economic and non-economic variables that span the entire production process of societal well-being; (ii) compute regional resilience indicators based on the joint dynamic response of these variables to the crisis; (iii) identify the pre-crisis characteristics that differentiate resilient regions from non-resilient ones. Our analysis reveals substantial heterogeneity in resilience across the European regions. It confirms the importance of expanding the measurement strategy to a broader list of subjective and objective well-being measures (such as social inclusion, social capital, and quality of life). We show that observed resilience performance is highly dependent on the time horizon: resilience rankings of European regions are markedly different in the short and long run. The analysis of recovery time provides additional information on the strengths and weaknesses of regions, and it is largely dependent on the specific dimensions (variables) considered. Finally, our results highlight that certain country-level and regional characteristics, such as private sector credit flows and the gender employment gap, are strong predictors of resilient regional behaviour after the crisis. JRC.B.1-Finance and Economy

    Application of the Virtual Cell Based Assay for Simulation of in vitro Chemical fate following Acute Exposure

    In order to reliably assess the risk of adverse systemic effects of chemicals using in vitro methods, there is a need to simulate their absorption, distribution, metabolism, and excretion (ADME) in vivo to determine the target organ bioavailable concentration, and to compare this predicted internal concentration with an effective internal concentration. The effective concentration derived from in vitro toxicity studies should ideally take into account the fate of chemicals in the in vitro test system, since there can be significant differences between the applied nominal concentration and the in vitro bioavailable concentration. Whereas physiologically based kinetic (PBK) models have been developed to simulate ADME properties in vivo, the Virtual Cell Based Assay (VCBA) has been developed to simulate in vitro fate. In this project, the VCBA model, implemented in R, was applied to better interpret previously obtained in vitro acute toxicity data and to study how they compare to results from acute toxicity in vivo. For 178 chemicals previously tested in vitro with the 3T3 BALB/c cell line using the Neutral Red Uptake cytotoxicity assay, physicochemical parameters were retrieved and curated. Of these chemicals, 83 were run in the VCBA to simulate a 96-well microplate set-up with 5% serum supplementation, and their no effect concentration (NEC) and killing rate (Kr) were optimized against the experimental data. Analyses of the partitioning results show a strong relation with lipophilicity, expressed here as the logarithm of the octanol/water partitioning coefficient (logKow), with highly lipophilic chemicals binding mostly to medium lipid. Among the chemicals analysed, only benzene and xylene were modelled to evaporate by more than 10%, and these were also the chemicals with the highest degradation rates during the 48-hour assay. Chemical degradation depends not only on the air and water degradation rates but also on the extent of binding of the chemical.
Because of the strong binding of some chemicals to medium lipids and proteins, we analysed the impact of different serum supplementations (0%, 5% and 10%) on the dissolved chemical concentrations. As expected, for the more lipophilic chemicals different serum levels result in different dissolved concentrations, with lipid and protein binding reducing chemical loss by evaporation. Still, the lack of saturation modelling might mislead the 0% supplementation case, since the lipids coming solely from cell exudates are able to sequester chemical to a large extent; e.g. after 48 hours, 63% (1.2E-5 M) of dimethyldioctadecylammonium chloride was bound to lipid from the cells. Although highly lipophilic chemicals have a very small bioavailable fraction, the cellular uptake rate also depends on logKow, which compensates for this lack of bioavailability to some extent. Given the relevance of lipophilicity to in vitro chemical bioavailability, we developed an alert system based on logKow, creating four classes of chemicals for the experimental condition with 10% serum supplementation: logKow 5-10 (A), logKow <5 (B), logKow <2.5 (C), and logKow <2 (D). New chemicals from Classes A and B, which will be tested in vitro in the future, were first run in the VCBA without considering toxicity (NEC and Kr set to 0). VCBA simulations indicated that these chemicals are more than 50% bound to medium proteins, lipids and plastic. Therefore, for chemicals with a logKow falling in these classes, special care should be taken when extrapolating the obtained in vitro toxic concentrations to in vivo relevant doses.
A comparison of the VCBA-predicted dissolved concentrations corresponding to nominal IC50 values with the available rat oral LD50 values did not improve the previously obtained correlations. This is probably because other in vivo kinetic processes play an important role but were not modelled; they should be considered in future in vitro-in vivo extrapolations. A local sensitivity analysis showed the relatively low impact of the molar volume and molecular diffusion volume on the final dissolved concentration, supporting the use of approximate values obtained through the QSARs created here. The logKow and the Henry's law constant showed, as expected, a high impact on partitioning. The killing rate was also shown to have a relatively low impact on the final chemical concentration, indicating that although its optimization is important, finding the Kr that leads to the absolute best correlation between experimental and predicted concentration-viability curves is not imperative. The VCBA can be applied to virtually any chemical as long as the physicochemical data (for the fate model) and the experimental toxicity data (including cell growth/death) are available. However, being such a generic model, several assumptions had to be made: i) no distinction of chemical classes (inorganic, polar organic chemicals); ii) no consideration of metabolism; iii) no saturation kinetics; and iv) fixed external in vitro conditions. The advantage of having a generic model is that the VCBA can fit several experimental set-ups and should be used in an exploratory manner, to help refine experimental conditions. The partitioning predictions obtained here should be double-checked experimentally with a set of chemical compounds to better understand to what extent the VCBA represents chemicals of different properties. In future developments it will be important to reduce the uncertainties of the model, such as binding saturation, and to consider including other endpoints such as metabolic activity. JRC.F.3-Chemicals Safety and Alternative Methods
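    The four-class logKow alert system described above can be sketched as a simple classifier. The boundary handling (closed versus open intervals) is not specified in the report, so the cut-offs below are our interpretation.

```python
def logkow_alert_class(logkow: float) -> str:
    """Assign the logKow alert class for the 10% serum condition:
    A: 5 <= logKow <= 10, B: 2.5 <= logKow < 5,
    C: 2 <= logKow < 2.5, D: logKow < 2.
    Boundary handling is our assumption, not stated in the report."""
    if logkow >= 5.0:
        return "A"  # highly lipophilic: strong binding to medium lipids/plastic
    if logkow >= 2.5:
        return "B"  # still expected to be largely bound in the medium
    if logkow >= 2.0:
        return "C"
    return "D"      # mostly dissolved; nominal concentration more reliable
```

    For chemicals in classes A and B, the report recommends first running the VCBA without toxicity (NEC and Kr set to 0) before extrapolating in vitro concentrations to in vivo doses.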

    Evaluation of the availability and applicability of computational approaches in the safety assessment of nanomaterials: Final report of the Nanocomput project

    This is the final report of the Nanocomput project, the main aims of which were to review the current status of computational methods that are potentially useful for predicting the properties of engineered nanomaterials, and to assess their applicability in order to provide advice on the use of these approaches for the purposes of the REACH regulation. Since computational methods cover a broad range of models and tools, emphasis was placed on Quantitative Structure-Property Relationship (QSPR) and Quantitative Structure-Activity Relationship (QSAR) models, and their potential role in predicting nanomaterial properties. In addition, the status of a diverse array of compartment-based mathematical models was assessed. These models comprised toxicokinetic (TK), toxicodynamic (TD), in vitro and in vivo dosimetry, and environmental fate models. Finally, based on systematic reviews of the scientific literature, as well as the outputs of EU-funded research projects, recommendations for further research and development were made. The Nanocomput project was carried out by the European Commission's Joint Research Centre (JRC) for the Directorate-General (DG) for Internal Market, Industry, Entrepreneurship and SMEs (DG GROW) under the terms of an Administrative Arrangement between the JRC and DG GROW. The project lasted 39 months, from January 2014 to March 2017, and was supported by a steering group with representatives from DG GROW, DG Environment and the European Chemicals Agency (ECHA).JRC.F.3-Chemicals Safety and Alternative Methods

    Seurat-1: HepaRG, repeated and single dose exposure for Mitochondrial Health and LipidTox

    The purpose of this report is to describe the experimental procedure used to detect changes in mitochondrial membrane potential and lipid accumulation following exposure of HepaRG cells to various chemicals, by both repeated and single exposure. This procedure was created for SEURAT-1 Project runs 15 to 18 and was developed using live-cell imaging. JRC.I.5-Systems Toxicology

    A high throughput imaging database of toxicological effects of nanomaterials tested on HepaRG cells

    The large number of existing nanomaterials demands rapid and reliable methods for testing their potential toxicological effect on human health, preferably by means of relevant in vitro techniques in order to reduce testing on animals. Combining high-throughput workflows with automated high-content imaging techniques allows much more information to be derived from cell-based assays than the typical readouts (i.e. one measurement per well) of optical plate-readers. We present here a dataset comprising up to 14 different readouts (including viable cell count, cell membrane permeability, apoptotic cell death, mitochondrial membrane potential and steatosis) of the human hepatoma HepaRG cell line treated with a large set of nanomaterials, coatings and supernatants at different concentrations. Given its size, the database can be used in the development of in silico hazard assessment and prediction tools, or can be combined with toxicity results from other in vitro test systems. peer-reviewed

    Digital Transformation in Transport, Construction, Energy, Government and Public Administration

    This report provides an analysis of digital transformation (DT) in a selection of policy areas: transport, construction, energy, and digital government and public administration. In this report, DT refers to the profound changes taking place in all sectors of the economy and society as a result of the uptake and integration of digital technologies in every aspect of human life. Digital technologies have a growing impact on how an increasing share of the population lives, works, communicates and interacts socially. DT is expected to be a strategic policy area for years to come, and there is an urgent need to identify and address current and future challenges for the economy and society, evaluating impact and identifying areas requiring policy intervention. Because of the very wide range of interrelated domains to be considered when analysing DT, a multidisciplinary approach involving experts from different domains was adopted to produce this report. For each of the four sectors covered, the report presents an overview of DT, its enablers and barriers, and its economic and social impacts, and concludes with the way forward for policy and future research. JRC.B.6-Digital Economy