
    The t copula with Multiple Parameters of Degrees of Freedom: Bivariate Characteristics and Application to Risk Management

    The t copula is often used in risk management because it allows tail dependence between risks to be modelled and is simple to simulate and calibrate. However, the standard t copula is often criticized for its restriction to a single degrees-of-freedom (dof) parameter, which may limit its ability to model the tail dependence structure in the multivariate case. To overcome this problem, the grouped t copula was recently proposed, in which risks are grouped a priori so that each group has a standard t copula with its own dof parameter. In this paper we propose a grouped t copula in which each group consists of exactly one risk factor, so that no a priori grouping is required. The copula's characteristics in the bivariate case are studied. We explain simulation and calibration procedures, including a simulation study of the finite-sample properties of the maximum likelihood estimators and the Kendall's tau approximation. This new copula can differ significantly from the standard t copula in terms of risk measures such as tail dependence, value at risk and expected shortfall. Keywords: grouped t copula, tail dependence, risk management
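
    As an illustration of the construction, the following minimal Python/SciPy sketch simulates a t copula with a separate dof parameter per margin, following the shared-uniform mixing scheme used for grouped t copulas; the correlation matrix and dof values below are made-up examples rather than values from the paper.

```python
# Minimal sketch (assumed setup): simulate a t copula in which each margin
# has its own degrees-of-freedom (dof) parameter, via correlated normals
# and margin-specific chi-square mixing variables driven by one shared
# uniform. R and nu below are illustrative, not values from the paper.
import numpy as np
from scipy import stats

def sample_multi_dof_t_copula(R, nu, n, seed=None):
    """Draw n samples from a t copula with margin-specific dof in nu."""
    rng = np.random.default_rng(seed)
    d = len(nu)
    z = rng.multivariate_normal(np.zeros(d), R, size=n)  # correlated normals
    u = rng.uniform(size=n)                              # one shared uniform per sample
    out = np.empty((n, d))
    for k in range(d):
        s = stats.chi2.ppf(u, df=nu[k])                  # common mixing variable
        x = z[:, k] * np.sqrt(nu[k] / s)                 # margin is now t-distributed
        out[:, k] = stats.t.cdf(x, df=nu[k])             # map to the uniform scale
    return out

R = np.array([[1.0, 0.5], [0.5, 1.0]])
samples = sample_multi_dof_t_copula(R, nu=[3.0, 10.0], n=10_000, seed=42)
print(stats.kendalltau(samples[:, 0], samples[:, 1]))
```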

    Estimating time-to-onset of adverse drug reactions from spontaneous reporting databases.

    BACKGROUND: Analyzing the time-to-onset of adverse drug reactions from treatment exposure contributes to meeting pharmacovigilance objectives, i.e. identification and prevention. Post-marketing data are available from reporting systems. Times-to-onset from such databases are right-truncated because some patients who were exposed to the drug and who will eventually develop the adverse drug reaction may do so after the time of analysis, and thus are not included in the data. Awareness of the methods developed for right-truncated data is not widespread, and these methods have never been used in pharmacovigilance. We assess the use of appropriate methods, as well as the consequences of not taking right truncation into account (the naïve approach), on parametric maximum likelihood estimation of the time-to-onset distribution. METHODS: Both approaches, naïve or taking right truncation into account, were compared in a simulation study. We used twelve scenarios for the exponential distribution and twenty-four for the Weibull and log-logistic distributions. These scenarios are defined by a set of parameters: the parameters of the time-to-onset distribution, the probability of this distribution falling within an interval of observable values, and the sample size. An application to lymphomas reported after anti-TNF-α treatment in the French pharmacovigilance database is presented. RESULTS: The simulation study shows that the bias and mean squared error can in some instances be unacceptably large when right truncation is not considered, while the truncation-based estimator always performs better, often satisfactorily, and the gap between the two can be large. For the real dataset, the estimated expected time-to-onset differs by at least 58 weeks between the two approaches, which is not negligible. This difference is obtained for the Weibull model, under which the estimated probability of the distribution falling within an interval of observable values is close to 1. CONCLUSIONS: It is necessary to take right truncation into account when estimating the time-to-onset of adverse drug reactions from spontaneous reporting databases.
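
    To make the distinction concrete, here is a minimal Python/SciPy sketch contrasting the naïve and truncation-based Weibull fits: a right-truncated observation with onset time t and truncation bound r contributes f(t)/F(r) to the likelihood instead of f(t). The simulated data and parameter values are illustrative only.

```python
# Minimal sketch: Weibull MLE for time-to-onset with and without right
# truncation. Observation i has onset time t[i] and truncation bound r[i]
# (delay from exposure to the analysis date); only onsets with t <= r are
# ever reported. All data below are simulated for illustration.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(0)
t_all = stats.weibull_min.rvs(1.5, scale=100.0, size=5000, random_state=rng)
r_all = rng.uniform(20, 300, size=5000)       # truncation bounds
keep = t_all <= r_all                         # unreported cases never enter the data
t, r = t_all[keep], r_all[keep]

def neg_loglik(params, truncated):
    shape, scale = np.exp(params)             # log-parametrisation keeps both positive
    ll = stats.weibull_min.logpdf(t, shape, scale=scale)
    if truncated:
        ll -= stats.weibull_min.logcdf(r, shape, scale=scale)  # condition on T <= r
    return -ll.sum()

naive = optimize.minimize(neg_loglik, x0=[0.0, 4.0], args=(False,))
trunc = optimize.minimize(neg_loglik, x0=[0.0, 4.0], args=(True,))
print("naive (shape, scale):           ", np.exp(naive.x))
print("truncation-based (shape, scale):", np.exp(trunc.x))
```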

    Multivariate Prediction of Total Water Storage Changes Over West Africa from Multi-Satellite Data

    West African countries have been exposed to changes in rainfall patterns over the last decades, including a significant negative trend. This has adverse effects on the water resources of the region, for instance reduced freshwater availability. Assessing and predicting large-scale total water storage (TWS) variations is necessary for West Africa, given the environmental, social, and economic impacts. Hydrological models, however, may perform poorly over West Africa due to data scarcity. This study describes a new statistical, data-driven approach for predicting West African TWS changes from (past) gravity data obtained from the Gravity Recovery and Climate Experiment (GRACE), along with (concurrent) rainfall data from the Tropical Rainfall Measuring Mission (TRMM) and sea surface temperature (SST) data over the Atlantic, Pacific, and Indian Oceans. The proposed method therefore capitalizes on the availability of remotely sensed observations for predicting monthly TWS, a quantity that is hard to observe in the field but important for measuring the regional energy balance, as well as for agricultural and water resource management. Major teleconnections within these data sets were identified using independent component analysis and linked via low-degree autoregressive models to build a predictive framework. After a learning phase of 72 months, our approach predicted TWS from rainfall and SST data alone that fitted the observed GRACE-TWS better than the output of a global hydrological model. Our results indicate a fit of 79% and 67% for the first-year prediction of the two dominant annual and inter-annual modes of TWS variations. This fit reduces to 62% and 57% for the second year of projection. The proposed approach therefore shows strong potential to predict TWS over West Africa up to two years ahead. It also has the potential to bridge the present GRACE data gaps of one month roughly every 162 days, as well as a (hopefully limited) gap between GRACE and the GRACE Follow-On mission over West Africa. The method presented could also be used to generate a near real-time GRACE forecast over regions that exhibit strong teleconnections.
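
    A minimal sketch of this statistical chain, with synthetic stand-ins for the gridded predictors and the TWS index (the actual study uses GRACE, TRMM and SST fields and a more careful autoregressive formulation):

```python
# Sketch: extract a few independent modes from gridded predictor fields
# (stand-ins for TRMM rainfall and SST anomalies) with ICA, then fit a
# low-order autoregressive model with those modes as exogenous inputs to
# predict a TWS index. All data below are synthetic placeholders.
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n_months, n_grid = 120, 500
fields = rng.standard_normal((n_months, n_grid))          # stand-in predictor fields
tws = np.sin(2 * np.pi * np.arange(n_months) / 12) + 0.1 * rng.standard_normal(n_months)

modes = FastICA(n_components=4, random_state=0).fit_transform(fields)

p = 2                                                     # low autoregressive order
lags = [tws[p - k - 1:n_months - k - 1] for k in range(p)]  # lag-1..lag-p TWS values
X = np.column_stack(lags + [modes[p:]])                   # lags plus concurrent modes
y = tws[p:]
model = LinearRegression().fit(X[:72], y[:72])            # 72-month learning phase
print("hold-out R^2:", model.score(X[72:], y[72:]))
```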

    IGHV gene mutational status and 17p deletion are independent molecular predictors in a comprehensive clinical-biological prognostic model for overall survival prediction in chronic lymphocytic leukemia

    Prognostic indices for survival estimation based on clinical-demographic variables have previously been proposed for chronic lymphocytic leukemia (CLL) patients. Our objective was to test, in a large retrospective cohort of CLL patients, the prognostic power of biological and clinical-demographic variables in a comprehensive multivariate model. A new prognostic index is proposed.
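
    As a hedged illustration of the kind of multivariate model involved, the sketch below fits a Cox proportional-hazards model combining molecular markers with a clinical covariate using the third-party lifelines package; the column names and simulated data are placeholders, not the study's dataset.

```python
# Minimal sketch with lifelines: a multivariate Cox model mixing
# molecular and clinical covariates. Column names and simulated values
# are hypothetical placeholders.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 500
df = pd.DataFrame({
    "os_years": rng.exponential(8.0, n),      # overall survival time
    "event": rng.integers(0, 2, n),           # 1 = death observed, 0 = censored
    "ighv_unmutated": rng.integers(0, 2, n),  # molecular predictor
    "del17p": rng.integers(0, 2, n),          # molecular predictor
    "age": rng.normal(65, 10, n),             # clinical-demographic predictor
})
cph = CoxPHFitter().fit(df, duration_col="os_years", event_col="event")
print(cph.summary[["coef", "exp(coef)", "p"]])
```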

    Recurrent Signature Patterns in HIV-1 B Clade Envelope Glycoproteins Associated with either Early or Chronic Infections

    Here we have identified HIV-1 B clade Envelope (Env) amino acid signatures from early in infection that may be favored at transmission, as well as patterns of recurrent mutation in chronic infection that may reflect common pathways of immune evasion. To accomplish this, we compared thousands of sequences derived by single genome amplification from several hundred individuals that were sampled either early in infection or were chronically infected. Samples were divided at the outset into hypothesis-forming and validation sets, and we used phylogenetically corrected statistical strategies to identify signatures, systematically scanning all of Env. Signatures included single amino acids, glycosylation motifs, and multi-site patterns based on functional or structural groupings of amino acids. We identified signatures near the CCR5 co-receptor-binding region, near the CD4 binding site, and in the signal peptide and cytoplasmic domain, which may influence Env expression and processing. Two signature patterns associated with transmission were particularly interesting. The first was the most statistically robust signature, located at position 12 in the signal peptide. The second was the loss of an N-linked glycosylation site at positions 413–415; the presence of this site has recently been found to be associated with escape from potent and broad neutralizing antibodies, consistent with enabling a common pathway for immune escape during chronic infection. Its recurrent loss in early infection suggests it may impact fitness at the time of transmission or during early viral expansion. The signature patterns we identified implicate Env expression levels in selection at viral transmission or in early expansion, and suggest that immune evasion patterns that recur in many individuals during chronic infection, when antibodies are present, can be selected against when the infection is being established, prior to the adaptive immune response.
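
    For intuition, a toy per-position enrichment scan is sketched below: amino acid counts at each alignment column are compared between early and chronic sequence sets with Fisher's exact test. Note this omits the phylogenetic correction that the study's method applies, and the alignments are invented placeholders.

```python
# Toy sketch: scan alignment columns for residues enriched in early vs
# chronic sequences using Fisher's exact test. The real method also
# corrects for phylogenetic relatedness, which is omitted here; the
# sequences are invented five-residue toys.
from scipy.stats import fisher_exact

early   = ["MRVKS", "MRVTS", "MRAKS"]   # aligned early-infection sequences (toy)
chronic = ["MKVKN", "MKVTN", "MRVKN"]   # aligned chronic-infection sequences (toy)

for pos in range(len(early[0])):
    for aa in sorted(set(s[pos] for s in early + chronic)):
        a = sum(s[pos] == aa for s in early)      # early sequences with residue aa
        c = sum(s[pos] == aa for s in chronic)    # chronic sequences with residue aa
        table = [[a, len(early) - a], [c, len(chronic) - c]]
        odds, p = fisher_exact(table)
        print(f"position {pos + 1}, residue {aa}: p = {p:.3f}")
```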

    BUGS in the Analysis of Biodiversity Experiments: Species Richness and Composition Are of Similar Importance for Grassland Productivity

    The idea that species diversity can influence ecosystem functioning has been controversial, and its importance relative to compositional effects hotly debated. Unfortunately, assessing the relative importance of different explanatory variables in complex linear models is not simple. In this paper we assess the relative importance of species richness and species composition in a multilevel model analysis of net aboveground biomass production in grassland biodiversity experiments by estimating variance components for all explanatory variables. We compare the variance components using a recently introduced graphical Bayesian ANOVA. We show that while the use of test statistics and R² gives contradictory assessments, the variance components analysis reveals that species richness and composition are of roughly similar importance for primary productivity in grassland biodiversity experiments.
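
    A rough frequentist stand-in for the variance-components comparison can be sketched with crossed random effects for richness and composition in statsmodels; the paper itself fits a Bayesian multilevel model in BUGS, and the data below are simulated for illustration.

```python
# Rough sketch with statsmodels: crossed random effects for species
# richness and mixture composition, fitted as variance components. This
# is a frequentist stand-in for the paper's Bayesian multilevel model,
# and the dataset is simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 240
comp_levels = [f"mix{i}" for i in range(30)]
comp_effect = {m: rng.normal(0, 0.5) for m in comp_levels}  # true composition effects
df = pd.DataFrame({
    "richness": rng.choice([1, 2, 4, 8], n),
    "composition": rng.choice(comp_levels, n),
})
df["biomass"] = (0.4 * np.log2(df["richness"])
                 + df["composition"].map(comp_effect)
                 + rng.normal(0, 1, n))
df["all"] = 1  # a single group, so both factors enter purely as variance components

model = smf.mixedlm("biomass ~ 1", df, groups="all",
                    vc_formula={"richness": "0 + C(richness)",
                                "composition": "0 + C(composition)"})
print(model.fit().vcomp)  # estimated variance components for the two terms
```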

    Search for Standard Model Higgs Boson Production in Association with a W Boson using a Neural Network

    We present a search for standard model Higgs boson production in association with a W boson in proton-antiproton collisions (pp̄ → W±H → ℓνbb̄) at a center-of-mass energy of 1.96 TeV. The search employs data collected with the CDF II detector corresponding to an integrated luminosity of approximately 1.9 fb⁻¹. We select events consistent with a signature of a single charged lepton (e± or ÎŒ±), missing transverse energy, and two jets. Jets corresponding to bottom quarks are identified with a secondary vertex tagging method, a jet probability tagging method, and a neural network filter. We use kinematic information in an artificial neural network to improve discrimination between signal and background compared to previous analyses. The observed number of events and the neural network output distributions are consistent with the standard model background expectations, and we set 95% confidence level upper limits on the production cross section times branching fraction ranging from 1.2 to 1.1 pb, or 7.5 to 102 times the standard model expectation, for Higgs boson masses from 110 to 150 GeV/c², respectively.
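
    The core idea of the neural network discriminant, reduced to a toy example: train a small classifier on a few kinematic variables per event and read off its output as a signal/background score. The feature names and simulated distributions below are placeholders, not CDF data.

```python
# Toy sketch: a small multilayer perceptron separating "signal" from
# "background" events using three made-up kinematic features (e.g. dijet
# mass, missing ET, lepton pT). The distributions are invented.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n = 4000
signal     = rng.normal([115, 40, 45], [20, 15, 20], size=(n // 2, 3))
background = rng.normal([ 80, 30, 35], [35, 20, 25], size=(n // 2, 3))
X = np.vstack([signal, background])
y = np.array([1] * (n // 2) + [0] * (n // 2))   # 1 = signal, 0 = background

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
nn = MLPClassifier(hidden_layer_sizes=(10,), max_iter=1000, random_state=0).fit(X_tr, y_tr)
print("test accuracy:", nn.score(X_te, y_te))   # proxy for signal/background separation
```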

    Observation of exclusive charmonium production and gamma+gamma to mu+mu- in p+pbar collisions at sqrt{s} = 1.96 TeV

    We have observed the reactions p + p̄ → p + X + p̄, with X being a centrally produced J/ψ, ψ(2S), or χc0, and γγ → ÎŒ+ÎŒ-, in proton-antiproton collisions at √s = 1.96 TeV using the Run II Collider Detector at Fermilab (CDF). The event signature requires two oppositely charged central muons, and either no other particles or one additional photon detected. Exclusive vector meson production is as expected for elastic photoproduction, γ + p → J/ψ(ψ(2S)) + p, observed here for the first time in hadron-hadron collisions. Events with a J/ψ and an associated photon candidate are consistent with exclusive χc0 production through double pomeron exchange. The cross sections dσ/dy|y=0 for J/ψ, ψ(2S), and χc0 are 3.92 ± 0.25(stat) ± 0.52(syst) nb, 0.53 ± 0.09(stat) ± 0.10(syst) nb, and 76 ± 10(stat) ± 10(syst) nb, respectively, and the continuum is consistent with QED. We put an upper limit on the cross section for Odderon exchange in exclusive J/ψ production.
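
    The dimuon invariant mass underlying the J/ψ and ψ(2S) signals can be sketched directly from two muon four-vectors; the example momenta below are made up, while muon pairs from a real J/ψ would cluster near 3.097 GeV/c².

```python
# Sketch: dimuon invariant-mass reconstruction from two oppositely
# charged muon 3-momenta. The example kinematics are invented; muons
# from a real J/psi decay give masses near 3.097 GeV/c^2.
import numpy as np

M_MU = 0.1057  # muon mass, GeV/c^2

def invariant_mass(p1, p2):
    """Invariant mass of a muon pair given 3-momenta (px, py, pz) in GeV/c."""
    e1 = np.sqrt(M_MU**2 + np.dot(p1, p1))
    e2 = np.sqrt(M_MU**2 + np.dot(p2, p2))
    p = np.add(p1, p2)
    return np.sqrt((e1 + e2)**2 - np.dot(p, p))

mu_plus  = np.array([ 1.20,  0.30, 0.80])
mu_minus = np.array([-0.95, -0.10, 1.10])
print(f"m(mu+mu-) = {invariant_mass(mu_plus, mu_minus):.3f} GeV/c^2")
```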

    Search for the Production of Narrow tb Resonances in 1.9 fb-1 of ppbar Collisions at sqrt(s) = 1.96 TeV

    We present new limits on resonant tb̄ production in pp̄ collisions at √s = 1.96 TeV, using 1.9 fb⁻¹ of data recorded with the CDF II detector at the Fermilab Tevatron. We reconstruct a candidate tb̄ mass in events with a lepton, neutrino candidate, and two or three jets, and search for anomalous tb̄ production as modeled by W′ → tb̄. We set a new limit on a right-handed W′ with standard-model-like coupling, excluding any mass below 800 GeV/c² at 95% C.L. The cross section for any narrow, resonant tb̄ production between 750 and 950 GeV/c² is found to be less than 0.28 pb at 95% C.L. We also present an exclusion of the W′ coupling strength versus W′ mass over the range 300 to 950 GeV/c².
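
    A toy version of the counting-experiment logic behind such cross-section upper limits: with observed events, an expected background, a luminosity and a signal efficiency, a 95% upper bound on the signal yield translates into a bound in pb. The numbers below are invented, and the actual analysis uses a more complete treatment including systematic uncertainties.

```python
# Toy sketch of a 95% C.L. cross-section upper limit from a counting
# experiment, using a flat prior on the signal yield s and no systematic
# uncertainties. n_obs, b, lumi and eff are invented numbers.
import numpy as np
from scipy import integrate, stats

n_obs, b = 5, 4.2                   # observed events and expected background
lumi, eff = 1900.0, 0.02            # integrated luminosity (pb^-1) and signal efficiency

s_grid = np.linspace(0.0, 40.0, 2001)
post = stats.poisson.pmf(n_obs, b + s_grid)          # flat prior => likelihood shape
cdf = integrate.cumulative_trapezoid(post, s_grid, initial=0.0)
cdf /= cdf[-1]
s_up = s_grid[np.searchsorted(cdf, 0.95)]            # 95% credible upper bound on s
print(f"sigma x BR < {s_up / (lumi * eff):.2f} pb at 95% C.L.")
```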

    Data Descriptor: A global multiproxy database for temperature reconstructions of the Common Era

    Reproducible climate reconstructions of the Common Era (1 CE to present) are key to placing industrial-era warming into the context of natural climatic variability. Here we present a community-sourced database of temperature-sensitive proxy records from the PAGES2k initiative. The database gathers 692 records from 648 locations, including all continental regions and major ocean basins. The records are from trees, ice, sediment, corals, speleothems, documentary evidence, and other archives. They range in length from 50 to 2000 years, with a median of 547 years, while temporal resolution ranges from biweekly to centennial. Nearly half of the proxy time series are significantly correlated with HadCRUT4.2 surface temperature over the period 1850-2014. Global temperature composites show a remarkable degree of coherence between high- and low-resolution archives, with broadly similar patterns across archive types, terrestrial versus marine locations, and screening criteria. The database is suited to investigations of global and regional temperature variability over the Common Era, and is shared in the Linked Paleo Data (LiPD) format, including serializations in Matlab, R and Python.

    Since the pioneering work of D'Arrigo and Jacoby [1-3], as well as Mann et al. [4,5], temperature reconstructions of the Common Era have become a key component of climate assessments [6-9]. Such reconstructions depend strongly on the composition of the underlying network of climate proxies [10], and it is therefore critical for the climate community to have access to a community-vetted, quality-controlled database of temperature-sensitive records stored in a self-describing format. The Past Global Changes (PAGES) 2k consortium, a self-organized, international group of experts, recently assembled such a database and used it to reconstruct surface temperature over continental-scale regions [11] (hereafter, 'PAGES2k-2013').

    This data descriptor presents version 2.0.0 of the PAGES2k proxy temperature database (Data Citation 1). It augments the PAGES2k-2013 collection of terrestrial records with marine records assembled by the Ocean2k working group at centennial [12] and annual [13] time scales. In addition to these previously published data compilations, this version includes substantially more records, extensive new metadata, and validation. Furthermore, the selection criteria for records included in this version are applied more uniformly and transparently across regions, resulting in a more cohesive data product.

    This data descriptor describes the contents of the database and the criteria for inclusion, and quantifies the relation of each record to instrumental temperature. In addition, the paleotemperature time series are summarized as composites to highlight the most salient decadal- to centennial-scale behaviour of the dataset and to check mutual consistency between paleoclimate archives. We provide extensive Matlab code to probe the database: processing, filtering and aggregating it in various ways to investigate temperature variability over the Common Era. The unique approach to data stewardship and code-sharing employed here is designed to enable an unprecedented scale of investigation of the temperature history of the Common Era, by the scientific community and citizen-scientists alike.
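
    The screening step mentioned above (correlation with HadCRUT4.2 over 1850-2014) can be sketched in a few lines; the proxy and instrumental series here are synthetic stand-ins for a LiPD record and the gridded target.

```python
# Sketch of the proxy screening step: correlate an annually resolved
# proxy series against instrumental temperature over 1850-2014. Both
# series below are synthetic stand-ins.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
years = np.arange(1850, 2015)
instrumental = 0.005 * (years - 1850) + 0.2 * rng.standard_normal(years.size)
proxy = 1.5 * instrumental + 0.3 * rng.standard_normal(years.size)  # temperature-sensitive

r, p = stats.pearsonr(proxy, instrumental)
print(f"r = {r:.2f}, p = {p:.1e}  ->  {'passes' if p < 0.05 else 'fails'} screening")
```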