Geochemical comparison of K-T boundaries from the Northern and Southern Hemispheres
Closely spaced (cm-scale) traverses through the K-T boundary at Stevns Klint (Denmark), Woodside Creek (New Zealand) and a new Southern Hemisphere site at Richards Bay (South Africa) were subjected to trace element and isotopic (C, O, Sr) investigation. Intercomparison between these data sets, and correlation with the broad K-T database available in the literature, indicates that the chemistry of the boundary clays is not globally constant. Variations are more common than similarities, both in absolute concentrations and in interelement ratios. For example, the chondrite-normalized platinum-group element (PGE) patterns at Stevns Klint are unlike those at Woodside Creek, with the Pt/Os ratios showing the largest variation. These differences in PGE patterns are difficult to explain by secondary alteration of a layer that was originally chemically homogeneous, especially for elements of such limited crustal mobility as Os and Ir. The data also show that enhanced PGE concentrations, with trends similar to those of the boundary layers, occur in the Cretaceous sediments below the actual boundary at Stevns Klint and at all three New Zealand localities. This confirms the observations of others that the geochemistry of the boundary layers apparently does not record a unique component. It is suggested that terrestrial processes, e.g. an extended period of Late Cretaceous volcanism, can offer a satisfactory explanation for the features of the K-T geochemical anomaly. Such models would probably be more consistent with the observed stepwise, or gradual, palaeontological changes across this boundary than with the instant catastrophe predicted by the impact theory.
Cytomegalovirus (CMV) Disease Despite Weekly Preemptive CMV Strategy for Recipients of Solid Organ and Hematopoietic Stem Cell Transplantation
BACKGROUND:
Transplant recipients presenting with cytomegalovirus (CMV) disease at the time of diagnosis of CMV DNAemia pose a challenge to a preemptive CMV management strategy. However, the rate and risk factors of such failure remain uncertain.
METHODS:
Solid organ transplantation (SOT) and hematopoietic stem cell transplantation (HSCT) recipients with a first episode of CMV polymerase chain reaction (PCR) DNAemia within the first year posttransplantation were evaluated (n = 335). Patient records were reviewed for presence of CMV disease at the time of CMV DNAemia diagnosis. The distribution and prevalence of CMV disease were estimated, and the odds ratio (OR) of CMV disease was modeled using logistic regression.
RESULTS:
The prevalence of CMV disease increased for both SOT and HSCT with increasing diagnostic CMV PCR load and with screening intervals >14 days. The only independent risk factor in multivariate analysis was increasing CMV DNAemia load of the diagnostic CMV PCR (OR = 6.16; 95% confidence interval, 2.09–18.11). Among recipients receiving weekly screening (n = 147), 16 (10.8%) had CMV disease at the time of diagnosis of CMV DNAemia (median DNAemia load 628 IU/mL; interquartile range, 432–1274); 93.8% of these cases were HSCT and lung transplant recipients.
CONCLUSIONS:
Despite application of weekly screening intervals, HSCT and lung transplant recipients in particular presented with CMV disease at the time of diagnosis of CMV DNAemia. Additional research is warranted to improve the management of patients at risk of presenting with CMV disease at low levels of CMV DNAemia despite weekly screening.
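The logistic-regression step described in the Methods can be sketched in a few lines. The following is a hypothetical illustration with simulated data (the load distribution and coefficients are invented, not the study's), showing how an odds ratio of CMV disease per 1-log10 increase in diagnostic DNAemia load would be estimated:

```python
import math, random

# Hypothetical simulation, NOT the study's cohort: 335 episodes with a
# disease probability that rises with log10 DNAemia load.
random.seed(0)

loads = [10 ** random.uniform(2.0, 5.0) for _ in range(335)]  # IU/mL (invented)
xbar = sum(math.log10(v) for v in loads) / len(loads)
x = [math.log10(v) - xbar for v in loads]                     # centred log10 load
# Assumed true coefficients, used only to generate the synthetic outcomes.
y = [1 if random.random() < 1 / (1 + math.exp(-(-0.7 + 1.8 * xi))) else 0
     for xi in x]

def fit_logistic(x, y, lr=0.1, steps=3000):
    """Maximum-likelihood logistic fit of y on x via plain gradient ascent."""
    b0 = b1 = 0.0
    n = len(x)
    for _ in range(steps):
        g0 = g1 = 0.0
        for xi, yi in zip(x, y):
            p = 1 / (1 + math.exp(-(b0 + b1 * xi)))  # fitted probability
            g0 += yi - p                              # score for intercept
            g1 += (yi - p) * xi                       # score for slope
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

b0, b1 = fit_logistic(x, y)
print("odds ratio per 1-log10 increase in DNAemia load:", round(math.exp(b1), 2))
```

Exponentiating the fitted slope gives the odds ratio per unit of the covariate, which is how a figure such as OR = 6.16 in the abstract is read.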
Local linear density estimation for filtered survival data, with bias correction
A class of local linear kernel density estimators based on weighted least-squares kernel estimation is considered within the framework of Aalen's multiplicative intensity model. This model includes the filtered data model that, in turn, allows for truncation and/or censoring in addition to accommodating unusual patterns of exposure as well as occurrence. It is shown that the local linear estimators corresponding to all different weightings have the same pointwise asymptotic properties. However, the weighting previously used in the literature in the i.i.d. case is seen to be far from optimal when it comes to exposure robustness, and a simple alternative weighting is to be preferred. Indeed, this weighting must, in effect, be well chosen both in a 'pilot' estimator of the survival function and in the main estimator itself. We also investigate multiplicative and additive bias-correction methods within our framework. The multiplicative bias-correction method proves to be the best in a simulation study comparing the performance of the considered estimators. An example concerning old-age mortality demonstrates the importance of the improvements provided.
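The idea behind local linear density estimation can be illustrated with a simple sketch. This is not the paper's estimator (which works within Aalen's multiplicative intensity model); it is a plain i.i.d. analogue: bin the sample, then at each evaluation point fit a kernel-weighted least-squares line to the histogram heights and read off the intercept, which removes the boundary bias a naive kernel smoother would have:

```python
import math, random

# Illustrative i.i.d. sketch, not the paper's filtered-data estimator.
random.seed(1)
sample = [random.expovariate(1.0) for _ in range(5000)]  # true density exp(-t), t >= 0

def local_linear_density(sample, x, h=0.3, nbins=200, tmax=8.0):
    """Local linear density estimate at x from binned data (Gaussian kernel)."""
    width = tmax / nbins
    counts = [0] * nbins
    for s in sample:
        if s < tmax:
            counts[int(s / width)] += 1
    n = len(sample)
    # Weighted least squares of histogram heights y_j on (t_j - x):
    # the intercept of the fitted line is the density estimate at x.
    S0 = S1 = S2 = T0 = T1 = 0.0
    for j in range(nbins):
        t = (j + 0.5) * width
        y = counts[j] / (n * width)              # raw histogram height
        w = math.exp(-0.5 * ((t - x) / h) ** 2)  # kernel weight
        d = t - x
        S0 += w; S1 += w * d; S2 += w * d * d
        T0 += w * y; T1 += w * d * y
    det = S0 * S2 - S1 * S1
    return (S2 * T0 - S1 * T1) / det             # intercept of the weighted fit

d0 = local_linear_density(sample, 0.0)  # true value 1.0; a boundary point
d1 = local_linear_density(sample, 1.0)  # true value exp(-1), about 0.368
print(round(d0, 3), round(d1, 3))
```

Note that at the boundary point t = 0 the design is one-sided, yet the linear term of the fit absorbs the asymmetry, which is exactly the advantage of local linear over a plain kernel average there.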
Risk Factors for Failure of Primary (Val)ganciclovir Prophylaxis Against Cytomegalovirus Infection and Disease in Solid Organ Transplant Recipients
Background: Rates and risk factors for cytomegalovirus (CMV) prophylaxis breakthrough and discontinuation were investigated, given uncertainty regarding optimal dosing for CMV primary (val)ganciclovir prophylaxis after solid organ transplantation (SOT). Methods: Recipients transplanted from 2012 to 2016 and initiated on primary prophylaxis were followed until 90 days post-transplantation. A (val)ganciclovir prophylaxis score was calculated for each patient per day during the follow-up time (FUT; a score of 100 corresponding to the manufacturers' recommended dose for a given estimated glomerular filtration rate [eGFR]). Cox models were used to estimate hazard ratios (HRs), adjusted for relevant risk factors. Results: Of 585 SOTs (311 kidney, 117 liver, 106 lung, 51 heart) included, 38/585 (6.5%) experienced prophylaxis breakthrough and 35/585 (6.0%) discontinued prophylaxis for other reasons. CMV IgG donor+/recipient- mismatch (adjusted HR [aHR], 5.37; 95% confidence interval [CI], 2.63 to 10.98; P < .001) and an increasing percentage of FUT with a prophylaxis score <90 (aHR, 1.16; 95% CI, 1.04 to 1.29; P = .01 per 10% longer FUT with score <90) were associated with an increased risk of breakthrough. Lung recipients were at a significantly increased risk of premature prophylaxis discontinuation (aHR, 20.2 vs kidney; 95% CI, 3.34 to 121.9; P = .001), mainly due to liver toxicity or myelotoxicity. Conclusions: Recipients of eGFR-adjusted prophylaxis doses below those recommended by manufacturers were at an increased risk of prophylaxis breakthrough, emphasizing the importance of accurate dose adjustment according to the latest eGFR and the need for novel, less toxic agents.
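The Cox modelling step described in the Methods can be sketched as follows. This is a minimal, synthetic illustration (the data, event rates and effect size are invented, not the study's cohort) of estimating a hazard ratio for one binary risk factor, such as D+/R- mismatch, by maximising the Cox partial likelihood:

```python
import math, random

# Synthetic cohort, NOT the study's data: 585 recipients, ~20% carrying a
# binary risk factor that multiplies the breakthrough hazard by an assumed 5.
random.seed(2)
true_hr = 5.0
data = []                                    # (time, event, x) triples
for _ in range(585):
    x = 1 if random.random() < 0.2 else 0    # hypothetical high-risk indicator
    rate = 0.002 * (true_hr if x else 1.0)   # assumed daily breakthrough hazard
    t = random.expovariate(rate)
    data.append((min(t, 90.0), 1 if t <= 90.0 else 0, x))  # censor at day 90

def cox_beta(data, iters=25):
    """Newton's method on the Cox partial log-likelihood (one covariate, no ties)."""
    data = sorted(data)                      # ascending time: risk set is a suffix
    beta = 0.0
    for _ in range(iters):
        grad = hess = 0.0
        for i, (t, event, xi) in enumerate(data):
            if not event:
                continue                     # censored subjects contribute no term
            s0 = s1 = s2 = 0.0
            for _, _, xj in data[i:]:        # everyone still at risk at time t
                e = math.exp(beta * xj)
                s0 += e; s1 += xj * e; s2 += xj * xj * e
            grad += xi - s1 / s0             # score contribution
            hess -= s2 / s0 - (s1 / s0) ** 2 # negative information contribution
        beta -= grad / hess                  # Newton update
    return beta

beta = cox_beta(data)
print("estimated hazard ratio:", round(math.exp(beta), 2))
```

The exponentiated coefficient is the hazard ratio, the quantity reported as aHR in the abstract; a production analysis would of course use an established survival package rather than this hand-rolled fit.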
Smoothing survival densities in practice
Many nonparametric smoothing procedures consider independent identically distributed stochastic variables. There are also many important nonparametric smoothing applications where the data is more complicated. Survival data or filtered data, defined as following Aalen's multiplicative hazard model and aggregated versions of this model, are considered. Aalen's model based on counting process theory allows multiple left truncations and multiple right censorings to be present in the data. This type of filtering is omnipresent in biostatistical and demographical applications, where people can join a study, leave the study and perhaps join the study again. The estimation methodology is based on a recent class of local linear density estimators. A new stable bandwidth selector is developed for these estimators. A data application to aggregated national mortality data is provided, where migration into and out of the country corresponds to left truncation and right censoring, respectively. The aggregated mortality data study illustrates that the new practical density estimators provide an important extra element in the visual toolbox for understanding survival data.
In-sample forecasting: A brief review and new algorithms
Statistical methods often distinguish between in-sample and out-of-sample approaches. In particular this is the case when time is involved. Then often time series methods are proposed that extrapolate past patterns into the future via complicated recursion formulas. Standard statistical inference is on the other hand concerned with estimating parameters within the given sample. This review paper is about a statistical methodology, where all parameters are estimated in-sample while producing a forecast out-of-sample without recursion or extrapolation. A new super-simulation algorithm ensures a faster implementation of the simplest and perhaps most important version of in-sample forecasting
Improving automobile insurance ratemaking using telematics: incorporating mileage and driver behaviour data
We show how data collected from a GPS device can be incorporated into motor insurance ratemaking. The calculation of premium rates based upon driver behaviour represents an opportunity for the insurance sector. Our approach is based on count data regression models for frequency, where exposure is driven by the distance travelled and additional parameters that capture characteristics of automobile usage and which may affect claiming behaviour. We propose implementing a classical frequency model that is updated with telematics information. We illustrate the method using real data from usage-based insurance policies. Results show that not only the distance travelled by the driver, but also driver habits, significantly influence the expected number of accidents and, hence, the cost of insurance coverage. This paper also provides a transition-pricing methodology that carries over the knowledge and experience the company already had before the telematics data arrived into a ratemaking framework that incorporates telematics information.
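The count-regression idea can be sketched as follows. This is a hypothetical illustration (data, coefficients, and the covariate name are invented): a Poisson model for claim counts in which the distance travelled enters as an exposure offset, and one telematics covariate, here a made-up "share of night-time driving", shifts the expected frequency:

```python
import math, random

# Invented portfolio, NOT the paper's data: 2000 policies whose claim counts
# follow a Poisson law with distance as exposure and an assumed night-driving effect.
random.seed(3)
policies = []
for _ in range(2000):
    km = random.uniform(1000, 30000)         # annual distance travelled
    night = random.random()                  # hypothetical night-driving share
    lam = km / 10000 * math.exp(-2.0 + 1.0 * night)  # assumed true frequency
    # Draw a Poisson count by CDF inversion (stdlib has no Poisson sampler).
    u, k, p = random.random(), 0, math.exp(-lam)
    cdf = p
    while u > cdf:
        k += 1
        p *= lam / k
        cdf += p
    policies.append((km, night, k))

def fit_poisson_offset(policies, steps=200):
    """Newton's method for log E[claims] = b0 + b1*night + log(km/10000)."""
    b0 = b1 = 0.0
    for _ in range(steps):
        g0 = g1 = h00 = h01 = h11 = 0.0
        for km, night, y in policies:
            mu = math.exp(b0 + b1 * night + math.log(km / 10000))
            g0 += y - mu                     # score for intercept
            g1 += (y - mu) * night           # score for the telematics covariate
            h00 += mu; h01 += mu * night; h11 += mu * night * night
        det = h00 * h11 - h01 * h01          # 2x2 information matrix solve
        b0 += (h11 * g0 - h01 * g1) / det
        b1 += (h00 * g1 - h01 * g0) / det
    return b0, b1

b0, b1 = fit_poisson_offset(policies)
print("multiplicative effect of full night driving:", round(math.exp(b1), 2))
```

The offset term log(km/10000) is what makes distance act as exposure rather than as an ordinary covariate: doubling the kilometres doubles the expected claim count at fixed behaviour, which matches the exposure-driven frequency model the abstract describes.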