74 research outputs found
Reduced Vitamin K Status as a Potentially Modifiable Risk Factor of Severe Coronavirus Disease 2019
BACKGROUND: Respiratory failure and thromboembolism are frequent in SARS-CoV-2-infected patients. Vitamin K activates both hepatic coagulation factors and the extrahepatic endothelial anticoagulant protein S, required for thrombosis prevention. In times of vitamin K insufficiency, hepatic procoagulant factors are preferentially activated over extrahepatic proteins. Vitamin K also activates matrix Gla protein (MGP), which protects against pulmonary and vascular elastic fiber damage. We hypothesized that vitamin K may be implicated in coronavirus disease 2019 (COVID-19), linking pulmonary and thromboembolic disease. METHODS: 135 hospitalized COVID-19 patients were compared with 184 historical controls. Poor outcome was defined as invasive ventilation and/or death. Inactive vitamin K-dependent MGP (dp-ucMGP) and prothrombin (PIVKA-II) were measured, inversely related to extrahepatic and hepatic vitamin K status, respectively. Desmosine was measured to quantify the rate of elastic fiber degradation. Arterial calcification severity was assessed by computed tomography. RESULTS: Dp-ucMGP was elevated in COVID-19 patients compared to controls (p<0.001), with even higher dp-ucMGP in patients with poor outcomes (p<0.001). PIVKA-II was normal in 82.1% of patients. Dp-ucMGP was correlated with desmosine (p<0.001), and with coronary artery (p=0.002) and thoracic aortic (p<0.001) calcification scores. CONCLUSIONS: Dp-ucMGP was severely increased in COVID-19 patients, indicating extrahepatic vitamin K insufficiency, which was related to poor outcome, while hepatic procoagulant factor II remained unaffected. These data suggest a mechanism of pneumonia-induced extrahepatic vitamin K depletion leading to accelerated elastic fiber damage and thrombosis in severe COVID-19 due to impaired activation of MGP and endothelial protein S, respectively. A clinical trial could assess whether vitamin K administration improves COVID-19 outcomes.
Dysregulated innate and adaptive immune responses discriminate disease severity in COVID-19
The clinical spectrum of COVID-19 varies, and the differences in host response that characterize this variation have not been fully elucidated. COVID-19 disease severity correlates with an excessive pro-inflammatory immune response and profound lymphopenia. Inflammatory responses according to disease severity were explored by plasma cytokine measurements and proteomics analysis in 147 COVID-19 patients. Furthermore, peripheral blood mononuclear cell cytokine production assays and whole blood flow cytometry were performed. Results confirm a hyperinflammatory innate immune state, while highlighting hepatocyte growth factor and stem cell factor as potential biomarkers for disease severity. Clustering analysis reveals no specific inflammatory endotypes in COVID-19 patients. Functional assays reveal abrogated adaptive cytokine production (interferon-gamma, interleukin-17 and interleukin-22) and prominent T cell exhaustion in critically ill patients, whereas innate immune responses were intact or hyperresponsive. Collectively, this extensive analysis provides comprehensive insight into the pathobiology of severe to critical COVID-19 and highlights potential biomarkers of disease severity.
Gender differences in the use of cardiovascular interventions in HIV-positive persons; the D:A:D Study
Peer reviewed
Development and Validation of a Risk Score for Chronic Kidney Disease in HIV Infection Using Prospective Cohort Data from the D:A:D Study
Ristola M. is a member of the D:A:D Study Group, Royal Free Hospital Clinical Cohort, INSIGHT Study Group, SMART Study Group, and ESPRIT Study Group working groups. Background Chronic kidney disease (CKD) is a major health issue for HIV-positive individuals, associated with increased morbidity and mortality. Development and implementation of a risk score model for CKD would allow comparison of the risks and benefits of adding potentially nephrotoxic antiretrovirals to a treatment regimen and would identify those at greatest risk of CKD. The aims of this study were to develop a simple, externally validated, and widely applicable long-term risk score model for CKD in HIV-positive individuals that can guide decision making in clinical practice. Methods and Findings A total of 17,954 HIV-positive individuals from the Data Collection on Adverse Events of Anti-HIV Drugs (D:A:D) study with >= 3 estimated glomerular filtration rate (eGFR) values after 1 January 2004 were included. Baseline was defined as the first eGFR > 60 ml/min/1.73 m2 after 1 January 2004; individuals with exposure to tenofovir, atazanavir, atazanavir/ritonavir, lopinavir/ritonavir, or other boosted protease inhibitors before baseline were excluded. CKD was defined as confirmed (> 3 mo apart) eGFR <= 60 ml/min/1.73 m2. In the D:A:D study, 641 individuals developed CKD during 103,185 person-years of follow-up (PYFU; incidence 6.2/1,000 PYFU, 95% CI 5.7-6.7; median follow-up 6.1 y, range 0.3-9.1 y). Older age, intravenous drug use, hepatitis C coinfection, lower baseline eGFR, female gender, lower CD4 count nadir, hypertension, diabetes, and cardiovascular disease (CVD) predicted CKD. The adjusted incidence rate ratios of these nine categorical variables were scaled and summed to create the risk score. The median risk score at baseline was -2 (interquartile range -4 to 2). There was a 1:393 chance of developing CKD in the next 5 y in the low risk group, with substantially higher chances in the medium and high risk groups (risk score >= 5, 505 events in the high risk group).
Number needed to harm (NNTH) at 5 y when starting unboosted atazanavir or lopinavir/ritonavir among those with a low risk score was 1,702 (95% CI 1,166-3,367); NNTH was 202 (95% CI 159-278) and 21 (95% CI 19-23), respectively, for those with a medium and high risk score. NNTH was 739 (95% CI 506-1,462), 88 (95% CI 69-121), and 9 (95% CI 8-10) for those with a low, medium, and high risk score, respectively, starting tenofovir, atazanavir/ritonavir, or another boosted protease inhibitor. The Royal Free Hospital Clinic Cohort included 2,548 individuals, of whom 94 (3.7%) developed CKD during 18,376 PYFU (median follow-up 7.4 y, range 0.3-12.7 y). Of 2,013 individuals included from the SMART/ESPRIT control arms, 32 (1.6%) developed CKD during 8,452 PYFU (median follow-up 4.1 y, range 0.6-8.1 y). External validation showed that the risk score predicted well in these cohorts. Limitations of this study included limited data on race and no information on proteinuria. Conclusions Both traditional and HIV-related risk factors were predictive of CKD. These factors were used to develop an externally validated risk score for CKD in HIV infection that has direct clinical relevance, allowing patients and clinicians to weigh the benefits of certain antiretrovirals against the risk of CKD and to identify those at greatest risk of CKD. Peer reviewed
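The NNTH figures above follow from simple arithmetic: the NNTH is the reciprocal of the absolute risk increase between exposed and unexposed groups. A minimal Python sketch, using the abstract's 1:393 baseline 5-year risk and a hypothetical excess risk chosen only to reproduce the reported NNTH of 1,702 (the study's actual exposed-group risks are not given in the abstract):

```python
def nnth(risk_exposed, risk_unexposed):
    """Number needed to harm: reciprocal of the absolute risk increase."""
    ari = risk_exposed - risk_unexposed
    if ari <= 0:
        raise ValueError("no excess risk: NNTH undefined")
    return 1.0 / ari

# Baseline 5-y CKD risk in the low-risk group, from the abstract: 1 in 393.
baseline = 1 / 393
# Hypothetical exposed-group risk, for illustration only (not from the study):
exposed = baseline + 1 / 1702  # implies NNTH ~ 1,702, as reported for
                               # unboosted atazanavir in the low-risk group
print(round(nnth(exposed, baseline)))  # -> 1702
```

Read this way, the medium- and high-risk NNTH values of 202 and 21 correspond to absolute risk increases of roughly 0.5% and 5% over 5 years.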
Melioidosis in travelers: An analysis of Dutch melioidosis registry data 1985–2018
Background: Melioidosis, caused by the Gram-negative bacterium Burkholderia pseudomallei, is an opportunistic infection across the tropics. Here, we provide a systematic overview of imported human cases in a non-endemic country over a 33-year period. Methods: All 5
Non-AIDS defining cancers in the D:A:D Study-time trends and predictors of survival : a cohort study
BACKGROUND: Non-AIDS-defining cancers (NADC) are an important cause of morbidity and mortality in HIV-positive individuals. Using data from a large international cohort of HIV-positive individuals, we described the incidence of NADC from 2004 to 2010, subsequent mortality, and predictors of both. METHODS: Individuals were followed from 1 January 2004 or enrolment in the study, whichever was later, until the earliest of a new NADC, 1 February 2010, death, or six months after the patient's last visit. Incidence rates were estimated for each year of follow-up, overall and stratified by gender, age, and mode of HIV acquisition. Cumulative risk of mortality following NADC diagnosis was summarised using Kaplan-Meier methods, with follow-up for these analyses from the date of NADC diagnosis until the patient's death, 1 February 2010, or 6 months after the patient's last visit. Factors associated with mortality following NADC diagnosis were identified using multivariable Cox proportional hazards regression. RESULTS: Over 176,775 person-years (PY), 880 (2.1%) patients developed a new NADC (incidence: 4.98/1,000 PY [95% confidence interval 4.65, 5.31]). Over a third of these patients (327, 37.2%) had died by 1 February 2010. Time trends for lung cancer, anal cancer, and Hodgkin's lymphoma were broadly consistent. Kaplan-Meier cumulative mortality estimates at 1, 3, and 5 years after NADC diagnosis were 28.2% [95% CI 25.1-31.2], 42.0% [38.2-45.8], and 47.3% [42.4-52.2], respectively. Significant predictors of poorer survival after diagnosis of NADC were lung cancer (compared to other cancer types), male gender, non-white ethnicity, and smoking status. Later year of diagnosis and higher CD4 count at NADC diagnosis were associated with improved survival. The incidence of NADC remained stable over the period 2004-2010 in this large observational cohort. CONCLUSIONS: The prognosis after diagnosis of NADC, in particular lung cancer and disseminated cancer, is poor but has improved somewhat over time. Modifiable risk factors, such as smoking and low CD4 counts, were associated with mortality following a diagnosis of NADC.
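The crude incidence rate quoted above is events divided by person-years of follow-up, and its 95% CI can be recovered with a normal approximation to the Poisson count (standard error of the count = sqrt(events)). A minimal Python sketch reproducing the abstract's figures of 880 NADC over 176,775 PY:

```python
from math import sqrt

def incidence_rate(events, person_years, per=1000):
    """Crude incidence rate per `per` person-years, with a
    normal-approximation 95% CI (SE of a Poisson count is sqrt(events))."""
    rate = events / person_years * per
    se = sqrt(events) / person_years * per
    return rate, rate - 1.96 * se, rate + 1.96 * se

rate, lo, hi = incidence_rate(880, 176_775)
print(f"{rate:.2f} ({lo:.2f}-{hi:.2f}) per 1000 PY")
# -> 4.98 (4.65-5.31) per 1000 PY, matching the abstract
```

The normal approximation is adequate here because the event count is large; for small counts an exact Poisson CI would be preferable.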
Changes in first-line cART regimens and short-term clinical outcome between 1996 and 2010 in The Netherlands
OBJECTIVES: To document progress in HIV treatment in the Netherlands since 1996 by reviewing changing patterns of cART use and relating these to trends in patients' short-term clinical outcomes between 1996 and 2010. DESIGN AND METHODS: Data from 10,278 patients in the Dutch ATHENA national observational cohort, covering 1996-2010, were analysed. The annual number of patients starting each type of regimen was quantified. Trends in the following outcomes were described: i) recovery of >=150 CD4 cells/mm3 within 12 months of starting cART; ii) achieving viral load (VL) suppression <=1,000 copies/ml within 12 months of starting cART; iii) switching from a first-line to a second-line regimen within three years of starting treatment; and iv) all-cause mortality rate per 100 person-years within three years of starting treatment. RESULTS: Between 1996 and 2010, first-line regimens changed from lamivudine/zidovudine-based or lamivudine/stavudine-based regimens with unboosted PIs to tenofovir with either emtricitabine or lamivudine, combined with NNRTIs. Mortality rates did not change significantly over time. VL suppression and CD4 recovery improved over time, and the incidence of switching due to virological failure and toxicity more than halved between 1996 and 2010. These effects appear to be related to the use of new regimens rather than to improvements in clinical care. CONCLUSION: The use of first-line cART in the Netherlands closely follows changes in guidelines, to the benefit of patients. While there was no significant improvement in mortality, newer drugs with better tolerability and simpler dosing resulted in improved immunological and virological recovery and reduced the incidence of switching due to toxicity and virological failure.
The release of endotoxin, TNF and IL-6 during the antibiotic treatment of experimental Gram-negative sepsis
To evaluate the role of different antibiotics in the release of endotoxin and the production of tumor necrosis factor-α (TNF) and interleukin 6 (IL-6) during the treatment of experimental Escherichia coli septic peritonitis, we obtained serial blood samples from septic rats treated with placebo, ceftazidime, aztreonam or imipenem. We also studied the effect of taurolidine, given alone or in combination with aztreonam, on the release of endotoxin and IL-6. Despite decreasing viable counts after treatment with ceftazidime, aztreonam or imipenem, levels of free endotoxin increased in all animals. We did not find any significant differences in the extent of plasma endotoxin release between the treatment groups. However, we did find significant differences in IL-6 production between the treatment groups. After 2 h of treatment, IL-6 levels had increased in all animals, with the highest levels in the imipenem-treated animals; thereafter, IL-6 levels decreased again in the rats treated with imipenem or ceftazidime, while in the rats treated with placebo or aztreonam IL-6 levels increased further. This increase in IL-6 levels was associated with acute mortality. In all antibiotic-treated animals, TNF levels decreased significantly during therapy. After 2 h of treatment, TNF levels were highest in the imipenem-treated rats. The high levels of TNF and IL-6 at t = 2 h in the imipenem group were thought to result from early bacterial lysis, while the late increase in IL-6 levels in the aztreonam-treated animals was thought to result from the formation of long bacterial filaments in the abdominal cavity. In the present study, treatment with taurolidine could not prevent or inhibit the release of endotoxin or IL-6; moreover, taurolidine, alone or in combination with aztreonam, unexpectedly caused a dramatic increase in IL-6 levels, which was associated with increased acute mortality.
We conclude that antibiotics can cause the release of endotoxin in vivo despite decreasing levels of bacteremia. It is suggested that circumstances in which antibiotic-induced filamentation occurs are also conditions that yield excessive (local) LPS release. Our data also suggest that there is no clear relationship between plasma free endotoxin levels and mortality, and that the most important inflammatory compartment in this model was the abdominal cavity.