
    Troponin T in COVID-19 hospitalized patients: Kinetics matter

    Background: Coronavirus disease 2019 (COVID-19) emerged as a worldwide health crisis, overwhelming healthcare systems. Elevated cardiac troponin T (cTn T) at admission has been associated with increased in-hospital mortality. However, data addressing the role of cTn T in major adverse cardiovascular events (MACE) in COVID-19 are scarce. Therefore, we assessed the role of baseline cTn T and cTn T kinetics in predicting MACE and in-hospital mortality in COVID-19. Methods: Three hundred and ten patients were included prospectively. One hundred and eight patients were excluded due to incomplete records. Patients were divided into three groups according to cTn T kinetics: ascending, descending, and constant. The cTn T slope was defined as the change in cTn T divided by the elapsed time. The primary and secondary endpoints were MACE and in-hospital mortality, respectively. Results: Two hundred and two patients were included in the analysis (mean age 64.4 ± 16.7 years; 119 [58.9%] males). Mean duration of hospitalization was 14.0 ± 12.3 days. Sixty (29.7%) patients had MACE, and 40 (19.8%) patients died. Baseline cTn T predicted both endpoints (MACE: p = 0.047, hazard ratio [HR] 1.805, 95% confidence interval [CI] 1.009–3.231; mortality: p = 0.009, HR 2.322, 95% CI 1.234–4.369). An increased cTn T slope predicted mortality (p = 0.041, HR 1.006, 95% CI 1.000–1.011). Constant cTn T was associated with lower rates of MACE and mortality (p < 0.001, HR 3.080, 95% CI 1.914–4.954 and p < 0.001, HR 2.851, 95% CI 1.828–4.447, respectively). Conclusions: The present study emphasizes the additional value of cTn T testing in COVID-19 patients for risk stratification and for improving diagnostic and management pathways.
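    The slope definition above lends itself to a short illustration. Below is a minimal sketch of the calculation, assuming serial cTn T measurements with timestamps; the tolerance used to label a kinetic pattern as "constant" is a hypothetical placeholder, not the study's actual cut-off.

```python
from datetime import datetime

def ctnt_slope(measurements):
    """cTn T slope: change in cTn T divided by elapsed time (ng/L per day)."""
    (t0, v0), (t1, v1) = measurements[0], measurements[-1]
    days = (t1 - t0).total_seconds() / 86400
    return (v1 - v0) / days

def classify_kinetics(slope, tolerance=1.0):
    """Label the kinetic pattern. The tolerance (ng/L per day) separating
    'constant' from rising/falling is an assumed value, not the study's."""
    if slope > tolerance:
        return "ascending"
    if slope < -tolerance:
        return "descending"
    return "constant"

series = [(datetime(2020, 4, 1, 8), 25.0),   # admission cTn T, ng/L
          (datetime(2020, 4, 4, 8), 64.0)]   # day-3 cTn T, ng/L
s = ctnt_slope(series)
print(f"slope = {s:.1f} ng/L/day -> {classify_kinetics(s)}")
```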

    Effect of anti-interleukin drugs in patients with COVID-19 and signs of cytokine release syndrome (COV-AID): a factorial, randomised, controlled trial.

    BACKGROUND: Infections with SARS-CoV-2 continue to cause significant morbidity and mortality. Interleukin (IL)-1 and IL-6 blockade have been proposed as therapeutic strategies in COVID-19, but study outcomes have been conflicting. We sought to study whether blockade of the IL-6 or IL-1 pathway shortened the time to clinical improvement in patients with COVID-19, hypoxic respiratory failure, and signs of systemic cytokine release syndrome. METHODS: We did a prospective, multicentre, open-label, randomised, controlled trial in hospitalised patients with COVID-19, hypoxia, and signs of a cytokine release syndrome across 16 hospitals in Belgium. Eligible patients had a proven diagnosis of COVID-19 with a symptom duration between 6 and 16 days, a ratio of the partial pressure of oxygen to the fraction of inspired oxygen (PaO(2):FiO(2)) of less than 350 mm Hg on room air or less than 280 mm Hg on supplemental oxygen, and signs of a cytokine release syndrome in their serum (either a single ferritin measurement of more than 2000 μg/L and immediately requiring high flow oxygen or mechanical ventilation; or a ferritin concentration of more than 1000 μg/L, which had been increasing over the previous 24 h; or lymphopenia below 800/μL with two of the following criteria: an increasing ferritin concentration of more than 700 μg/L, an increasing lactate dehydrogenase concentration of more than 300 international units per L, an increasing C-reactive protein concentration of more than 70 mg/L, or an increasing D-dimer concentration of more than 1000 ng/mL). The COV-AID trial has a 2 × 2 factorial design to evaluate IL-1 blockade versus no IL-1 blockade and IL-6 blockade versus no IL-6 blockade. Patients were randomly assigned by means of permuted block randomisation with varying block size and stratification by centre. In a first randomisation, patients were assigned to receive subcutaneous anakinra once daily (100 mg) for 28 days or until discharge, or to receive no IL-1 blockade (1:2). In a second randomisation step, patients were allocated to receive a single dose of siltuximab (11 mg/kg) intravenously, or a single dose of tocilizumab (8 mg/kg) intravenously, or to receive no IL-6 blockade (1:1:1). The primary outcome was the time to clinical improvement, defined as time from randomisation to an increase of at least two points on a 6-category ordinal scale or to discharge from hospital alive. The primary and supportive efficacy endpoints were assessed in the intention-to-treat population. Safety was assessed in the safety population. This study is registered online with ClinicalTrials.gov (NCT04330638) and EudraCT (2020-001500-41) and is complete. FINDINGS: Between April 4 and Dec 6, 2020, 342 patients were randomly assigned to IL-1 blockade (n=112) or no IL-1 blockade (n=230) and simultaneously randomly assigned to IL-6 blockade (n=227; 114 for tocilizumab and 113 for siltuximab) or no IL-6 blockade (n=115). Most patients were male (265 [77%] of 342), median age was 65 years (IQR 54-73), and median Sequential Organ Failure Assessment (SOFA) score at randomisation was 3 (2-4). All 342 patients were included in the primary intention-to-treat analysis. The estimated median time to clinical improvement was 12 days (95% CI 10-16) in the IL-1 blockade group versus 12 days (10-15) in the no IL-1 blockade group (hazard ratio [HR] 0·94 [95% CI 0·73-1·21]). 
For the IL-6 blockade group, the estimated median time to clinical improvement was 11 days (95% CI 10-16) versus 12 days (11-16) in the no IL-6 blockade group (HR 1·00 [0·78-1·29]). Fifty-five patients died during the study, but no evidence of a difference in mortality between treatment groups was found. The incidence of serious adverse events and serious infections was similar across study groups. INTERPRETATION: Drugs targeting IL-1 or IL-6 did not shorten the time to clinical improvement in this sample of patients with COVID-19, hypoxic respiratory failure, low SOFA score, and low baseline mortality risk. FUNDING: Belgian Health Care Knowledge Center and VIB Grand Challenges program.
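    As a rough illustration of the allocation scheme described in the methods (permuted blocks of varying size, 1:2 for the IL-1 randomisation and 1:1:1 for the IL-6 randomisation), here is a minimal sketch; the block-size choices are simplified assumptions, not the trial's actual implementation.

```python
import random

def permuted_blocks(arms, ratio, block_multiples=(1, 2), seed=None):
    """Yield assignments from permuted blocks of varying size: each block
    holds ratio[i] * m slots for arms[i], with m drawn at random."""
    rng = random.Random(seed)
    while True:
        m = rng.choice(block_multiples)
        block = [arm for arm, r in zip(arms, ratio) for _ in range(r * m)]
        rng.shuffle(block)
        yield from block

# First randomisation: 1:2 to anakinra vs no IL-1 blockade.
il1 = permuted_blocks(["anakinra", "no IL-1 blockade"], [1, 2], seed=1)
# Second randomisation: 1:1:1 across the IL-6 arms.
il6 = permuted_blocks(["siltuximab", "tocilizumab", "no IL-6 blockade"],
                      [1, 1, 1], seed=2)

for patient in range(6):
    print(patient, next(il1), "|", next(il6))
```

    Stratification by centre, as in the trial, would amount to running one such generator per participating hospital.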

    Presumed Urinary Tract Infection in Patients Admitted with COVID-19: Are We Treating Too Much?

    Despite the low rates of bacterial co-/superinfections in COVID-19 patients, antimicrobial drug use has been liberal since the start of the COVID-19 pandemic. Due to the low specificity of markers of bacterial co-/superinfection in the COVID-19 setting, overdiagnosis and antimicrobial overprescription have become widespread. A quantitative and qualitative evaluation of urinary tract infection (UTI) diagnoses and antimicrobial drug prescriptions for UTI diagnoses was performed in patients admitted to the COVID-19 ward of a university hospital between 17 March and 2 November 2020. A team of infectious disease specialists performed an appropriateness evaluation for every diagnosis of UTI and every antimicrobial drug prescription covering a UTI. A driver analysis was performed to identify factors increasing the odds of UTI (over)diagnosis. A total of 622 patients were included. UTI was present in 13% of included admissions, and in 12%, antimicrobials were initiated for a UTI diagnosis (0.71 defined daily doses (DDDs)/admission; 22% were scored as ‘appropriate’). An evaluation of UTI diagnoses by ID specialists revealed that of the 79 UTI diagnoses, 61% were classified as probable overdiagnosis related to the COVID-19 hospitalization. The following factors were associated with UTI overdiagnosis: physicians unfamiliar with working in an internal medicine ward, urinary incontinence, mechanical ventilation, and female sex. Antimicrobial stewardship teams should focus on diagnostic stewardship of UTIs, as UTI overdiagnosis seems to be highly prevalent in admitted COVID-19 patients.
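    The "driver analysis" above is, in essence, a search for factors that raise the odds of overdiagnosis; a logistic regression is one common way to frame this. Below is a minimal sketch on synthetic data, since the abstract does not specify the model; the column names mirror the four reported drivers, and the simulated coefficients are arbitrary placeholders.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical patient-level data: one row per admission with a UTI
# diagnosis; the outcome is whether it was classified as overdiagnosis.
rng = np.random.default_rng(0)
n = 200
X = np.column_stack([
    rng.integers(0, 2, n),  # physician unfamiliar with internal medicine
    rng.integers(0, 2, n),  # urinary incontinence
    rng.integers(0, 2, n),  # mechanical ventilation
    rng.integers(0, 2, n),  # female sex
])
logit = -1.5 + X @ np.array([0.9, 0.7, 0.8, 0.5])  # arbitrary effects
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = sm.Logit(y, sm.add_constant(X)).fit(disp=False)
print(np.exp(model.params[1:]))  # odds ratio per candidate driver
```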

    Beyond Guidelines and Reports on Bacterial Co-/Superinfections in the Context of COVID-19: Why Uniformity Matters

    Background: In the period following the declaration of the COVID-19 pandemic, more evidence became available on the epidemiology of bacterial co-/superinfections (bCSs) in hospitalized COVID-19 patients. Various European therapeutic guidelines were published, including guidance on rational antibiotic use. Methods: In this letter to the editor, we provide an overview of the largest meta-analyses or prospective studies reporting on bCS rates in COVID-19 patients and discuss why the reader should interpret the results of those reports with care. Moreover, we compare different national and international COVID-19 therapeutic guidelines from countries of the European Union. Specific attention is paid to guidance dedicated to rational antibiotic use. Results: We found substantial heterogeneity among studies reporting on the epidemiology of bCSs in COVID-19 patients. Moreover, European national and international guidelines differ markedly from one another, especially with regard to the content and extent of antibiotic guidance for hospitalized COVID-19 patients. Conclusion: A standardized way of reporting on bCSs and uniform European guidelines on rational antibiotic use in COVID-19 patients are crucial for antimicrobial stewardship teams to halt unnecessary antibiotic use in the COVID-19 setting.

    Effects of environmental conditions on growth of Stagonosporopsis cucurbitacearum causing internal fruit rot in cucurbits

    The Cucurbitaceae are a large and diverse family containing several important commodity crops in many parts of the world. In recent years, fruit rot caused by Stagonosporopsis spp. has become a major disease in both field-grown and greenhouse-grown cucurbits. Yield losses due to Stagonosporopsis infection can reach seasonal peaks of up to 30%. Despite its economic importance, information on the growth characteristics of S. cucurbitacearum is limited. A more profound understanding of the influence of individual environmental factors on growth of the fungus is a first step toward the development of sustainable management strategies to prevent outbreaks of this disease. Optimal growth of the pathogen occurred at temperatures of 20 to 25°C and in neutral to acidic (pH 4) environments. Although S. cucurbitacearum is described as an aerobic fungus, it still showed considerable mycelial growth at low oxygen concentrations.

    Intratest reliability and test-retest reproducibility of the oxygen uptake efficiency slope in healthy participants

    Background: The oxygen uptake efficiency slope (OUES) is a relatively new ventilatory exercise parameter, used in the evaluation of healthy participants and patients with cardiovascular disease. However, few data about the reliability and reproducibility of OUES are available. Our study assessed intratest reliability and test-retest reproducibility of OUES in healthy participants. Design and methods: Eighteen participants (age 28 ± 6 years, BMI 22.1 ± 1.9 kg/m², 10 men) performed two identical maximal exercise tests on a bicycle ergometer. To assess test-retest reproducibility, we performed Bland-Altman analysis and calculated the coefficient of repeatability of the main ventilatory variables. Results: OUES remained stable during the second part of the exercise test. Mean values varied 2.4 ± 4.0% between OUES calculated at 70% (OUES70) and at 100% of exercise duration. Mean variation decreased to 1.4 ± 2.3% when OUES was calculated at 90% of exercise duration (OUES90). The Bland-Altman 95% limits of agreement for OUES90 were +3 and -6%; those for OUES70 were +11 and -8%. The coefficient of repeatability for OUES was 597 ml/min, or 18.7% of the average value of repeated OUES measurements. These results were similar to those of peak oxygen uptake and the minute ventilation/carbon dioxide output slope. However, the test-retest reproducibility for submaximal-derived values of OUES was lower, as we noted higher coefficients of repeatability for OUES90 and OUES70, increasing up to 27% of the average of repeated values. Conclusion: OUES shows excellent intratest reliability and has a test-retest reproducibility similar to that of peak oxygen uptake and the minute ventilation/carbon dioxide output slope. However, its reproducibility improves when it is calculated from higher levels of achieved exercise intensity.
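    For readers unfamiliar with the parameter: OUES is the slope a of the regression VO2 = a · log10(VE) + b fitted over the exercise test, so OUES70 and OUES90 simply restrict the fit to the first 70% or 90% of test duration. A minimal sketch on synthetic breath-by-breath data follows; the 1.96 × SD convention for the coefficient of repeatability is one common choice, and all numeric inputs are made up.

```python
import numpy as np

def oues(ve, vo2, fraction=1.0):
    """OUES: slope a of VO2 = a * log10(VE) + b, fitted over the
    first `fraction` of the exercise test (e.g. 0.9 for OUES90)."""
    n = max(2, int(len(ve) * fraction))
    a, _b = np.polyfit(np.log10(ve[:n]), vo2[:n], 1)
    return a

def coefficient_of_repeatability(test, retest):
    """Bland-Altman coefficient of repeatability: 1.96 x the SD of the
    paired test-retest differences (one common convention)."""
    d = np.asarray(test) - np.asarray(retest)
    return 1.96 * d.std(ddof=1)

# Synthetic ramp test: VE in L/min, VO2 in mL/min.
ve = np.linspace(10, 120, 300)
vo2 = 2800 * np.log10(ve) + 150 + np.random.default_rng(0).normal(0, 50, 300)
print(f"OUES100 = {oues(ve, vo2):.0f}, OUES90 = {oues(ve, vo2, 0.9):.0f}")
```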

    Biological control of Fusarium spp. in bell pepper fruit using Gliocladium species

    Despite the rising popularity of high-quality colored bell peppers, market growth is currently threatened by internal fruit rot caused mainly by the fungus Fusarium lactis (FLASC), which causes yield losses of 5% with seasonal peaks up to 50%. Although the disease has emerged as a significant threat to bell pepper production, adequate chemical or biological control measures are lacking. Moreover, Belgian pepper production has an overall low environmental impact with respect to fungicide use. Therefore, the need for new biocontrol agents (BCAs) to tackle internal fruit rot is urgent as bell pepper growers strive to produce low-residue fruit. Hence, more than 100 strains of potential antagonistic fungi were screened for mycelial inhibition of FLASC using an adapted dual-culture in vitro selection assay. The main criteria for BCA selection were at least 20% inhibition of mycelial growth after two days of in vitro growth and sporulation quantities exceeding 10⁷ spores mL⁻¹ after one week of growth on potato dextrose medium. After screening, the best candidates were further evaluated in greenhouse trials during three consecutive years. Together, the two screening steps resulted in the selection of two promising isolates of Gliocladium roseum, which significantly reduced infections over the three years of field trials. Although these BCAs proved to be effective against internal fruit rot in bell pepper, further screenings should be carried out to investigate safety, environmental risks, and ecological characteristics.
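    The 20% inhibition criterion follows the standard dual-culture calculation, inhibition = (C − T) / C × 100, with C the radial growth of the pathogen alone and T its growth in the presence of the antagonist. A minimal sketch of the screening filter is shown below; the thresholds restate those in the abstract, while the measurement inputs are hypothetical.

```python
def percent_inhibition(control_mm, dual_mm):
    """Dual-culture inhibition: (C - T) / C * 100, where C is radial
    mycelial growth of FLASC alone and T its growth facing the BCA."""
    return (control_mm - dual_mm) / control_mm * 100

def passes_screen(control_mm, dual_mm, spores_per_ml):
    """Selection criteria from the abstract: >= 20% inhibition after
    two days and sporulation above 1e7 spores per mL after one week."""
    return percent_inhibition(control_mm, dual_mm) >= 20 and spores_per_ml > 1e7

# Hypothetical candidate: 30% inhibition, 3e7 spores/mL -> selected.
print(passes_screen(control_mm=40.0, dual_mm=28.0, spores_per_ml=3e7))  # True
```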