An International Prospective Cohort Study To Validate 2 Prediction Rules for Infections Caused by Third-generation Cephalosporin-resistant Enterobacterales
Background
The possibility of bloodstream infections caused by third-generation cephalosporin-resistant Enterobacterales (3GC-R-BSI) leads to a trade-off between empiric inappropriate treatment (IAT) and unnecessary carbapenem use (UCU). Accurately predicting 3GC-R-BSI could reduce both IAT and UCU. We externally validated two previously derived prediction rules, one for community-onset (CO) and one for hospital-onset (HO) suspected bloodstream infections.
Methods
In 33 hospitals in 13 countries, we prospectively enrolled 200 patients per hospital in whom blood cultures were obtained and intravenous antibiotics with coverage for Enterobacterales were empirically started. Cases were defined as 3GC-R-BSI (primary analysis) or any 3GC-R gram-negative infection (3GC-R-GNI; analysis 2); all other outcomes served as comparators. Model discrimination and calibration were assessed, and the impact on carbapenem use was assessed at several cutoff points.
Results
4650 CO infection episodes were included; the prevalence of 3GC-R-BSI was 2.1% (n = 97). IAT occurred in 69 of 97 (71.1%) 3GC-R-BSI patients and UCU in 398 of 4553 (8.7%) non-3GC-R-BSI patients. Model calibration was good, and the AUC was 0.79 (95% CI, 0.75-0.83) for 3GC-R-BSI. The prediction rule could potentially reduce IAT to 62% (60/97) while keeping UCU comparable at 8.4%, or reduce UCU to 6.3% (287/4553) while keeping IAT unchanged. IAT and UCU for all 3GC-R-GNIs (analysis 2) improved by similar margins. 1683 HO infection episodes were included; the prevalence of 3GC-R-BSI was 4.9% (n = 83). Here, model calibration was insufficient.
Conclusions
A prediction rule for CO 3GC-R infection was validated in an international cohort and could improve empirical antibiotic use. Validation of the HO rule yielded suboptimal performance.
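The validation steps above (discrimination via AUC, then IAT/UCU trade-offs at candidate cutoffs) can be sketched with scikit-learn. The labels and risk scores below are synthetic stand-ins, not study data, and the helper function is illustrative.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Synthetic stand-ins for a validation cohort: 1 = 3GC-R-BSI, 0 = other outcome.
y_true = np.array([0, 0, 1, 1])
risk_score = np.array([0.1, 0.4, 0.35, 0.8])  # output of a prediction rule

# Discrimination: area under the ROC curve.
auc = roc_auc_score(y_true, risk_score)
print(f"AUC = {auc:.2f}")  # -> AUC = 0.75

# Impact at a cutoff: patients at or above the threshold would get a carbapenem.
def treatment_tradeoff(y, scores, cutoff):
    carbapenem = scores >= cutoff
    iat = np.mean(~carbapenem[y == 1])  # resistant cases not covered empirically
    ucu = np.mean(carbapenem[y == 0])   # unnecessary carbapenem use
    return iat, ucu

for cutoff in (0.2, 0.5):
    iat, ucu = treatment_tradeoff(y_true, risk_score, cutoff)
    print(f"cutoff {cutoff}: IAT {iat:.0%}, UCU {ucu:.0%}")
```

Sweeping the cutoff traces the same trade-off the abstract reports: a lower threshold reduces IAT at the cost of more carbapenem use, and vice versa.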
Clustering COVID-19 ARDS patients through the first days of ICU admission. An analysis of the CIBERESUCICOVID Cohort
Background
Acute respiratory distress syndrome (ARDS) can be classified into sub-phenotypes according to inflammatory/clinical status. Prognostic enrichment has been achieved by grouping patients into hypoinflammatory or hyperinflammatory sub-phenotypes, although the time of analysis may change the classification according to treatment response or disease evolution. We aimed to evaluate whether patients cluster into more than one group, how cluster assignments change between baseline and day 3, and the prognosis of patients according to whether or not they change cluster.
Methods
Multicenter, observational prospective and retrospective study of patients admitted for ARDS related to COVID-19 infection in Spain. Patients were grouped with a mixed-type data clustering algorithm (k-prototypes) using readily available continuous and categorical variables at baseline and day 3.
Results
Of 6205 patients, 3743 (60%) were included in the study. According to silhouette analysis, patients were grouped into two clusters. At baseline, 1402 (37%) patients were in cluster 1 and 2341 (63%) in cluster 2. On day 3, 1557 (42%) patients were in cluster 1 and 2086 (57%) in cluster 2. Patients in cluster 2 were older, more frequently hypertensive, and had a higher prevalence of shock, organ dysfunction, and elevated inflammatory biomarkers, and worse respiratory indexes at both time points. Ninety-day mortality was higher in cluster 2 at both time points (43.8% [n = 1025] vs 27.3% [n = 383] at baseline; 49% [n = 1023] vs 20.6% [n = 321] on day 3). Four hundred fifty-eight (33%) patients in cluster 1 at baseline had moved to cluster 2 by day 3; conversely, 638 (27%) patients in cluster 2 at baseline had moved to cluster 1.
Conclusions
During the first days of ICU admission, patients can be clustered into two groups, and cluster membership may change as patients evolve. Although the vast majority of patients remained in the same cluster, a minority of up to 33% were re-categorized on day 3, and such changes can significantly impact prognosis.
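The silhouette step described above, choosing the number of clusters by the best average silhouette score, can be sketched as follows. The study used k-prototypes to handle mixed continuous/categorical data (e.g. the KPrototypes class of the kmodes package); to keep this sketch dependency-light it uses plain KMeans on synthetic numeric data, so all values here are made up.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score

# Synthetic numeric features standing in for baseline ICU variables.
X, _ = make_blobs(n_samples=300, centers=[[0, 0], [8, 8]],
                  cluster_std=1.0, random_state=0)

# Silhouette analysis: fit for several candidate k and keep the best score.
scores = {}
for k in range(2, 6):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    scores[k] = silhouette_score(X, labels)

best_k = max(scores, key=scores.get)
print(best_k)  # the k with the highest mean silhouette width
```

Repeating the same fit on day-3 variables and cross-tabulating the two label vectors is what reveals the patient movement between clusters reported in the results.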
Agreement Between Different Methodologies for Non-Invasive p.T790M and EGFR Sensitizing Mutation Testing
Background. Tyrosine kinase inhibitors (TKIs) are the current standard of care for patients with advanced EGFR-mutant non-small cell lung cancer (NSCLC). However, most patients progress within 1 to 2 years. The EGFR p.T790M mutation is the most common resistance mechanism to first- and second-generation EGFR TKIs. Identifying the p.T790M mutation is of considerable clinical relevance, as osimertinib has demonstrated clinical efficacy in this setting. Guidelines recommend testing for the p.T790M mutation in blood at relapse on TKIs, with re-biopsy only in case of a negative result. Several blood-based methodologies for detecting EGFR mutations have been developed in recent years; however, comparison studies between platforms remain very limited.
Methods. This is a multicenter, cross-sectional study (ClinicalTrials.gov identifier: NCT03363139) performed by the Spanish Lung Cancer Group. Samples from 75 consecutive EGFR-mutant NSCLC patients were collected at disease progression on first-line TKI treatment. The presence of EGFR mutations in cfDNA was evaluated in 39 samples by 7 methodologies: Cobas® EGFR Mutation Test v2 (Roche Diagnostics), Therascreen EGFR Plasma RGQ PCR Kit (Qiagen), QuantStudio® 3D Digital PCR System (Thermo Fisher), a 5′-nuclease real-time PCR (TaqMan®) assay in the presence of PNA, OncoBEAM EGFR (Sysmex Inostics), and NGS with two different gene panels: Oncomine® (Thermo Fisher) and Lung Cancer Panel (Qiagen). Agreement between methodologies was assessed using the kappa coefficient (K) with corresponding 95% confidence intervals (95% CI); for quantitative variables, the concordance correlation coefficient (ccc) was used.
Results. Complete results are available for 39 patients. Overall, agreement between all methodologies was good, both for detection of the p.T790M mutation (K = 0.669; 95% CI, 0.504-0.835) and for the original EGFR sensitizing mutation (K = 0.750; 95% CI, 0.599-0.899). Remarkably, agreement between FDA-approved methodologies was almost perfect for p.T790M detection (K = 0.926; 95% CI, 0.712-1) and good for the EGFR sensitizing mutations (K = 0.657; 95% CI, 0.417-0.902). Similarly, agreement between NGS-based methodologies was very high for detection of p.T790M (K = 0.843; 95% CI, 0.567-1) and for the EGFR activating mutations (K = 0.872; 95% CI, 0.595-1). Moreover, concordance between the two NGS technologies for p.T790M and EGFR sensitizing mutation mutant allele frequency was excellent (ccc = 0.956; 95% CI, 0.906-1 and ccc = 0.980; 95% CI, 0.950-1, respectively). The proportion of samples positive for p.T790M varied from 28% (PCR-based technologies) to 37%, depending on the methodology.
Conclusions. NGS- and PCR-based methodologies show good to excellent agreement for the detection of EGFR mutations, including p.T790M. Our results support the use of liquid biopsies for non-invasive testing of clinically relevant mutations. (Data from the whole cohort will be presented at the meeting.)
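The two agreement statistics used above can be sketched briefly: Cohen's kappa for the binary mutation calls and Lin's concordance correlation coefficient (ccc) for quantitative mutant allele frequencies. The assay values below are invented for illustration, not study data.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Hypothetical mutation calls (1 = p.T790M detected) from two assays
# on the same plasma samples -- illustrative values only.
assay_a = np.array([1, 1, 0, 0, 1, 1, 0, 0])
assay_b = np.array([1, 1, 0, 0, 1, 0, 0, 1])

# Kappa corrects observed agreement (6/8 here) for chance agreement.
kappa = cohen_kappa_score(assay_a, assay_b)
print(f"kappa = {kappa:.2f}")  # -> kappa = 0.50

# Lin's concordance correlation coefficient for quantitative agreement,
# e.g. mutant allele frequencies reported by two platforms.
def lins_ccc(x, y):
    mx, my = x.mean(), y.mean()
    cov = ((x - mx) * (y - my)).mean()  # population covariance
    return 2 * cov / (x.var() + y.var() + (mx - my) ** 2)

maf_a = np.array([0.02, 0.10, 0.31, 0.05, 0.18])
maf_b = np.array([0.03, 0.09, 0.29, 0.06, 0.20])
print(f"ccc = {lins_ccc(maf_a, maf_b):.3f}")
```

Unlike the Pearson correlation, the ccc penalizes both location and scale shifts between platforms, which is why it is preferred for method-comparison studies like this one.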
Evaluation of Nutritional Practices in the Critical Care patient (The ENPIC study): Does nutrition really affect ICU mortality?
The importance of artificial nutritional therapy is underrecognized; it is typically considered an adjunctive rather than a primary therapy. We aimed to evaluate the influence of nutritional therapy on mortality in critically ill patients. Methods: This multicenter prospective observational study included adult patients needing artificial nutritional therapy for >48 h who stayed in one of 38 participating intensive care units for ≥72 h between April and July 2018. Demographic data, comorbidities, diagnoses, nutritional status and therapy (type and details for ≤14 days), and outcomes were registered in a database. Confounders such as disease severity, patient type (e.g., medical, surgical, or trauma), and type and duration of nutritional therapy were included in a multivariate analysis, and hazard ratios (HRs) with 95% confidence intervals (95% CIs) were reported. Results: We included 639 patients, of whom 448 (70.1%) received enteral nutrition and 191 (29.9%) parenteral nutrition. Mortality was 25.6%; compared with survivors, non-survivors were older, had more comorbidities, higher Sequential Organ Failure Assessment (SOFA) scores (6.6 ± 3.3 vs 8.4 ± 3.7; P < 0.001), greater nutritional risk (Nutrition Risk in the Critically Ill [NUTRIC] score: 3.8 ± 2.1 vs 5.2 ± 1.7; P < 0.001), more vasopressor requirements (70.4% vs 83.5%; P = 0.001), and more renal replacement therapy (12.2% vs 23.2%; P = 0.001). Multivariate analysis showed that older age (HR 1.023; 95% CI 1.008-1.038; P = 0.003), higher SOFA score (HR 1.096; 95% CI 1.036-1.160; P = 0.001), higher NUTRIC score (HR 1.136; 95% CI 1.025-1.259; P = 0.015), requiring parenteral nutrition after starting enteral nutrition (HR 2.368; 95% CI 1.168-4.798; P = 0.017), and higher mean kcal/kg/day intake (HR 1.057; 95% CI 1.015-1.101; P = 0.008) were associated with mortality. By contrast, higher mean protein intake protected against mortality (HR 0.507; 95% CI 0.263-0.977; P = 0.042).
Conclusions: Older age, higher organ failure scores, and greater nutritional risk appear to be associated with higher mortality. Patients who need parenteral nutrition after starting enteral nutrition may represent a subgroup at high risk of death owing to illness severity and difficulties receiving appropriate nutritional therapy. Mean calorie and protein delivery also appeared to influence outcomes. ClinicalTrials.gov: NCT03634943.
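The hazard ratios above come from a Cox proportional-hazards model. A minimal single-covariate fit can be written directly from the partial likelihood; the toy follow-up data below (no tied event times) are invented for illustration, not ENPIC data.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Toy survival data: follow-up time in days, event indicator, and one
# binary covariate (e.g. a high nutritional-risk group) -- all invented.
time  = np.array([2.0, 4.0, 6.0, 8.0, 1.0, 3.0, 5.0, 7.0])
event = np.ones(8, dtype=bool)  # all deaths observed
x     = np.array([0, 0, 0, 0, 1, 1, 1, 1], dtype=float)

def neg_log_partial_likelihood(beta):
    """Cox negative log partial likelihood (no ties, single covariate)."""
    nll = 0.0
    for i in np.where(event)[0]:
        at_risk = time >= time[i]  # risk set at this event time
        nll -= beta * x[i] - np.log(np.sum(np.exp(beta * x[at_risk])))
    return nll

beta_hat = minimize_scalar(neg_log_partial_likelihood,
                           bounds=(-5, 5), method="bounded").x
hazard_ratio = np.exp(beta_hat)
print(f"HR = {hazard_ratio:.2f}")  # > 1: covariate associated with earlier death
```

In practice a multivariable fit with tie handling and confidence intervals would be done with a survival library (e.g. lifelines' CoxPHFitter or statsmodels' PHReg) rather than by hand; this sketch only shows where the reported HRs come from.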
Effects of hospital facilities on patient outcomes after cancer surgery: an international, prospective, observational study
© 2022 The Author(s). Published by Elsevier Ltd. This is an Open Access article under the CC BY 4.0 license.
Background: Early death after cancer surgery is higher in low-income and middle-income countries (LMICs) than in high-income countries, yet the impact of facility characteristics on early postoperative outcomes is unknown. The aim of this study was to examine the association of hospital infrastructure, resource availability, and care processes with early outcomes after cancer surgery worldwide. Methods: A multimethods analysis was performed as part of the GlobalSurg 3 study, a multicentre, international, prospective cohort study of patients who had surgery for breast, colorectal, or gastric cancer. The primary outcomes were 30-day mortality and 30-day major complication rates. Potentially beneficial hospital facilities were identified by variable selection based on association with 30-day mortality. Adjusted outcomes were determined using generalised estimating equations to account for patient characteristics and country income group, with population stratification by hospital. Findings: Between April 1, 2018, and April 23, 2019, facility-level data were collected for 9685 patients across 238 hospitals in 66 countries (91 hospitals in 20 high-income countries; 57 hospitals in 19 upper-middle-income countries; and 90 hospitals in 27 low-income to lower-middle-income countries). The availability of five hospital facilities was inversely associated with mortality: ultrasound, CT scanner, critical care unit, opioid analgesia, and oncologist.
After adjustment for case-mix and country income group, hospitals with three or fewer of these facilities (62 hospitals, 1294 patients) had higher mortality than those with four or five (adjusted odds ratio [OR] 3.85 [95% CI 2.58-5.75]; p<0.0001), with excess mortality predominantly explained by a limited capacity to rescue following the development of major complications (63.0% vs 82.7%; OR 0.35 [0.23-0.53]; p<0.0001). Across LMICs, improvements in hospital facilities would prevent one to three deaths for every 100 patients undergoing surgery for cancer. Interpretation: Hospitals with higher levels of infrastructure and resources have better outcomes after cancer surgery, independent of country income. Without urgent strengthening of hospital infrastructure and resources, the reductions in cancer-associated mortality associated with improved access will not be realised. Funding: National Institute for Health and Care Research
Global variation in postoperative mortality and complications after cancer surgery: a multicentre, prospective cohort study in 82 countries
© 2021 The Author(s). Published by Elsevier Ltd. This is an Open Access article under the CC BY-NC-ND 4.0 license.
Background: 80% of individuals with cancer will require a surgical procedure, yet little comparative data exist on early outcomes in low-income and middle-income countries (LMICs). We compared postoperative outcomes of breast, colorectal, and gastric cancer surgery in hospitals worldwide, focusing on the effect of disease stage and complications on postoperative mortality. Methods: This was a multicentre, international prospective cohort study of consecutive adult patients undergoing surgery for primary breast, colorectal, or gastric cancer requiring a skin incision done under general or neuraxial anaesthesia. The primary outcome was death or major complication within 30 days of surgery. Multilevel logistic regression determined relationships within three-level nested models of patients within hospitals and countries. Hospital-level infrastructure effects were explored with three-way mediation analyses. This study was registered with ClinicalTrials.gov, NCT03471494. Findings: Between April 1, 2018, and Jan 31, 2019, we enrolled 15 958 patients from 428 hospitals in 82 countries (high income: 9106 patients, 31 countries; upper-middle income: 2721 patients, 23 countries; low and lower-middle income: 4131 patients, 28 countries). Patients in LMICs presented with more advanced disease than patients in high-income countries. 30-day mortality was higher for gastric cancer in low-income or lower-middle-income countries (adjusted odds ratio 3.72, 95% CI 1.70-8.16) and for colorectal cancer in low-income or lower-middle-income countries (4.59, 2.39-8.80) and upper-middle-income countries (2.06, 1.11-3.83). No difference in 30-day mortality was seen for breast cancer. The proportion of patients who died after a major complication was greatest in low-income or lower-middle-income countries (6.15, 3.26-11.59) and upper-middle-income countries (3.89, 2.08-7.29).
Postoperative death after complications was partly explained by patient factors (60%) and partly by hospital or country factors (40%). The absence of consistently available postoperative care facilities was associated with seven to ten more deaths per 100 major complications in LMICs. Cancer stage alone explained little of the early variation in mortality or postoperative complications. Interpretation: Higher mortality after cancer surgery in LMICs was not fully explained by later presentation of disease. The capacity to rescue patients from surgical complications is a tangible opportunity for meaningful intervention. Early death after cancer surgery might be reduced by policies focused on strengthening perioperative care systems to detect and intervene in common complications. Funding: National Institute for Health Research Global Health Research Unit