Lung Injury Prediction Score for the Emergency Department: First Step Towards Prevention in Patients at Risk
Background: Early identification of patients at risk of developing acute lung injury (ALI) is critical for potential preventive strategies. We aimed to derive and validate an emergency department acute lung injury prediction score (EDLIPS) in a multicenter sample of emergency department (ED) patients. Methods: We performed a subgroup analysis of 4,361 ED patients enrolled in a previously reported multicenter observational study. ED risk factors and conditions associated with subsequent ALI development were identified and included in the EDLIPS model. Scores were derived and validated using logistic regression analyses. The model was assessed with the area under the receiver operating characteristic curve (AUC) and compared to the original LIPS model (derived from a population of elective high-risk surgical and ED patients) and the Acute Physiology and Chronic Health Evaluation (APACHE II) score. Results: The incidence of ALI was 7.0% (303/4361). EDLIPS discriminated patients who developed ALI from those who did not with an AUC of 0.78 (95% CI 0.75, 0.82), better than the APACHE II AUC of 0.70 (p ≤ 0.001) and similar to the original LIPS score AUC of 0.80 (p = 0.07). At an EDLIPS cutoff of 5 (range −0.5 to 15), positive and negative likelihood ratios (95% CI) for ALI development were 2.74 (2.43, 3.07) and 0.39 (0.30, 0.49), respectively, with a sensitivity of 0.72 (0.64, 0.78), a specificity of 0.74 (0.72, 0.76), and positive and negative predictive values of 0.18 (0.15, 0.21) and 0.97 (0.96, 0.98). Conclusion: EDLIPS may help identify patients at risk for ALI development early in the course of their ED presentation. This novel model may detect at-risk patients for treatment optimization and identify potential patients for ALI prevention trials.
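The cutoff-level statistics reported above are linked by standard diagnostic-test formulas. As a minimal illustration (not part of the study's analysis), the likelihood ratios and predictive values can be recomputed from the reported sensitivity, specificity, and cohort ALI incidence; small differences from the published figures reflect rounding of the inputs.

```python
# Minimal sketch (not from the study): recomputing the EDLIPS cutoff-level
# metrics from the sensitivity, specificity, and ALI incidence reported above.

sensitivity = 0.72          # reported at an EDLIPS cutoff of 5
specificity = 0.74          # reported at an EDLIPS cutoff of 5
prevalence = 303 / 4361     # ALI incidence in the cohort (7.0%)

lr_positive = sensitivity / (1 - specificity)   # ~2.8 (reported 2.74)
lr_negative = (1 - sensitivity) / specificity   # ~0.38 (reported 0.39)

# Predictive values follow from Bayes' rule at the cohort prevalence.
ppv = (sensitivity * prevalence) / (
    sensitivity * prevalence + (1 - specificity) * (1 - prevalence))
npv = (specificity * (1 - prevalence)) / (
    specificity * (1 - prevalence) + (1 - sensitivity) * prevalence)

print(f"LR+ {lr_positive:.2f}  LR- {lr_negative:.2f}  "
      f"PPV {ppv:.2f}  NPV {npv:.2f}")
```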
Fibrinogen Excretion in the Urine and Immunoreactivity in the Kidney Serves as a Translational Biomarker for Acute Kidney Injury
Fibrinogen (Fg) is significantly up-regulated in the kidney after acute kidney injury (AKI). We evaluated the performance of Fg as a biomarker for early detection of AKI. In rats and mice with kidney tubular damage induced by ischemia/reperfusion (I/R) or cisplatin administration, respectively, kidney tissue and urinary Fg increased significantly, correlated with histopathological injury, urinary kidney injury molecule-1 (KIM-1), and N-acetyl glucosaminidase (NAG), and tracked the progression and regression of injury over time. In a longitudinal follow-up of 31 patients who underwent surgical repair of abdominal aortic aneurysm, urinary Fg increased earlier than serum creatinine (SCr) in patients who developed postoperative AKI (AUC-ROC = 0.72). Furthermore, in a cohort of patients with biopsy-proven AKI (n = 53), Fg immunoreactivity in the tubules and interstitium increased markedly and distinguished patients with AKI from those without AKI (n = 59). These results suggest that immunoreactivity of Fg in the kidney, as well as urinary excretion of Fg, serves as a sensitive and early diagnostic translational biomarker for the detection of AKI.
Towards Prevention of Acute Lung Injury: Frequency and Outcomes of Emergency Department Patients At-Risk: A Multicenter Cohort Study
Background: Few emergency department (ED) evaluations of acute lung injury (ALI) have been carried out; hence, we sought to describe a cohort of hospitalized ED patients at risk for ALI development. Methods: Patients presenting to the ED with at least one predisposing condition for ALI were included in this study, a subgroup analysis of a multicenter observational cohort study (USCIITG-LIPS 1). Patients who met ALI criteria within 6 h of initial ED assessment, received end-of-life care, or were readmitted during the study period were excluded. The primary outcome was the frequency of ALI development; secondary outcomes were ICU and hospital mortality. Results: Twenty-two hospitals enrolled 4,361 patients who were followed from the ED to hospital discharge. ALI developed in 303 (7.0%) patients at a median onset of 2 days (IQR 2–5). Of the predisposing conditions, the frequency of ALI development was highest in patients who had aortic surgery (43%) and lowest in patients with pancreatitis (2.8%). Compared to patients who did not develop ALI, those who did had higher ICU (24% vs. 3.0%, p < 0.001) and hospital (28% vs. 4.6%, p < 0.001) mortality and a longer hospital length of stay (16 vs. 5 days, p < 0.001). Among the 22 study sites, the frequency of ALI development varied from less than 1% to more than 12% after adjustment for APACHE II. Conclusions: Seven percent of hospitalized ED patients with at least one predisposing condition developed ALI. The frequency of ALI development varied significantly according to predisposing conditions and across institutions. Further research is warranted to determine the factors contributing to ALI development.
Detection of Drug-Induced Acute Kidney Injury in Humans Using Urinary KIM-1, miR-21, -200c, and -423
Drug-induced acute kidney injury (AKI) is often encountered in hospitalized patients. Although serum creatinine (SCr) is still routinely used for assessing AKI, it is known to be insensitive and nonspecific. Therefore, our objective was to evaluate kidney injury molecule 1 (KIM-1) in conjunction with microRNA (miR)-21, -200c, and -423 as urinary biomarkers for drug-induced AKI in humans. In a cross-sectional cohort of patients (n = 135) with acetaminophen (APAP) overdose, all 4 biomarkers were significantly (P < .004) higher not only in APAP-overdosed (OD) patients with AKI (based on SCr increase) but also in APAP-OD patients without a clinical diagnosis of AKI, compared with healthy volunteers. In a longitudinal cohort of patients with malignant mesothelioma receiving intraoperative cisplatin (Cp) therapy (n = 108), the 4 biomarkers increased significantly (P < .0014) over time after Cp administration but could not be used to distinguish patients with or without AKI. Evidence that human proximal tubular epithelial cells (HPTECs) are the source of the urinary miRNAs was obtained, first, by in situ hybridization-based confirmation of increased miR-21 expression in kidney sections from AKI patients and, second, by increased levels of miR-21, -200c, and -423 in the medium of cultured HPTECs treated with Cp and 4-aminophenol (an APAP degradation product). Target prediction analysis revealed 1102 mRNA targets of miR-21, -200c, and -423 that are associated with pathways perturbed in diverse pathological kidney conditions. In summary, we report noninvasive detection of AKI in humans by combining the sensitivity of KIM-1 with the mechanistic potential of miR-21, -200c, and -423.
Postoperative pulmonary complications with adjuvant regional anesthesia versus general anesthesia alone: a sub-analysis of the Perioperative Research Network study
Background
Adjuvant regional anesthesia is often selected for patients or procedures at high risk of pulmonary complications after general anesthesia. The benefit of adjuvant regional anesthesia in reducing postoperative pulmonary complications remains uncertain. In a prospective observational multicenter study, patients scheduled for non-cardiothoracic surgery who developed at least one postoperative pulmonary complication had, surprisingly, received adjuvant regional anesthesia more frequently than those with no complications. We hypothesized that, after adjusting for surgical and patient complexity variables, the incidence of postoperative pulmonary complications would not be associated with adjuvant regional anesthesia.
Methods
We performed a secondary analysis of a prospective observational multicenter study including 1202 American Society of Anesthesiologists physical status 3 patients undergoing non-cardiothoracic surgery. Patients were classified as receiving either adjuvant regional anesthesia or general anesthesia alone. Predefined pulmonary complications within the first seven postoperative days were prospectively identified. Groups were compared using bivariable and multivariable hierarchical logistic regression analyses for the outcome of at least one postoperative pulmonary complication.
Results
Adjuvant regional anesthesia was performed in 266 (22.1%) patients and not performed in 936 (77.9%). The incidence of postoperative pulmonary complications was greater in patients receiving adjuvant regional anesthesia (42.1%) than in patients without it (30.9%) (site adjusted p = 0.007), but this association was not confirmed after adjusting for covariates (adjusted OR 1.37; 95% CI, 0.83–2.25; p = 0.165).
Conclusion
After adjusting for surgical and patient complexity, adjuvant regional anesthesia versus general anesthesia alone was not associated with a greater incidence of postoperative pulmonary complications in this multicenter cohort of non-cardiothoracic surgery patients.
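The adjusted analysis described above accounts for clustering of patients within study sites. As a rough sketch of one way such a site-adjusted odds ratio could be estimated, the snippet below fits a logistic GEE with clustering by site rather than the study's hierarchical model; the data file and column names (ppc, regional, site, asa_4, surgery_hours) are hypothetical.

```python
# Minimal sketch, not the study's exact model: estimating a site-adjusted odds
# ratio for postoperative pulmonary complications (PPCs) with a logistic GEE
# that clusters patients by study site. File and column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("ppc_cohort.csv")  # one row per patient (hypothetical file)

model = smf.gee(
    "ppc ~ regional + asa_4 + surgery_hours",   # PPC outcome, exposure, covariates
    groups="site",                              # cluster patients within study site
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
res = model.fit()

odds_ratios = np.exp(res.params)     # adjusted odds ratios
conf_int = np.exp(res.conf_int())    # 95% confidence intervals
print(pd.concat([odds_ratios.rename("OR"), conf_int], axis=1))
```

A GEE with an exchangeable working correlation is one pragmatic alternative to a full random-intercept hierarchical model; either way, the point estimate for the regional-anesthesia term is the adjusted odds ratio of interest.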
Hyperoxemia and excess oxygen use in early acute respiratory distress syndrome: Insights from the LUNG SAFE study
Background: Concerns exist regarding the prevalence and impact of unnecessary oxygen use in patients with acute respiratory distress syndrome (ARDS). We examined this issue in patients with ARDS enrolled in the Large observational study to UNderstand the Global impact of Severe Acute respiratory FailurE (LUNG SAFE) study. Methods: In this secondary analysis of the LUNG SAFE study, we determined the prevalence of, and the outcomes associated with, hyperoxemia on day 1, sustained hyperoxemia, and excessive oxygen use in patients with early ARDS. Patients who fulfilled ARDS criteria on day 1 and day 2 of acute hypoxemic respiratory failure were categorized based on the presence of hyperoxemia (PaO2 > 100 mmHg) on day 1, sustained hyperoxemia (i.e., present on day 1 and day 2), or excessive oxygen use (FIO2 ≥ 0.60 during hyperoxemia). Results: Of the 2005 patients who met the inclusion criteria, 131 (6.5%) were hypoxemic (PaO2 < 55 mmHg), 607 (30%) had hyperoxemia on day 1, and 250 (12%) had sustained hyperoxemia. Excess FIO2 use occurred in 400 (66%) of the 607 patients with hyperoxemia. Excess FIO2 use decreased from day 1 to day 2 of ARDS, with most hyperoxemic patients on day 2 receiving relatively low FIO2. Multivariate analyses found no independent relationship between day 1 hyperoxemia, sustained hyperoxemia, or excess FIO2 use and adverse clinical outcomes. Mortality was 42% in patients with excess FIO2 use, compared to 39% in a propensity-matched sample of normoxemic (PaO2 55-100 mmHg) patients (P = 0.47). Conclusions: Hyperoxemia and excess oxygen use are both prevalent in early ARDS but are most often non-sustained. No relationship was found between hyperoxemia or excessive oxygen use and patient outcome in this cohort. Trial registration: LUNG SAFE is registered with ClinicalTrials.gov, NCT02010073.
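As an illustration only, the exposure definitions quoted above (day 1 hyperoxemia as PaO2 > 100 mmHg, sustained hyperoxemia as hyperoxemia on both day 1 and day 2, and excess oxygen use as FIO2 ≥ 0.60 during hyperoxemia) can be expressed directly as code; the input table and its column names are hypothetical.

```python
# Illustration only: the LUNG SAFE exposure definitions expressed as code,
# assuming a hypothetical per-day table with columns patient_id, day,
# pao2_mmhg, and fio2.
import pandas as pd

gas = pd.read_csv("ards_blood_gases.csv")  # hypothetical file name

day1 = gas[gas["day"] == 1].set_index("patient_id")
day2 = gas[gas["day"] == 2].set_index("patient_id")

hyperox_d1 = day1["pao2_mmhg"] > 100                       # hyperoxemia on day 1
hyperox_d2 = day2["pao2_mmhg"].reindex(day1.index) > 100   # hyperoxemia on day 2

summary = pd.DataFrame({
    "hyperoxemia_day1": hyperox_d1,
    "sustained_hyperoxemia": hyperox_d1 & hyperox_d2,
    # Excess oxygen use: FIO2 >= 0.60 delivered while already hyperoxemic.
    "excess_fio2_day1": hyperox_d1 & (day1["fio2"] >= 0.60),
})
print(summary.mean())  # proportion of patients in each category
```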
Liver in sepsis and systemic inflammatory response syndrome
In patients with sepsis and systemic inflammatory response syndrome (SIRS), the liver has two opposing roles: it is a source of inflammatory mediators and a target organ for their effects. The liver is pivotal in modulating the systemic response to severe infection because it contains the largest mass of macrophages (Kupffer cells) in the body; these macrophages can clear the endotoxin and bacteria that initiate the systemic inflammatory response. This article summarizes the functional changes that take place in the liver during sepsis and SIRS and discusses the cellular and molecular mechanisms that underlie clinical outcomes.
Heparin requirements for full anticoagulation are higher for patients on dabigatran than for those on warfarin – a model-based study
Purpose: Dabigatran (D) is increasingly used for chronic anticoagulation in place of warfarin (W). These patients may present for catheter-based procedures requiring full anticoagulation with heparin. This study compares the heparin sensitivity of patients previously on dabigatran, on warfarin, or on no chronic anticoagulant during ablation of atrial fibrillation. Patients and methods: In a retrospective study of patients treated with D, W, or neither drug (N) undergoing atrial ablation, the timing of heparin doses and the resulting activated clotting times (ACTs) were collected. First, the initial ACT response to the first heparin bolus was compared. Then, a non-linear mixed-effects modelling (NONMEM) analysis was performed, fitting a pharmacokinetic/pharmacodynamic model to the entire anticoagulation course of each patient. The resulting model coefficients were used to compare the patient groups. Results: Data for 66 patients on dabigatran, 95 patients on warfarin, and 27 patients on no anticoagulation were retrieved. The last dose of dabigatran or warfarin had occurred 27 and 15 hours before the procedure, respectively. Groups D and N both responded significantly less (P < 0.05) to the initial heparin bolus than Group W, by approximately 50%. Likewise, the model coefficients resulting from the fit to each group reflected a significantly lower heparin sensitivity in groups D and N compared with W. Clearances of the heparin effect in the model did not differ significantly among groups. Conclusion: Patients on warfarin with an average INR of 1.5 or higher are more sensitive to heparin than patients not previously anticoagulated or patients who discontinued dabigatran 27 hours (approximately two half-lives) earlier.
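A full NONMEM population analysis is beyond the scope of this summary, but the idea of fitting a pharmacokinetic/pharmacodynamic model of the heparin effect on the activated clotting time (ACT) can be sketched with a much simpler single-patient model: each bolus contributes an effect that decays mono-exponentially, and the observed ACT is the baseline plus the summed effect. This is an assumption-laden illustration, not the published model; all dose times, doses, and ACT values below are made up.

```python
# Simplified, single-patient illustration (not the published NONMEM model):
# each heparin bolus adds an effect that decays mono-exponentially, and the
# observed activated clotting time (ACT) is baseline plus the summed effect.
import numpy as np
from scipy.optimize import curve_fit

dose_times = np.array([5.0, 40.0, 75.0])       # minutes after first ACT draw (hypothetical)
doses = np.array([10000.0, 3000.0, 3000.0])    # heparin units per bolus (hypothetical)

def act_model(t, baseline, sensitivity, ke):
    """ACT(t) = baseline + sensitivity * sum of exponentially decayed boluses."""
    t = np.atleast_1d(t).astype(float)
    effect = np.zeros_like(t)
    for t_dose, dose in zip(dose_times, doses):
        decayed = dose * np.exp(-ke * (t - t_dose))
        effect += np.where(t >= t_dose, decayed, 0.0)  # no effect before the bolus
    return baseline + sensitivity * effect

# Observed ACT values (seconds) at the sampling times (hypothetical numbers).
t_obs = np.array([0.0, 15.0, 35.0, 50.0, 70.0, 95.0])
act_obs = np.array([130.0, 295.0, 240.0, 260.0, 220.0, 225.0])

params, _ = curve_fit(
    act_model, t_obs, act_obs,
    p0=[130.0, 0.02, 0.02],                        # baseline, sens (s/unit), ke (1/min)
    bounds=([50.0, 0.0, 0.001], [250.0, 1.0, 1.0]),
)
baseline, sensitivity, ke = params
print(f"baseline {baseline:.0f} s, sensitivity {sensitivity:.4f} s/unit, "
      f"effect half-life {np.log(2) / ke:.0f} min")
```

Comparing the fitted sensitivity parameter across the dabigatran, warfarin, and control groups is the kind of contrast the study's mixed-effects analysis formalizes at the population level.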