    Research Staff COVID-19 Pandemic Survey-Results from the Prevention and Early Treatment of Acute Lung Injury (PETAL) Network

    Objectives: Little is known about the challenges faced by researchers who continued in-person research during the early phases of the COVID-19 pandemic. Design: Electronic survey assessing work-related exposure to COVID-19, logistical challenges, and procedural changes affecting clinical research during the first year of the pandemic. Setting: National Heart, Lung, and Blood Institute-sponsored Prevention and Early Treatment of Acute Lung Injury (PETAL) Clinical Trial Network centers. Subjects: Research staff at network sites. Measurements and Main Results: The 37-question survey was completed by 277 individuals from 24 states between 29 September 2020 and 12 December 2020, yielding a response rate of 37.7%. Most respondents (91.5%) indicated that non-COVID-19 research was affected by COVID-19 research studies. In response to the pandemic, 20% of respondents were reassigned to different roles at their institutions. Many respondents were exposed to COVID-19 (56%); more than 50% required a COVID-19 test, and 8% tested positive. Fear of infection was 2.7 times higher than before the pandemic. Shortages of personal protective equipment were encountered by 34% of respondents, primarily because of limited access to N95 masks, followed by gowns and protective eyewear. Reallocation of personal protective equipment from research to clinical use was reported by 31% of respondents. Despite these logistical challenges, most respondents (88.5%) indicated their willingness to enroll COVID-19 patients. Conclusions: During the first year of the COVID-19 pandemic, members of the research network were engaged in COVID-19 research despite logistical challenges, limited access to personal protective equipment, and fear of exposure. The network's survey experience can inform ongoing policy discussions on creating research enterprises that can dexterously refocus research to address the knowledge gaps associated with novel public health emergencies while mitigating the effect of pandemics on existing research projects and personnel.

    Revisiting the impact of endotoxin clearance on survival in the EUPHRATES trial

    Introduction: Sepsis is a heterogeneous disease. In the EUPHRATES trial, septic shock patients with elevated baseline endotoxin activity assay (EAA) levels greater than 0.6 units were randomized to receive polymyxin B hemoperfusion (PMXBHP) or a sham treatment. PMXBHP targets the reduction of circulating endotoxin, thought to be a continued trigger of the dysregulated host response. PMXBHP may aid the reduction of EAA levels, and treatment responders may have improved survival rates. Methods: This is a post hoc analysis of the EUPHRATES trial focusing on dynamic changes in EAA levels examined at four time points (baseline, day 1, day 2, and day 3). To evaluate changes between time points, the endotoxin clearance (EAAC) was calculated as EAAC = (EAA_Time1 − EAA_Time2) / EAA_Time1. Values < 0% indicate increasing EAA values; values > 0% indicate a reduction in measured EAA values. The chi-square test was used to compare mortality between patients with increasing versus decreasing EAA values. P values < 0.05 were considered significant. Results: The primary end-point analysis of EUPHRATES did not show a mortality benefit with PMXBHP for all patients enrolled or for patients with a Multiple Organ Dysfunction Syndrome (MODS) score > 9. Using the EAAC assessment approach, dynamic EAA changes can be observed. A positive EAAC from day 2 to day 3 indicates a response to PMXBHP and is associated with improved survival: there is a significant 12.1% difference in mortality between PMXBHP responders and PMXBHP non-responders (Table 1). Increasing EAA (negative EAAC) was associated with higher mortality in both the PMXBHP and the sham-treated patients. PMXBHP-treated patients with a positive EAAC from day 2 to day 3 had a 9.3% lower 28-day mortality than sham patients with a positive EAAC; this difference was not statistically significant (chi-square 2.64, p = 0.10). Conclusions: All EUPHRATES trial patients, regardless of MODS score, who responded to PMXBHP with a reduction in EAA levels from day 2 to day 3 had a mortality reduction of 12.1% compared with PMXBHP-treated non-responders. The observed mortality difference between the PMXBHP and sham groups with positive EAAC did not reach statistical significance, which may reflect the small subgroup size. Future research should focus on optimizing EAA dynamics and EAA clearance rates to aid resolution of the dysregulated host response.
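
    A minimal sketch of the clearance calculation described above (Python; the function name and example values are illustrative, not drawn from the trial dataset):

    ```python
    def endotoxin_clearance(eaa_t1: float, eaa_t2: float) -> float:
        """Fractional endotoxin clearance (EAAC) between two time points.

        Positive values mean EAA fell (treatment response); negative values
        mean EAA rose between the two measurements.
        """
        return (eaa_t1 - eaa_t2) / eaa_t1

    # Illustrative example: EAA falls from 0.72 on day 2 to 0.55 on day 3
    eaac = endotoxin_clearance(0.72, 0.55)
    print(f"EAAC = {eaac:.1%}, responder = {eaac > 0}")  # EAAC = 23.6%, responder = True
    ```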

    Oxygen extraction and perfusion markers in severe sepsis and septic shock: diagnostic, therapeutic and outcome implications

    PURPOSE OF REVIEW: The purpose of this review is to examine the recent literature on the clinical utility of markers of systemic oxygen extraction and perfusion in the diagnosis, treatment and prognosis of severe sepsis and septic shock. RECENT FINDINGS: When sepsis is accompanied by conditions in which systemic oxygen delivery does not meet tissue oxygen demands, tissue hypoperfusion begins. Tissue hypoperfusion leads to oxygen debt, cellular injury, organ dysfunction and death. Tissue hypoperfusion can be characterized using markers of tissue perfusion (central venous oxygen saturation and lactate), which reflect the interaction between systemic oxygen delivery and demands. Over the last two decades, studies and quality initiatives incorporating the early detection and interruption of tissue hypoperfusion have been shown to reduce mortality and have altered sepsis care. Three recent trials, while confirming an all-time low in sepsis mortality, have challenged the concept that rapid normalization of markers of perfusion confers an outcome benefit. By defining and comparing haemodynamic phenotypes using markers of tissue perfusion, we may better understand which patients are most likely to benefit from early goal-directed haemodynamic optimization. SUMMARY: The phenotypic haemodynamic characterization of patients using perfusion markers has diagnostic, therapeutic and outcome implications in severe sepsis and septic shock. However, irrespective of haemodynamic phenotype, outcome reflects the quality of care provided at the point of presentation. Applying these principles may allow more objective interpretation of resuscitation trials and translation of their findings into current practice.

    Emergency Department Triage Blood Glucose Levels: Outcomes Implications in Patients with Severe Sepsis and Septic Shock

    Background: Patients with severe sepsis and septic shock often present with a variety of organ dysfunctions, including metabolic derangements. The appropriate metabolic stress response in sepsis includes the release of glucose, leading to stress hyperglycemia, which is commonly seen in these Emergency Department (ED) patients. Many studies focus on metabolic glucose abnormalities and their effect on outcomes at the time of Intensive Care Unit (ICU) admission. Hyperglycemia present on ICU admission has been associated with adverse outcomes irrespective of the presence or absence of diabetes mellitus. Methods: We analyzed our ED quality sepsis database with respect to triage glucose levels and associated 30-day mortality from August 2015 to October 2016 to determine adjustments in active glucose monitoring in the ED. Results: We identified 683 patients with severe sepsis (N=399) and septic shock (N=284). The average glucose level at the first ED laboratory evaluation was 172 mg/dL (SD=149). Patients with septic shock had, on average, lower glucose levels (170 mg/dL) than patients with severe sepsis (174 mg/dL). Sepsis survivors had higher triage glucose levels (176 mg/dL, N=525) than non-survivors (159 mg/dL, N=157). When stratifying patients by glucose level, we found that patients with glucose levels below 70 mg/dL at ED triage had the highest mortality. The incidence of glucose ≤ 70 mg/dL was 7% (N=49) for all patients with severe sepsis and septic shock combined. The mortality in this group was 44% (21/49), significantly (p=0.001) higher than in patients with higher glucose levels (136/634, 21%). In patients with glucose levels ≥ 180 mg/dL, mortality was not different (35/177, 26%, p=0.9) from that of patients with glucose levels of 70-180 mg/dL. Conclusion: Glucose monitoring for patients with sepsis in the ED aids recognition of correctable metabolic derangements early in management. In the ED, the metabolic stress response to sepsis is commonly stress hyperglycemia, but hypoglycemia can also occur in the early phases of sepsis. Hypoglycemia at ED triage carries a higher than expected mortality and needs to be recognized and treated accordingly.
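
    A minimal sketch of the stratified mortality comparison reported above (Python, with scipy assumed available; the counts are taken directly from the abstract):

    ```python
    from scipy.stats import chi2_contingency

    # 2x2 table: rows = triage glucose stratum, columns = [died, survived]
    observed = [
        [21, 49 - 21],     # glucose <= 70 mg/dL: 21 of 49 died
        [136, 634 - 136],  # glucose > 70 mg/dL: 136 of 634 died
    ]
    chi2, p, dof, expected = chi2_contingency(observed)
    print(f"chi-square = {chi2:.2f}, p = {p:.4f}")
    ```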

    Sepsis-3: The Dysregulated Host Response and Cytokine Changes

    RATIONALE: New Sepsis-3 definitions have been published, stating that sepsis is a “life-threatening organ dysfunction caused by a dysregulated host response to infection”. The Sepsis-3 authors noted that “limitations of previous definitions included an excessive focus on inflammation and inadequate specificity and sensitivity of the systemic inflammatory response syndrome (SIRS) criteria.” With the new definitions, it was proposed that identification of septic patients with a dysregulated host response is aided by the quick SOFA (qSOFA) score rather than the SIRS criteria. The qSOFA score is considered abnormal when at least 2 of 3 criteria are met: systolic blood pressure ≤ 100 mmHg, respiratory rate ≥ 22 breaths/min, and altered mentation. We re-examined an established database of patients with vasopressor-dependent septic shock, comparing SIRS criteria to qSOFA criteria with respect to measured cytokine markers as indicators of a dysregulated host response. If qSOFA indicates an abnormal host response, prominent cytokine markers such as IL-1RA, IL-1α, IL-6, IL-8, IL-10 and TNF-α should show trends similar to those seen with the SIRS criteria. METHODS: Re-examination of an established, IRB-approved research database of patients with vasopressor-dependent septic shock with measured multiplex cytokine markers (Milliplex HCYTMAG-60K-PX29) as indicators of a dysregulated host response to infection, compared against SIRS criteria and the newly established qSOFA score. RESULTS: Data from 174 patients were re-examined. At the time of enrollment, within 24 hours of shock onset, the average qSOFA score was 1.47 and the average number of SIRS criteria was 2.06. Table 1 shows measured cytokine values by number of SIRS criteria or qSOFA score; average values are shown in ng/mL. The average values of measured cytokine markers increased with an increasing number of SIRS criteria for IL-1RA, IL-1α, IL-6, IL-8, IL-10 and TNF-α. A similar cytokine increase in relation to the qSOFA score was seen only for IL-8, IL-10 and TNF-α. CONCLUSION: In patients with vasopressor-dependent septic shock, increasing SIRS criteria are associated with increases in measured circulating cytokine markers of inflammation. Circulating cytokine markers, as indicators of an abnormal host response to infection, do not increase with increasing qSOFA scores the way they do with SIRS criteria and therefore may not capture all aspects of this dysregulated host response.
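
    A minimal sketch of the two bedside scores being compared (Python; qSOFA thresholds per Sepsis-3, SIRS simplified to its four classic criteria, and all parameter names are illustrative):

    ```python
    def qsofa(sbp_mmhg: float, rr_per_min: float, altered_mentation: bool) -> int:
        """Quick SOFA: one point per criterion; a score >= 2 is considered abnormal."""
        return sum([sbp_mmhg <= 100, rr_per_min >= 22, altered_mentation])

    def sirs(temp_c: float, hr_per_min: float, rr_per_min: float, wbc_k_per_ul: float) -> int:
        """Simplified SIRS count: one point per criterion; >= 2 is considered positive."""
        return sum([
            temp_c > 38.0 or temp_c < 36.0,
            hr_per_min > 90,
            rr_per_min > 20,
            wbc_k_per_ul > 12.0 or wbc_k_per_ul < 4.0,
        ])

    # Illustrative patient shortly after shock onset
    print(qsofa(sbp_mmhg=92, rr_per_min=24, altered_mentation=True))             # 3
    print(sirs(temp_c=38.6, hr_per_min=112, rr_per_min=24, wbc_k_per_ul=15.2))   # 4
    ```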

    Change in Lactate Levels After Hemodialysis in Patients With End-Stage Renal Disease

    STUDY OBJECTIVE: Patients with end-stage renal disease commonly visit the emergency department (ED). The purpose of this investigation was to examine the prevalence of abnormal baseline lactate levels and to evaluate the effect of hemodialysis on serum lactate levels. METHODS: This was a prospective observational cohort study performed at an outpatient dialysis facility at an urban tertiary care hospital. The study enrolled 226 patients with end-stage renal disease who were receiving long-term hemodialysis, over a 2-day period at the beginning of December 2015. Blood drawn before and after hemodialysis sessions was immediately analyzed for lactate levels. All patients completed their hemodialysis sessions. RESULTS: The prevalence of an abnormal lactate level (greater than 1.8 mmol/L) before hemodialysis was 17.7% (n=40). Overall, lactate levels decreased by 27% (SD 35%) after hemodialysis, with a decrease of 37% (SD 31%) in the subgroup with lactate levels of 1.9 to 2.4 mmol/L and 62% (SD 14%) in the subgroup with lactate levels of 2.5 to 3.9 mmol/L. CONCLUSION: The data presented help providers understand the prevalence of abnormal lactate values in an outpatient end-stage renal disease population. After hemodialysis, lactate levels decreased significantly. This information may help medical providers interpret lactate values when patients with end-stage renal disease present to the ED.
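
    A minimal sketch of the percent-decrease calculation behind the reported subgroup results (Python; the example values are illustrative):

    ```python
    def pct_decrease(pre_mmol_l: float, post_mmol_l: float) -> float:
        """Percent decrease in lactate across one hemodialysis session."""
        return (pre_mmol_l - post_mmol_l) / pre_mmol_l * 100

    # Illustrative example: pre-dialysis lactate of 2.6 mmol/L falls to 1.0 mmol/L
    print(f"{pct_decrease(2.6, 1.0):.0f}% decrease")  # -> 62% decrease
    ```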

    Early goal-directed therapy in severe sepsis and septic shock: insights and comparisons to ProCESS, ProMISe, and ARISE

    Prior to 2001 there was no standard for the early management of severe sepsis and septic shock in the emergency department. With standard or usual care, the prevailing mortality was 40-50% or higher. In response, a systems-based approach called early goal-directed therapy, similar to those used in acute myocardial infarction, stroke and trauma, was compared to standard care, and this clinical trial resulted in a significant mortality reduction. Since the publication of that trial, similar outcome benefits have been reported in over 70 observational and randomized controlled studies comprising over 70,000 patients. As a result, early goal-directed therapy was largely incorporated into the first 6 hours of sepsis management (the resuscitation bundle) adopted by the Surviving Sepsis Campaign and disseminated internationally as the standard of care for early sepsis management. Recently, a trio of trials (ProCESS, ARISE, and ProMISe), while reporting an all-time low sepsis mortality, have questioned the continued need for all of the elements of early goal-directed therapy and for protocolized care in patients with severe sepsis and septic shock. A review of the early hemodynamic pathogenesis, historical development, and definition of early goal-directed therapy, together with a comparison of trial methodology and the changing landscape of sepsis mortality, is essential for an appropriate interpretation of these trials and their conclusions.