9 research outputs found

    Rapid identification of sepsis in the emergency department

    Get PDF
    Objectives: Recent research has helped define the complex pathways in sepsis, affording new opportunities for advancing diagnostic tests. Given significant advances in the field, a group of academic investigators from emergency medicine, intensive care, pathology, and pharmacology assembled to develop consensus around key gaps and the potential future use of emerging rapid host response diagnostic assays in the emergency department (ED) setting. Methods: A modified Delphi study was conducted that included 26 panelists (expert consensus panel) from multiple specialties. A smaller steering committee first defined a list of Delphi statements related to the need for and future potential use of a hypothetical sepsis diagnostic test in the ED. Likert scoring was used to assess panelists' agreement or disagreement with statements. Two successive rounds of surveys were conducted, and consensus for a statement was operationally defined as agreement or disagreement of 75% or greater. Results: Significant gaps were identified related to current tools for assessing risk of sepsis in the ED. Strong consensus indicated the need for a test providing an indication of the severity of the dysregulated host immune response, which would be helpful even if it did not identify the specific pathogen. Although there was a relatively high degree of uncertainty regarding which patients would most benefit from the test, the panel agreed that an ideal host response sepsis test should aim to be integrated into ED triage and thus should produce results in less than 30 minutes. The panel also agreed that such a test would be most valuable for improving sepsis outcomes and reducing rates of unnecessary antibiotic use. Conclusion: The expert consensus panel expressed strong consensus regarding gaps in sepsis diagnostics in the ED and the potential for new rapid host response tests to help fill these gaps. These findings provide a baseline framework for assessing key attributes of evolving host response diagnostic tests for sepsis in the ED.
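    As a rough illustration of the consensus rule described in the Methods, the sketch below applies the 75% threshold to a set of Likert ratings. The 5-point scale and the grouping of 4-5 as agreement and 1-2 as disagreement are assumptions for illustration; the abstract does not specify the scale used.

```python
# Minimal sketch of the consensus rule described above: a statement reaches
# consensus when >= 75% of panelists agree (or >= 75% disagree).
# The 5-point Likert coding (1-2 = disagree, 4-5 = agree) is an assumption.

def consensus(ratings, threshold=0.75):
    """Return 'agree', 'disagree', or 'no consensus' for one statement."""
    n = len(ratings)
    agree = sum(r >= 4 for r in ratings) / n
    disagree = sum(r <= 2 for r in ratings) / n
    if agree >= threshold:
        return "agree"
    if disagree >= threshold:
        return "disagree"
    return "no consensus"

# Hypothetical ratings from 26 panelists for one statement
ratings = [5, 4, 4, 5, 4, 5, 4, 4, 3, 4, 5, 4, 4, 5, 4, 4, 5, 4, 4, 2, 4, 5, 4, 4, 5, 4]
print(consensus(ratings))  # -> 'agree' (24/26 = 92% rate 4 or 5)
```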

    Using palm-mat geotextiles on an arable soil for water erosion control in the UK

    No full text
    To date, most studies of the effectiveness of geotextiles on soil erosion rates and processes have been conducted in laboratory experiments lasting less than 1 h. Hence, at Hilton (52°33′ N, 2°19′ W), UK, the effectiveness of employing palm-mat geotextiles for soil erosion control under field conditions on arable loamy sands was investigated. Geotextile mats constructed from Borassus aethiopum (Borassus palm of West Africa) and Mauritia flexuosa (Buriti palm of South America) leaves are termed Borassus mats and Buriti mats, respectively. Duplicate runoff plots (10 m × 1 m on a 15° slope) had five treatments (bare, permanent grass, Borassus total plot cover, Borassus buffer strip and Buriti buffer strip). Borassus-covered plots had about 72% ground cover; to differentiate this treatment from Borassus buffer strips, the former is termed Borassus completely-covered. Runoff and eroded soil were collected from each bounded plot in a concrete gutter leading to a receptacle. Results from 08/01/2007 – 23/01/2009 (total precipitation = 1776·5 mm; n = 53 time intervals) show that using Borassus buffer strips (area coverage ~10%) on bare soil decreased runoff volume by about 71% (P > 0·05) and soil erosion by 92% (P < 0·001). Bare plots had nearly 29·1 L m⁻² runoff and 2·36 kg m⁻² soil erosion during that period. Borassus buffer strip, Buriti buffer strip and Borassus completely-covered plots had similar effects in decreasing runoff volume and soil erosion. Runoff volumes largely explain the variability in soil erosion rates. Although buffer strips of Borassus mats were as effective as whole-plot cover of the same mats, the longevity of Borassus mats was nearly twice that of Buriti mats. Thus, use of Borassus mats as buffer strips on bare plots is highly effective for soil erosion control. The mechanisms explaining the effectiveness of buffer strips require further study under varied pedo-climatic conditions.
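    For orientation, the reported reductions can be back-calculated against the bare-plot totals given above. The treatment-plot totals themselves are not stated in the abstract, so the figures below are implied values only.

```python
# Illustrative back-calculation from the figures reported above; actual
# treatment-plot totals are not given in the abstract.
bare_runoff = 29.1       # L m^-2 over the study period (bare plots)
bare_erosion = 2.36      # kg m^-2 over the study period (bare plots)

runoff_reduction = 0.71   # ~71% reduction with Borassus buffer strips
erosion_reduction = 0.92  # 92% reduction with Borassus buffer strips

buffer_runoff = bare_runoff * (1 - runoff_reduction)
buffer_erosion = bare_erosion * (1 - erosion_reduction)
print(f"Implied buffer-strip runoff:  {buffer_runoff:.1f} L m^-2")    # ~8.4
print(f"Implied buffer-strip erosion: {buffer_erosion:.2f} kg m^-2")  # ~0.19
```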

    Quantifying the effects of prior acetyl-salicylic acid on sepsis-related deaths: An individual patient data meta-analysis using propensity matching

    No full text
    Objective: The primary objective was to conduct a meta-analysis on published observational cohort data describing the association between acetyl-salicylic acid (aspirin) use prior to the onset of sepsis and mortality in hospitalized patients. Study Selection: Studies that reported mortality in patients with sepsis on aspirin, with a comparison group of patients with sepsis not on prior aspirin therapy, were included. Data Sources: Fifteen studies described hospital-based cohorts (n = 17,065), whereas one was a large insurance-based database (n = 683,421). Individual-level patient data were incorporated from all selected studies. Data Extraction: Propensity analyses with 1:1 propensity score matching at the study level were performed, using the most consistently available covariates judged to be associated with aspirin use. Meta-analyses were performed to estimate the pooled average treatment effect of aspirin on sepsis-related mortality. Data Synthesis: Use of aspirin was associated with a 7% (95% CI, 2-12%; p = 0.005) reduction in the risk of death, as shown by meta-analysis with considerable statistical heterogeneity (I² = 61.6%). Conclusions: These results are consistent with effects ranging from a 2% to 12% reduction in mortality risk in patients taking aspirin prior to sepsis onset. This association anticipates the results of definitive studies of the use of low-dose aspirin as a strategy for the reduction of deaths in patients with sepsis.
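    The sketch below illustrates the general form of study-level 1:1 propensity-score matching named in the Data Extraction section. The covariates, caliper, and synthetic data are assumptions for illustration and do not reproduce the authors' actual pipeline.

```python
# Minimal sketch of 1:1 propensity-score matching with hypothetical
# covariates ('age', 'sofa'); an illustration of the general technique only.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def match_one_to_one(df, treatment_col, covariates, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching on the estimated propensity score."""
    ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df[treatment_col])
    df = df.assign(ps=ps_model.predict_proba(df[covariates])[:, 1])
    treated = df[df[treatment_col] == 1]
    controls = df[df[treatment_col] == 0].copy()
    pairs = []
    for idx, row in treated.iterrows():
        if controls.empty:
            break
        dist = (controls["ps"] - row["ps"]).abs()
        best = dist.idxmin()
        if dist[best] <= caliper:          # only match within the caliper
            pairs.append((idx, best))
            controls = controls.drop(best)  # match without replacement
    return pairs

# Hypothetical example data
rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({"age": rng.normal(65, 10, n), "sofa": rng.integers(0, 15, n)})
df["aspirin"] = rng.binomial(1, 0.4, n)
pairs = match_one_to_one(df, "aspirin", ["age", "sofa"])
print(f"Matched {len(pairs)} aspirin/non-aspirin pairs")
```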

    Prevalence and risk factors for delirium in critically ill patients with COVID-19 (COVID-D): a multicentre cohort study

    Get PDF
    Background: To date, 750 000 patients with COVID-19 worldwide have required mechanical ventilation and thus are at high risk of acute brain dysfunction (coma and delirium). We aimed to investigate the prevalence of delirium and coma, and risk factors for delirium in critically ill patients with COVID-19, to aid the development of strategies to mitigate delirium and associated sequelae. Methods: This multicentre cohort study included 69 adult intensive care units (ICUs) across 14 countries. We included all patients (aged ≥18 years) admitted to participating ICUs with severe acute respiratory syndrome coronavirus 2 infection before April 28, 2020. Patients who were moribund or had life-support measures withdrawn within 24 h of ICU admission, prisoners, patients with pre-existing mental illness, neurodegenerative disorders, congenital or acquired brain damage, hepatic coma, drug overdose, suicide attempt, or those who were blind or deaf were excluded. We collected de-identified data from electronic health records on patient demographics, delirium and coma assessments, and management strategies for a 21-day period. Additional data on ventilator support, ICU length of stay, and vital status were collected for a 28-day period. The primary outcome was to determine the prevalence of delirium and coma and to investigate risk factors associated with development of delirium the next day. We also investigated predictors of the number of days alive without delirium or coma. These outcomes were investigated using multivariable regression. Findings: Between Jan 20 and April 28, 2020, 4530 patients with COVID-19 were admitted to 69 ICUs, of whom 2088 patients were included in the study cohort. The median age of patients was 64 years (IQR 54 to 71) with a median Simplified Acute Physiology Score (SAPS) II of 40·0 (30·0 to 53·0). 1397 (66·9%) of 2088 patients were invasively mechanically ventilated on the day of ICU admission and 1827 (87·5%) were invasively mechanically ventilated at some point during hospitalisation. Infusion with sedatives while on mechanical ventilation was common: 1337 (64·0%) of 2088 patients were given benzodiazepines for a median of 7·0 days (4·0 to 12·0) and 1481 (70·9%) were given propofol for a median of 7·0 days (4·0 to 11·0). Median Richmond Agitation–Sedation Scale score while on invasive mechanical ventilation was –4 (–5 to –3). 1704 (81·6%) of 2088 patients were comatose for a median of 10·0 days (6·0 to 15·0) and 1147 (54·9%) were delirious for a median of 3·0 days (2·0 to 6·0). Mechanical ventilation, use of restraints, and benzodiazepine, opioid, vasopressor, and antipsychotic infusions were each associated with a higher risk of delirium the next day (all p≤0·04), whereas family visitation (in person or virtual) was associated with a lower risk of delirium (p<0·0001). During the 21-day study period, patients were alive without delirium or coma for a median of 5·0 days (0·0 to 14·0). At baseline, older age, higher SAPS II scores, male sex, smoking or alcohol abuse, use of vasopressors on day 1, and invasive mechanical ventilation on day 1 were independently associated with fewer days alive and free of delirium and coma (all p<0·01). 601 (28·8%) of 2088 patients died within 28 days of admission, with most of those deaths occurring in the ICU. Interpretation: Acute brain dysfunction was highly prevalent and prolonged in critically ill patients with COVID-19. 
Benzodiazepine use and lack of family visitation were identified as modifiable risk factors for delirium, and thus these data present an opportunity to reduce acute brain dysfunction in patients with COVID-19. Funding: None. Translations: For the French and Spanish translations of the abstract, see the Supplementary Materials section.
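    The sketch below shows the general shape of a multivariable model relating daily exposures to delirium on the following day, as described in the Methods. The column names, synthetic data, and coefficient values are assumptions for illustration; the study's actual covariate set and model specification are not reproduced here.

```python
# Minimal sketch of a multivariable logistic model for next-day delirium risk.
# The synthetic patient-day data and covariates are assumptions for illustration.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500  # hypothetical patient-days
daily = pd.DataFrame({
    "benzodiazepine": rng.binomial(1, 0.6, n),
    "family_visit": rng.binomial(1, 0.3, n),
    "mech_vent": rng.binomial(1, 0.7, n),
})
# Synthetic outcome loosely following the reported directions of effect
logit = -1.0 + 0.8 * daily["benzodiazepine"] - 0.9 * daily["family_visit"] + 0.6 * daily["mech_vent"]
daily["delirium_next_day"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(daily[["benzodiazepine", "family_visit", "mech_vent"]])
model = sm.Logit(daily["delirium_next_day"], X).fit(disp=0)
print(model.summary2().tables[1][["Coef.", "P>|z|"]])
```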

    “The Great Djuna:” Two Decades of Barnes Studies, 1993-2013

    No full text

    Rationale and Design for a GRADE Substudy of Continuous Glucose Monitoring

    No full text