
    Interrater reliability of surveillance for ventilator-associated events and pneumonia

    OBJECTIVE: To compare interrater reliabilities for ventilator-associated event (VAE) surveillance, traditional ventilator-associated pneumonia (VAP) surveillance, and clinical diagnosis of VAP by intensivists.
    DESIGN: A retrospective study nested within a prospective multicenter quality improvement study.
    SETTING: Intensive care units (ICUs) within 5 hospitals of the Centers for Disease Control and Prevention Epicenters.
    PATIENTS: Patients who underwent mechanical ventilation.
    METHODS: We selected 150 charts for review, including all VAEs and traditionally defined VAPs identified during the primary study and randomly selected charts of patients without VAEs or VAPs. Each chart was independently reviewed by 2 research assistants (RAs) for VAEs, by 2 hospital infection preventionists (IPs) for traditionally defined VAP, and by 2 intensivists for any episodes of pulmonary deterioration. We calculated interrater agreement using κ estimates.
    RESULTS: The 150 selected episodes spanned 2,500 ventilator days. In total, 93–96 VAEs were identified by RAs, 31–49 VAPs were identified by IPs, and 29–35 VAPs were diagnosed by intensivists. Interrater reliability between RAs for VAEs was high (κ, 0.71; 95% CI, 0.59–0.81). Agreement between IPs using traditional VAP criteria was slight (κ, 0.12; 95% CI, −0.05 to 0.29). Agreement between intensivists was slight regarding episodes of pulmonary deterioration (κ, 0.22; 95% CI, 0.05–0.39) and fair regarding whether episodes of deterioration were attributable to clinically defined VAP (κ, 0.34; 95% CI, 0.17–0.51). The correlation between VAE surveillance and intensivists' clinical assessments was poor.
    CONCLUSIONS: Prospective surveillance using VAE criteria is more reliable than traditional VAP surveillance and clinical VAP diagnosis; the correlation between VAEs and clinically recognized pulmonary deterioration is poor.
    Infect Control Hosp Epidemiol 2017;38:172–178
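The κ values reported above are Cohen's kappa, which corrects raw percent agreement between two raters for the agreement expected by chance. As a minimal illustrative sketch of that statistic (not the study's actual analysis code; the function and variable names are hypothetical), two raters' categorical chart determinations could be compared like this:

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two equal-length sequences of categorical ratings."""
    assert len(rater_a) == len(rater_b) and len(rater_a) > 0
    n = len(rater_a)
    labels = set(rater_a) | set(rater_b)
    # Observed proportion of items on which the two raters agree.
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement, from each rater's marginal label frequencies.
    p_chance = sum(
        (rater_a.count(lab) / n) * (rater_b.count(lab) / n) for lab in labels
    )
    return (p_observed - p_chance) / (1 - p_chance)


# Perfect agreement gives kappa = 1; agreement at chance level gives kappa = 0.
print(cohens_kappa([1, 0, 1, 0], [1, 0, 1, 0]))  # 1.0
```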

    I’m going to fail! Acute cognitive performance anxiety increases threat-interference and impairs WM performance

    Stress can impair cognitive performance, as commonly observed in cognitive performance anxiety (CPA; e.g., test anxiety). Cognitive theories indicate that stress impairs performance by increasing attention to negative thoughts, a phenomenon also known as threat-interference. These theories are mainly supported by findings related to self-report measures of threat-interference or trait anxiety. Our main aim was to test, for the first time in a single study, the hypotheses that acute CPA-related stress negatively affects both working memory (WM) performance and objectively assessed threat-interference during performance. In addition, we aimed to assess the validity of a new stress-induction procedure that was developed to induce acute CPA. Eighty-six females were randomly assigned to a CPA-related stress group (n = 45) or a control group. WM performance and threat-interference were assessed with an n-back task (2-back and 3-back memory loads), using CPA-related words as distracters. The stress group showed higher state anxiety and slower WM performance. Both effects were moderated by trait CPA: the effects were stronger for individuals with higher trait CPA. Finally, trait CPA moderated the effect of stress on threat-interference during higher cognitive load: individuals with higher trait CPA in the stress group showed higher threat-interference. We conclude that acute CPA increases threat-interference and impairs WM performance, especially in vulnerable individuals. The role of threat-interference, cognitive load, and trait anxiety should be taken into account in future research. Finally, our method (combining our stressor and modified n-back task) is effective for studying stress-cognition interactions in CPA.

    The relative contribution of climate to changes in lesser prairie-chicken abundance

    Citation: Ross, B. E., Haukos, D., Hagen, C., & Pitman, J. (2016). The relative contribution of climate to changes in lesser prairie-chicken abundance. Ecosphere, 7(6), 11. doi:10.1002/ecs2.1323
    Managing for species using current weather patterns fails to incorporate the uncertainty associated with future climatic conditions; without incorporating potential changes in climate into conservation strategies, management and conservation efforts may fall short or waste valuable resources. Understanding the effects of climate change on species in the Great Plains of North America is especially important, as this region is projected to experience an increased magnitude of climate change. Of particular ecological and conservation interest is the lesser prairie-chicken (Tympanuchus pallidicinctus), which was listed as "threatened" under the U.S. Endangered Species Act in May 2014. We used Bayesian hierarchical models to quantify the effects of extreme climatic events (extreme values of the Palmer Drought Severity Index [PDSI]) relative to intermediate (changes in El Niño Southern Oscillation) and long-term climate variability (changes in the Pacific Decadal Oscillation) on trends in lesser prairie-chicken abundance from 1981 to 2014. Our results indicate that lesser prairie-chicken abundance on leks responded to environmental conditions of the previous year, responding positively to wet springs (high PDSI) and negatively to years with hot, dry summers (low PDSI), but showed little response to variation in the El Niño Southern Oscillation and the Pacific Decadal Oscillation. Additionally, greater variation in abundance on leks was explained by variation in site relative to broad-scale climatic indices. Consequently, lesser prairie-chicken abundance on leks in Kansas is more strongly influenced by extreme drought events during summer than by other climatic conditions, which may have negative consequences for the population as drought conditions intensify throughout the Great Plains.

    Fluoroquinolones Protective against Cephalosporin Resistance in Gram-negative Nosocomial Pathogens

    In a matched case-control study, we studied the effect of prior receipt of fluoroquinolones on isolation of three third-generation cephalosporin-resistant gram-negative nosocomial pathogens. Two hundred eighty-two cases with a third-generation cephalosporin-resistant pathogen (203 with Enterobacter spp., 50 with Pseudomonas aeruginosa, and 29 with Klebsiella pneumoniae) were matched on length of stay to controls in a 1:2 ratio. Case-patients and controls were similar in age (mean 62 years) and sex (54% male). Variables predicting third-generation cephalosporin resistance were surgery (p = 0.005); intensive care unit stay (p < 0.001); and receipt of a β-lactam/β-lactamase inhibitor (p < 0.001), a ureidopenicillin (p = 0.002), or a third-generation cephalosporin (p < 0.001). Receipt of a fluoroquinolone was protective against isolation of a third-generation cephalosporin-resistant pathogen (p = 0.005). Interventional studies are required to determine whether replacing third-generation cephalosporins with fluoroquinolones will be effective in reducing cephalosporin resistance, and to assess the effect of such interventions on fluoroquinolone resistance.

    Advancing diagnostics to address antibacterial resistance: The diagnostics and devices committee of the Antibacterial Resistance Leadership Group

    Diagnostics are a cornerstone of the practice of infectious diseases. However, various limitations frequently lead to unmet clinical needs. In most other domains, diagnostics focus on narrowly defined questions, provide readily interpretable answers, and use true gold standards for development. In contrast, infectious diseases diagnostics must contend with scores of potential pathogens, dozens of clinical syndromes, emerging pathogens, rapid evolution of existing pathogens and their associated resistance mechanisms, and the absence of gold standards in many situations. In spite of these challenges, the importance and value of diagnostics cannot be overstated. Therefore, the Antibacterial Resistance Leadership Group has identified diagnostics as 1 of 4 major areas of emphasis. Herein, we provide an overview of that development, highlighting several examples where innovation in study design, content, and execution is advancing the field of infectious diseases diagnostics.

    Does vancomycin prescribing intervention affect vancomycin-resistant enterococcus infection and colonization in hospitals? A systematic review

    BACKGROUND: Vancomycin-resistant enterococcus (VRE) is a major cause of nosocomial infections in the United States and may be associated with greater morbidity, mortality, and healthcare costs than vancomycin-susceptible enterococcus. Current guidelines for the control of VRE include prudent use of vancomycin. While vancomycin exposure appears to be a risk factor for VRE acquisition in individual patients, the effect of vancomycin usage at the population level is not known. We conducted a systematic review to determine the impact of reducing vancomycin use through prescribing interventions on the prevalence and incidence of VRE colonization and infection in hospitals within the United States.
    METHODS: To identify relevant studies, we searched three electronic databases and hand searched selected journals. Thirteen studies from 12 articles met our inclusion criteria. Data were extracted and summarized for study setting, design, patient characteristics, types of intervention(s), and outcome measures. The relative risk, 95% confidence interval, and p-value associated with change in VRE acquisition pre- and post-vancomycin prescribing interventions were calculated and compared. Heterogeneity in study results was formally explored by stratified analysis.
    RESULTS: No randomized clinical trials on this topic were found. Each of the 13 included studies used a quasi-experimental design of low hierarchy. Seven of the 13 studies reported statistically significant reductions in VRE acquisition following interventions, three studies reported no significant change, and three studies reported increases in VRE acquisition, one of which reported statistical significance. Results ranged from a reduction of 82.5% to an increase of 475%. Studies of specific wards, which included sicker patients, were more likely to report positive results than studies of an entire hospital including general inpatients (Fisher's exact test, p = 0.029). The type of intervention, endemicity status, type of study design, and duration of intervention were not found to significantly modify the results. Among the six studies that implemented vancomycin reduction strategies as the sole intervention, two of six (33%) found a significant reduction in VRE colonization and/or infection. In contrast, among studies implementing additional VRE control measures, five of seven (71%) reported a significant reduction.
    CONCLUSION: It was not possible to conclusively determine a potential role for vancomycin usage reductions in controlling VRE colonization and infection in hospitals in the United States. The effectiveness of such interventions and their sustainability remains poorly defined because of the heterogeneity and quality of studies. Future research using high-quality study designs and implementing vancomycin reduction as the sole intervention is needed to answer this question.
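The review summarizes each study with a relative risk and 95% confidence interval for VRE acquisition pre- versus post-intervention. A hedged sketch of that standard calculation, using the usual log-RR normal approximation (illustrative only, not the authors' code; all names are hypothetical):

```python
import math


def relative_risk(exposed_events, exposed_total, control_events, control_total):
    """Relative risk of two event proportions, with a 95% CI from the
    normal approximation on the log-RR scale."""
    p1 = exposed_events / exposed_total
    p0 = control_events / control_total
    rr = p1 / p0
    # Standard error of log(RR) for two independent binomial samples.
    se = math.sqrt(
        1 / exposed_events - 1 / exposed_total
        + 1 / control_events - 1 / control_total
    )
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi


# E.g., 10 acquisitions per 100 patients pre-intervention vs 5 per 100 post.
rr, lo, hi = relative_risk(10, 100, 5, 100)
print(rr)  # 2.0
```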

    Unnecessary use of fluoroquinolone antibiotics in hospitalized patients

    Background: Fluoroquinolones are among the most commonly prescribed antimicrobials and are an important risk factor for colonization and infection with fluoroquinolone-resistant gram-negative bacilli and for Clostridium difficile infection (CDI). In this study, our aim was to determine current patterns of inappropriate fluoroquinolone prescribing among hospitalized patients, and to test the hypothesis that longer-than-necessary treatment durations account for a significant proportion of unnecessary fluoroquinolone use.
    Methods: We conducted a 6-week prospective, observational study to determine the frequency of, reasons for, and adverse effects associated with unnecessary fluoroquinolone use in a tertiary-care academic medical center. For randomly selected adult inpatients receiving fluoroquinolones, therapy was determined to be necessary or unnecessary based on published guidelines or standard principles of infectious diseases. Adverse effects were determined based on chart review 6 weeks after completion of therapy.
    Results: Of 1,773 days of fluoroquinolone therapy, 690 (39%) were deemed unnecessary. The most common reasons for unnecessary therapy included administration of antimicrobials for non-infectious or non-bacterial syndromes (292 days-of-therapy) and administration of antimicrobials for longer than necessary durations (234 days-of-therapy). The most common syndrome associated with unnecessary therapy was urinary tract infection or asymptomatic bacteriuria (30% of all unnecessary days-of-therapy). Twenty-seven percent (60/227) of regimens were associated with adverse effects possibly attributable to therapy, including gastrointestinal adverse effects (14% of regimens), colonization by resistant pathogens (8% of regimens), and CDI (4% of regimens).
    Conclusions: In our institution, 39% of all days of fluoroquinolone therapy were unnecessary. Interventions that focus on improving adherence to current guidelines for duration of antimicrobial therapy and for management of urinary syndromes could significantly reduce overuse of fluoroquinolones.

    Why orchestral musicians are bound to wear earplugs: About the ineffectiveness of physical measures to reduce sound exposure

    Symphony orchestra musicians are exposed to noise levels that put them at risk of developing hearing damage. This study evaluates the potential effectiveness of common control measures used in orchestras on open stages with a typical symphonic setup. A validated acoustic prediction model is used that calculates binaural sound exposure levels at the ears of all musicians in the orchestra. The model calculates the equivalent sound levels for a performance of the first 2 min of the 4th movement of Mahler's 1st symphony, which can be considered representative of loud orchestral music. Calculated results indicate that risers, available space, and screens at typical positions do not significantly influence sound exposure. A hypothetical scenario with surround screens shows that, even when shielding all direct sound from others, sound exposure is reduced only moderately, with the largest effect on players in loud sections. In contrast, a dramatic change in room acoustic conditions leads to considerable reductions only for soft players. It can be concluded that significant reductions are reached only with extreme measures that are unrealistic. It seems impossible for the studied physical measures to be effective enough to replace hearing protection devices such as earplugs.
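The equivalent sound level (Leq) used in exposure models like this one is an energy average, not an arithmetic average of decibel values: Leq = 10 log10((1/N) Σ 10^(Li/10)). A small illustrative sketch of that averaging, assumed from the standard definition rather than taken from the study's model:

```python
import math


def equivalent_level(levels_db):
    """Equivalent continuous level (Leq) of equally weighted short-term
    sound pressure level samples, in dB. Averages acoustic energy, so
    the loudest samples dominate the result."""
    mean_energy = sum(10 ** (level / 10) for level in levels_db) / len(levels_db)
    return 10 * math.log10(mean_energy)


# Half the time at 90 dB and half at 80 dB gives about 87.4 dB,
# much closer to 90 than the arithmetic mean of 85.
print(round(equivalent_level([90.0, 80.0]), 1))  # 87.4
```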