
    The Potential Influence of Common Viral Infections Diagnosed during Hospitalization among Critically Ill Patients in the United States

    Viruses are the most common source of infection among immunocompetent individuals, yet they are not considered a clinically meaningful risk factor among the critically ill. This work examines the association of viral infections diagnosed during the hospital stay, or not documented as present on admission, with the outcomes of ICU patients who had no evidence of immunosuppression on admission. This is a population-based retrospective cohort study of University HealthSystem Consortium (UHC) academic centers in the U.S. from 2006 to 2009. The UHC is an alliance of over 90% of the non-profit academic medical centers in the U.S. A total of 209,695 critically ill patients were included in the analysis, and eight hospital complications were examined. Patients were grouped into four cohorts: absence of infection, bacterial infection only, viral infection only, and bacterial and viral infection during the same hospital admission. Viral infections diagnosed during hospitalization significantly increased the risk of all complications, and viral infections showed a seasonal pattern. Specific viruses associated with poor outcomes included influenza, RSV, CMV, and HSV. Patients who had both viral and bacterial infections during the same hospitalization had the greatest risk of mortality (RR 6.58, 95% CI 5.47–7.91), multi-organ failure (RR 8.25, 95% CI 7.50–9.07), and septic shock (RR 271.2, 95% CI 188.0–391.3). Viral infections may play a significant yet unrecognized role in the outcomes of ICU patients. They may serve as biological markers or play an active role in the development of certain adverse complications by interacting with coincident bacterial infection.
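
    The cohort contrasts above are reported as relative risks with 95% confidence intervals. As a point of reference only, the sketch below shows the standard log-RR calculation for such an interval; the counts in the example are hypothetical and are not taken from the study.

    # Minimal sketch: relative risk with a 95% CI from 2x2 counts, using the
    # usual normal approximation on log(RR). Counts below are hypothetical.
    import math

    def relative_risk(a, n_exposed, c, n_unexposed, z=1.96):
        """a = events among exposed; c = events among unexposed."""
        p1, p0 = a / n_exposed, c / n_unexposed
        rr = p1 / p0
        se = math.sqrt(1 / a - 1 / n_exposed + 1 / c - 1 / n_unexposed)
        lower = math.exp(math.log(rr) - z * se)
        upper = math.exp(math.log(rr) + z * se)
        return rr, lower, upper

    # Hypothetical example: 120 deaths among 1,000 patients with both infections
    # vs. 180 deaths among 10,000 patients with neither.
    print(relative_risk(120, 1_000, 180, 10_000))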

    Ecological character displacement in the face of gene flow: Evidence from two species of nightingales

    Background: Ecological character displacement is a process of phenotypic differentiation of sympatric populations caused by interspecific competition. Such differentiation could facilitate speciation by enhancing reproductive isolation between incipient species, although empirical evidence for it at early stages of divergence when gene flow still occurs between the species is relatively scarce. Here we studied patterns of morphological variation in sympatric and allopatric populations of two hybridizing species of birds, the Common Nightingale (Luscinia megarhynchos) and the Thrush Nightingale (L. luscinia). Results: We conducted principal component (PC) analysis of morphological traits and found that nightingale species converged in overall body size (PC1) and diverged in relative bill size (PC3) in sympatry. Closer analysis of morphological variation along geographical gradients revealed that the convergence in body size can be attributed largely to increasing body size with increasing latitude, a phenomenon known as Bergmann's rule. In contrast, interspecific interactions contributed significantly to the observed divergence in relative bill size, even after controlling for the effects of geographical gradients. We suggest that the divergence in bill size most likely reflects segregation of feeding niches between the species in sympatry. Conclusions: Our results suggest that interspecific competition for food resources can drive species divergence even in the face of ongoing hybridization. Such divergence may enhance reproductive isolation between the species and thus contribute to speciation.
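
    The principal component analysis described in the Results can be illustrated with a short sketch. The code below is a generic PCA on a hypothetical table of morphological measurements; the column set, sample size and values are invented for illustration and are not the study's data or pipeline.

    # Minimal PCA sketch on hypothetical morphological traits.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    # Hypothetical measurements: wing, tarsus, tail, bill length, bill depth (mm)
    traits = rng.normal(loc=[85, 27, 65, 17, 4.5],
                        scale=[3, 1, 3, 1, 0.3], size=(100, 5))

    scores = PCA(n_components=3).fit_transform(StandardScaler().fit_transform(traits))
    # scores[:, 0] (PC1) would index overall body size and scores[:, 2] (PC3) a
    # bill-related axis, which could then be compared between sympatric and
    # allopatric populations.
    print(scores.shape)  # (100, 3)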

    Outcomes in COVID-19 patients undergoing laparoscopic cholecystectomy/appendectomy in the pre-vaccine era

    Background: We hypothesized that COVID-19 positive patients requiring laparoscopic cholecystectomy (lap chole) or appendectomy (lap appy) would have increased inpatient mortality rates compared to all COVID-19 patients. Methods: Retrospective cohort analysis of COVID-19 patients from 1/1/20 to 9/30/20; 82,574 cases were identified. Patients were excluded if <18 years old or if they underwent surgery other than lap chole or lap appy. The control group comprised patients without surgery (N = 82,145); the exposure groups underwent lap chole (N = 323) or lap appy (N = 106). The primary outcome was inpatient mortality. Secondary outcomes included hospital length of stay (LOS) and complications such as bacterial pneumonia, deep venous thrombosis (DVT), pulmonary embolism (PE), urinary tract infection (UTI), acute myocardial infarction (MI), acute respiratory distress syndrome (ARDS), and respiratory failure (RF). Results: The overall inpatient mortality rate was 32.8% in COVID-19 patients undergoing lap chole (p < 0.0001), 2.8% in lap appy (p = 0.93), and 1.2% in the control group. The ARDS complication rate was 11.2% in lap chole (p < 0.0001), 1.9% in lap appy (p = 0.71), and 0.2% in the control group. Conclusion: COVID-19 patients during the initial wave of the pandemic who underwent lap chole during hospital admission had a significantly higher risk of mortality and ARDS, while those who underwent lap appy did not.
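
    The mortality contrast in the Results can be outlined from the reported group sizes and rates. The sketch below uses a chi-square test of the 2x2 table for lap chole vs. the no-surgery control; the choice of test is an assumption, since the abstract does not state which test the authors used.

    # Two-proportion comparison from the reported figures (lap chole vs. control).
    from scipy.stats import chi2_contingency

    n_chole, rate_chole = 323, 0.328      # N and mortality rate from the abstract
    n_ctrl, rate_ctrl = 82_145, 0.012

    deaths_chole = round(n_chole * rate_chole)
    deaths_ctrl = round(n_ctrl * rate_ctrl)
    table = [[deaths_chole, n_chole - deaths_chole],
             [deaths_ctrl, n_ctrl - deaths_ctrl]]

    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2={chi2:.1f}, p={p:.2e}")  # p falls far below 0.0001, consistent with the report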

    Successful Implementation of a Packed Red Blood Cell and Fresh Frozen Plasma Transfusion Protocol in the Surgical Intensive Care Unit

    Background: Blood product transfusions are associated with increased morbidity and mortality. The purpose of this study was to determine if implementation of a restrictive protocol for packed red blood cell (PRBC) and fresh frozen plasma (FFP) transfusion safely reduces blood product utilization and costs in a surgical intensive care unit (SICU). Study Design: We performed a retrospective, historical control analysis comparing before (PRE) and after (POST) implementation of a restrictive PRBC/FFP transfusion protocol for SICU patients. Univariate analysis was utilized to compare patient demographics and blood product transfusion totals between the PRE and POST cohorts. Multivariate logistic regression models were developed to determine if implementation of the restrictive transfusion protocol is an independent predictor of adverse outcomes after controlling for age, illness severity, and total blood products received. Results: 829 total patients were included in the analysis (PRE, n=372; POST, n=457). Despite higher mean age (56 vs. 52 years, p=0.01) and APACHE II scores (12.5 vs. 11.2, p=0.006), mean units transfused per patient were lower for both packed red blood cells (0.7 vs. 1.2, p=0.03) and fresh frozen plasma (0.3 vs. 1.2, p=0.007) in the POST compared to the PRE cohort, respectively. There was no difference in inpatient mortality between the PRE and POST cohorts (7.5% vs. 9.2%, p=0.39). There was a decreased risk of urinary tract infections (OR 0.47, 95% CI 0.28-0.80) in the POST cohort after controlling for age, illness severity and amount of blood products transfused. Conclusions: Implementation of a restrictive transfusion protocol can effectively reduce blood product utilization in critically ill surgical patients with no increase in morbidity or mortality.
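
    The Study Design mentions multivariate logistic regression controlling for age, illness severity, and total blood products received. The sketch below shows a model of that general form using statsmodels on a hypothetical dataframe; the variable names and simulated data are illustrative and are not the study's dataset or exact model specification.

    # Logistic regression of an outcome on cohort, age, APACHE II and units transfused.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 800
    df = pd.DataFrame({
        "uti":      rng.integers(0, 2, n),   # outcome: urinary tract infection (0/1)
        "post":     rng.integers(0, 2, n),   # 1 = POST (restrictive protocol) cohort
        "age":      rng.normal(54, 15, n),
        "apache2":  rng.normal(12, 4, n),
        "units_tx": rng.poisson(1.5, n),     # total PRBC + FFP units transfused
    })

    model = smf.logit("uti ~ post + age + apache2 + units_tx", data=df).fit(disp=0)
    print(np.exp(model.params["post"]))      # odds ratio for the POST cohort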

    The Need for Improved Identification and Accurate Classification of Stages 3–5 Chronic Kidney Disease in Primary Care: Retrospective Cohort Study

    Background: Around ten percent of the population have been reported as having Chronic Kidney Disease (CKD), which is associated with increased cardiovascular mortality. Few previous studies have ascertained the chronicity of CKD. In the UK, a pay-for-performance (P4P) initiative incentivizes CKD (stages 3–5) recognition and management in primary care, but the impact of this has not been assessed. Methods and Findings: Using data from 426 primary care practices (population 2,707,130), the age-standardised prevalence of stages 3–5 CKD was identified using two consecutive estimated Glomerular Filtration Rates (eGFRs) seven days apart. Additionally, the accuracy of practice CKD registers and the relationship between accurate identification of CKD and the achievement of P4P indicators were determined. Between 2005 and 2009, the prevalence of stages 3–5 CKD increased from 0.3% to 3.9%. In 2009, 30,440 patients (1.1% unadjusted) fulfilled biochemical criteria for CKD but were not on a practice CKD register (uncoded CKD) and 60,705 patients (2.2% unadjusted) were included on a practice CKD register but did not fulfil biochemical criteria (miscoded CKD). For patients with confirmed CKD, inclusion in a practice register was associated with increasing age, male sex, diabetes, hypertension, cardiovascular disease and increasing CKD stage (p<0.0001). Uncoded CKD patients were less likely than miscoded patients to achieve performance indicators for blood pressure (OR 0.84, 95% CI 0.82–0.86, p<0.001) or a recorded albumin-creatinine ratio (OR 0.73, 95% CI 0.70–0.76, p<0.001). Conclusions: The prevalence of stages 3–5 CKD, using two laboratory-reported eGFRs, was lower than estimates from previous studies. Clinically significant discrepancies were identified between biochemically defined CKD and appearance on practice registers, with misclassification associated with sub-optimal care for some people with CKD.
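
    The case definition above rests on two consecutive eGFR results both consistent with stages 3–5 CKD. As a simple illustration, the sketch below applies the standard eGFR cut-offs (stage 3 < 60, stage 4 < 30, stage 5 < 15 mL/min/1.73 m²) and confirms a case only when both readings fall below 60; the exact confirmation rules (test spacing, coding) are those described in the paper, and this code is not taken from it.

    # Flag stages 3-5 CKD from two consecutive eGFR results (illustrative only).
    def ckd_stage(egfr):
        """Return CKD stage 3-5 for a single eGFR, or None for stages 1-2."""
        if egfr < 15:
            return 5
        if egfr < 30:
            return 4
        if egfr < 60:
            return 3
        return None

    def confirmed_stage(egfr_first, egfr_second):
        """Count as stages 3-5 CKD only if both consecutive eGFRs are below 60."""
        if egfr_first < 60 and egfr_second < 60:
            return ckd_stage(egfr_second)   # stage taken from the later result
        return None

    print(confirmed_stage(52, 48))  # -> 3
    print(confirmed_stage(52, 75))  # -> None (not confirmed)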

    Multivariate Analysis of early transfusion and urinary tract infection rates.

    Abbreviations: RR, relative risk; CI, confidence interval; APACHE, Acute Physiology and Chronic Health Evaluation; PRBC, packed red blood cells; FFP, fresh frozen plasma.