
    Rectal screening displays high negative predictive value for bloodstream infections with (ESBL-producing) gram-negative bacteria in neonates with suspected sepsis in a low-resource setting neonatal care unit.

    Objectives We analysed the concordance of rectal swab isolates and blood culture Gram-negative bacteria (GNB) isolates in neonates with suspected neonatal sepsis admitted to a neonatal care unit in Haiti. Methods We matched pairs of blood and rectal samples taken on the date of suspected sepsis onset in the same neonate. We calculated the proportion of rectal isolates in concordance with the blood isolates by species and genus. We calculated the negative predictive value (NPV) for GNB and extended-spectrum β-lactamase (ESBL)-producing GNB for all rectal and blood isolate pairs in neonates with suspected sepsis. Results We identified 238 blood and rectal sample pairs, with 238 blood isolate results and 309 rectal isolate results. The overall concordance in genus and species between blood and rectal isolates was 22.3% [95% confidence interval (CI) 17.4–28.0%] and 20.6% (95% CI 16.0–26.2%), respectively. The highest concordance between blood and rectal isolates was observed for samples with no bacterial growth (65%), followed by Klebsiella pneumoniae (18%) and Klebsiella oxytoca (12%). The NPV of detecting GNB isolates in rectal samples compared with those in blood samples was 81.6%, and the NPV for ESBL-positive GNB was 92.6%. Conclusions The NPV of rectal swab GNB isolates was high in all patient groups and was even higher for ESBL-positive GNB. Clinicians can use the results from rectal swabs taken simultaneously with blood samples during outbreaks to inform the (de-)escalation of antibiotic therapy in those neonates with an ongoing sepsis profile.
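
    The NPV reported above reduces to a simple proportion over the paired samples: true negatives divided by all rectal-negative pairs. A minimal sketch, assuming per-pair positive/negative results (the function name and example pairs are illustrative, not the study data):

```python
# Minimal sketch: negative predictive value (NPV) of rectal screening
# for Gram-negative bacteria (GNB), computed from paired rectal/blood results.
# The example pairs below are illustrative, not the study data.

def npv(pairs):
    """pairs: list of (rectal_positive, blood_positive) booleans.
    NPV = true negatives / all rectal-negative pairs."""
    rectal_negative = [blood for rectal, blood in pairs if not rectal]
    true_negatives = sum(1 for blood in rectal_negative if not blood)
    return true_negatives / len(rectal_negative)

# Illustrative paired results: (rectal GNB positive?, blood GNB positive?)
example_pairs = [(True, True), (False, False), (False, False), (False, True)]
print(f"NPV: {npv(example_pairs):.1%}")  # 2 of 3 rectal-negative pairs are blood-negative
```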

    Early warning for healthcare acquired infections in neonatal care units in a low-resource setting using routinely collected hospital data: The experience from Haiti, 2014–2018

    In low-resource settings, detection of healthcare-acquired outbreaks in neonatal care units (NCUs) relies on astute clinical staff to observe unusual morbidity or mortality from sepsis, as microbiological diagnostics are often absent. We aimed to generate reliable, automated early warnings for potential clusters of neonatal late onset sepsis (LOS) that could signal the start of an outbreak in an NCU in Port-au-Prince, Haiti, using routinely collected retrospective data on neonatal admissions. We constructed smoothed time series for LOS cases, LOS rates, NCU mortality, maternal admissions, neonatal admissions and neonatal antibiotic consumption. An outbreak was defined as a statistical increase in any of these time series indicators. We created three outbreak alarm classes: 1) threshold: weeks in which the LOS cases exceeded four, the LOS rates exceeded 10% of total NCU admissions or the NCU mortality exceeded 15%; 2) differential: weeks in which the LOS rates or NCU mortality were double those of the previous week; and 3) aberration: weeks flagged by the improved Farrington model for LOS rates and NCU mortality. We validated pairs of alarms by calculating the sensitivity and specificity of the weeks in which each alarm was launched, comparing each alarm to the weeks in which a single Gram-negative bacteria (GNB)-positive blood culture was reported from a neonate. The threshold and aberration alarms were the strongest predictors for current and future NCU mortality and current LOS rates (p<0.0002). The aberration alarms also had the highest sensitivity, specificity, negative predictive value and positive predictive value. Without microbiological diagnostics in NCUs in low-resource settings, applying these simple algorithms to routinely collected data shows great potential to facilitate early warning for possible healthcare-acquired outbreaks of LOS in neonates. The methods used in this study require validation across other low-resource settings.
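
    The threshold and differential alarm classes lend themselves to a few lines of code; the Farrington aberration model needs a dedicated implementation (e.g. the R surveillance package) and is omitted here. A minimal sketch over weekly indicators, with assumed column names and illustrative values:

```python
import pandas as pd

# Sketch of the 'threshold' and 'differential' alarm classes described above,
# applied to illustrative weekly NCU data (column names are assumptions).

def threshold_alarms(df):
    """Weeks where late-onset sepsis (LOS) cases exceed 4, the LOS rate
    exceeds 10% of admissions, or NCU mortality exceeds 15%."""
    return (df["los_cases"] > 4) | (df["los_rate"] > 0.10) | (df["mortality"] > 0.15)

def differential_alarms(df):
    """Weeks where the LOS rate or NCU mortality doubled versus the prior week."""
    return (df["los_rate"] >= 2 * df["los_rate"].shift(1)) | (
        df["mortality"] >= 2 * df["mortality"].shift(1)
    )

weeks = pd.DataFrame({
    "los_cases": [2, 3, 6, 5],
    "los_rate": [0.04, 0.05, 0.12, 0.11],
    "mortality": [0.08, 0.09, 0.20, 0.16],
})
print(weeks.assign(threshold=threshold_alarms(weeks),
                   differential=differential_alarms(weeks)))
```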

    Multi-drug resistance and high mortality associated with community-acquired bloodstream infections in children in conflict-affected northwest Nigeria

    Pediatric community-acquired bloodstream infections (CA-BSIs) in sub-Saharan African humanitarian contexts are rarely documented. Effective treatment of these infections is additionally complicated by increasing rates of antimicrobial resistance. We describe the findings from epidemiological and microbiological surveillance implemented in pediatric patients with suspected CA-BSIs presenting for care at a secondary hospital in the conflict-affected area of Zamfara state, Nigeria. Any child (>2 months of age) presenting to Anka General Hospital from November 2018 to August 2020 with clinical severe sepsis at admission had clinical and epidemiological information and a blood culture collected at admission. Bacterial isolates were tested for antibiotic susceptibility. We calculated frequencies of epidemiological, microbiological and clinical parameters. We explored risk factors for death amongst severe sepsis cases using univariable and multivariable Poisson regression, adjusting for time between admission and hospital exit. We included 234 severe sepsis patients with 195 blood culture results. There were 39 positive blood cultures. Of the bacterial isolates, 14 were Gram-positive and 18 were Gram-negative; 5 were resistant to empiric antibiotics: methicillin-resistant Staphylococcus aureus (MRSA; n = 2) and extended-spectrum beta-lactamase (ESBL)-positive Enterobacterales (n = 3). We identified no significant association between sex, age group, ward, CA-BSI, appropriate intravenous antibiotics, malaria positivity at admission, suspected focus of sepsis or clinical severity and death in the multivariable regression. There is an urgent need for access to good clinical microbiological services, including point-of-care methods, and for awareness and practice around rational antibiotic use among healthcare staff in humanitarian settings to reduce morbidity and mortality from sepsis in children.
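
    The risk-factor analysis described above is a Poisson regression with time in hospital as the exposure. A minimal sketch using statsmodels, on simulated data with assumed variable names (not the study dataset):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Sketch of a univariable/multivariable Poisson regression of the kind above,
# with time from admission to hospital exit as the exposure (offset).
# Data and variable names are illustrative, not the study dataset.

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "died": rng.integers(0, 2, n),
    "age_group": rng.integers(0, 3, n),
    "malaria_positive": rng.integers(0, 2, n),
    "days_in_hospital": rng.integers(1, 30, n),
})

X = sm.add_constant(df[["age_group", "malaria_positive"]])
model = sm.GLM(
    df["died"], X,
    family=sm.families.Poisson(),
    offset=np.log(df["days_in_hospital"]),  # adjusts for time at risk
).fit()
print(model.summary())  # exp(coef) gives incidence rate ratios
```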

    Probiotics [LGG-BB12 or RC14-GR1] versus placebo as prophylaxis for urinary tract infection in persons with spinal cord injury [ProSCIUTTU]: a randomised controlled trial

    Study design: Randomised double-blind factorial-design placebo-controlled trial. Objective: Urinary tract infections (UTIs) are common in people with spinal cord injury (SCI). UTIs are increasingly difficult to treat due to the emergence of multi-resistant organisms. Probiotics are efficacious in preventing UTIs in post-menopausal women. We aimed to determine whether probiotic therapy with Lactobacillus reuteri RC-14+Lactobacillus GR-1 (RC14-GR1) and/or Lactobacillus rhamnosus GG+Bifidobacterium BB-12 (LGG-BB12) is effective in preventing UTI in people with SCI. Setting: Spinal units in New South Wales, Australia, with their rural affiliations. Methods: We recruited 207 eligible participants with SCI and stable neurogenic bladder management. They were randomised to one of four arms: RC14-GR1+LGG-BB12, RC14-GR1+placebo, LGG-BB12+placebo or double placebo for 6 months. Randomisation was stratified by bladder management type and inpatient or outpatient status. The primary outcome was time to occurrence of symptomatic UTI. Results: Analysis was based on intention to treat. Participants randomised to RC14-GR1 had a similar risk of UTI to those not on RC14-GR1 (HR 0.67; 95% CI: 0.39–1.18; P = 0.17) after allowing for pre-specified covariates. Participants randomised to LGG-BB12 also had a similar risk of UTI to those not on LGG-BB12 (HR 1.29; 95% CI: 0.74–2.25; P = 0.37). Multivariable post hoc survival analysis for RC14-GR1 only vs. the other three groups showed a potential protective effect (HR 0.46; 95% CI: 0.21–0.99; P = 0.03), but this result would need to be confirmed before clinical application. Conclusion: In this RCT, there was no effect of RC14-GR1 or LGG-BB12 in preventing UTI in people with SCI.
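
    The primary outcome, time to first symptomatic UTI, is typically analysed with a Cox proportional hazards model to obtain hazard ratios like those quoted above. A minimal sketch using lifelines, with illustrative rows and assumed column names:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Sketch of a time-to-first-UTI analysis under the 2x2 factorial design above.
# The rows are illustrative; column names are assumptions.

df = pd.DataFrame({
    "weeks_to_uti_or_censor": [10, 26, 14, 26, 8, 22, 26, 17],
    "uti_observed": [1, 0, 1, 0, 1, 1, 0, 1],   # 1 = symptomatic UTI, 0 = censored
    "rc14_gr1": [1, 1, 0, 0, 1, 0, 0, 1],       # randomised to RC14-GR1?
    "lgg_bb12": [1, 0, 1, 0, 0, 1, 1, 0],       # randomised to LGG-BB12?
})

cph = CoxPHFitter()
cph.fit(df, duration_col="weeks_to_uti_or_censor", event_col="uti_observed")
cph.print_summary()  # exp(coef) column gives the hazard ratios
```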

    Circulating microRNAs in sera correlate with soluble biomarkers of immune activation but do not predict mortality in ART treated individuals with HIV-1 infection: A case control study

    Introduction: The use of anti-retroviral therapy (ART) has dramatically reduced HIV-1 associated morbidity and mortality. However, HIV-1 infected individuals have increased rates of morbidity and mortality compared to the non-HIV-1 infected population, and this appears to be related to end-organ diseases collectively referred to as Serious Non-AIDS Events (SNAEs). Circulating miRNAs are reported as promising biomarkers for a number of human disease conditions, including those that constitute SNAEs. Our study sought to investigate the potential of selected miRNAs in predicting mortality in HIV-1 infected, ART-treated individuals. Materials and Methods: A set of miRNAs was chosen based on published associations with human disease conditions that constitute SNAEs. This case-control study compared 126 cases (individuals who died whilst on therapy) and 247 matched controls (individuals who remained alive). Cases and controls were ART-treated participants of two pivotal HIV-1 trials. The relative abundance of each miRNA in serum was measured by RT-qPCR. Associations with mortality (all-cause, cardiovascular and malignancy) were assessed by logistic regression analysis. Correlations between miRNAs and CD4+ T cell count, hs-CRP, IL-6 and D-dimer were also assessed. Results: None of the selected miRNAs was associated with all-cause, cardiovascular or malignancy mortality. The levels of three miRNAs (miR-21, miR-122 and miR-200a) correlated with IL-6, while miR-21 also correlated with D-dimer. Additionally, the abundance of miR-31, miR-150 and miR-223 correlated with baseline CD4+ T cell count, while the same three miRNAs plus miR-145 correlated with nadir CD4+ T cell count. Discussion: No associations with mortality were found for any circulating miRNA studied. These results cast doubt on the effectiveness of circulating miRNAs as early predictors of mortality or of the major underlying diseases that contribute to mortality in participants treated for HIV-1 infection.
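
    The correlation analyses above relate miRNA abundance to soluble markers such as IL-6; Spearman's rank correlation is a common choice for skewed serum measurements, though the abstract does not specify the coefficient used. A minimal sketch with simulated values:

```python
import numpy as np
from scipy import stats

# Sketch of a miRNA-vs-marker correlation like those described above, e.g.
# miR-21 abundance against IL-6. The arrays are simulated, not study data.

rng = np.random.default_rng(1)
mir21_abundance = rng.lognormal(mean=0.0, sigma=0.5, size=50)
il6 = 2.0 * mir21_abundance + rng.normal(0, 1, size=50)  # induced correlation

rho, p_value = stats.spearmanr(mir21_abundance, il6)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3g}")
```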

    Development and Validation of a Risk Score for Chronic Kidney Disease in HIV Infection Using Prospective Cohort Data from the D:A:D Study

    Background: Chronic kidney disease (CKD) is a major health issue for HIV-positive individuals, associated with increased morbidity and mortality. Development and implementation of a risk score model for CKD would allow comparison of the risks and benefits of adding potentially nephrotoxic antiretrovirals to a treatment regimen and would identify those at greatest risk of CKD. The aims of this study were to develop a simple, externally validated, and widely applicable long-term risk score model for CKD in HIV-positive individuals that can guide decision making in clinical practice. Methods and Findings: A total of 17,954 HIV-positive individuals from the Data Collection on Adverse Events of Anti-HIV Drugs (D:A:D) study with ≥3 estimated glomerular filtration rate (eGFR) values after 1 January 2004 were included. Baseline was defined as the first eGFR >60 ml/min/1.73 m2 after 1 January 2004; individuals with exposure to tenofovir, atazanavir, atazanavir/ritonavir, lopinavir/ritonavir, or other boosted protease inhibitors before baseline were excluded. CKD was defined as confirmed (>3 mo apart) eGFR ≤60 ml/min/1.73 m2. In the D:A:D study, 641 individuals developed CKD during 103,185 person-years of follow-up (PYFU; incidence 6.2/1,000 PYFU, 95% CI 5.7–6.7; median follow-up 6.1 y, range 0.3–9.1 y). Older age, intravenous drug use, hepatitis C coinfection, lower baseline eGFR, female gender, lower CD4 count nadir, hypertension, diabetes, and cardiovascular disease (CVD) predicted CKD. The adjusted incidence rate ratios of these nine categorical variables were scaled and summed to create the risk score. The median risk score at baseline was -2 (interquartile range -4 to 2). There was a 1:393 chance of developing CKD in the next 5 y in the low risk group, with progressively higher risks in the medium and high risk groups (risk score ≥5, 505 events). Number needed to harm (NNTH) at 5 y when starting unboosted atazanavir or lopinavir/ritonavir among those with a low risk score was 1,702 (95% CI 1,166–3,367); NNTH was 202 (95% CI 159–278) and 21 (95% CI 19–23), respectively, for those with a medium and high risk score. NNTH was 739 (95% CI 506–1,462), 88 (95% CI 69–121), and 9 (95% CI 8–10) for those with a low, medium, and high risk score, respectively, starting tenofovir, atazanavir/ritonavir, or another boosted protease inhibitor. The Royal Free Hospital Clinic Cohort included 2,548 individuals, of whom 94 (3.7%) developed CKD during 18,376 PYFU (median follow-up 7.4 y, range 0.3–12.7 y). Of 2,013 individuals included from the SMART/ESPRIT control arms, 32 (1.6%) developed CKD during 8,452 PYFU (median follow-up 4.1 y, range 0.6–8.1 y). External validation showed that the risk score predicted well in these cohorts. Limitations of this study included limited data on race and no information on proteinuria. Conclusions: Both traditional and HIV-related risk factors were predictive of CKD. These factors were used to develop an externally validated risk score for CKD in HIV infection that has direct clinical relevance for patients and clinicians to weigh the benefits of certain antiretrovirals against the risk of CKD and to identify those at greatest risk of CKD.
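
    Two quantities drive the clinical use of the score described above: the summed point score built from scaled risk-factor weights, and the number needed to harm (NNTH), the reciprocal of the absolute risk increase. A minimal sketch with placeholder point values (not the published D:A:D weights):

```python
# Sketch of a summed risk score and the NNTH calculation described above.
# The point values below are placeholders, not the published D:A:D weights.

def ckd_risk_score(points_by_factor, patient):
    """Sum the scaled points for each risk factor the patient has."""
    return sum(pts for factor, pts in points_by_factor.items() if patient.get(factor))

def nnth(risk_with_drug, risk_without_drug):
    """NNTH = 1 / absolute risk increase over the same follow-up period."""
    return 1.0 / (risk_with_drug - risk_without_drug)

placeholder_points = {"hepatitis_c": 1, "hypertension": 2, "diabetes": 2, "cvd": 2}
patient = {"hypertension": True, "diabetes": True}
print("score:", ckd_risk_score(placeholder_points, patient))
print("NNTH:", round(nnth(0.05, 0.01)))  # e.g. 5% vs 1% five-year CKD risk -> 25
```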

    Antimicrobial stewardship in primary health care programs in humanitarian settings: the time to act is now

    Abstract: Fragile and conflict-affected settings bear a disproportionate burden of antimicrobial resistance, due to the compounding effects of weak health policies, disrupted medical supply chains, and lack of knowledge and awareness about antibiotic stewardship among both health care providers and health service users. Until now, humanitarian organizations intervening in these contexts have confronted the threat of complex multidrug-resistant infections mainly in their surgical projects at the secondary and tertiary levels of care, but there has been limited focus on ensuring the implementation of adequate antimicrobial stewardship in primary health care, which is known to be the setting where the highest proportion of antibiotics is prescribed. In this paper, we present the experience of two humanitarian organizations, Médecins sans Frontières and the International Committee of the Red Cross, in responding to antimicrobial resistance in their medical interventions, and we draw from their experience to formulate practical recommendations for including antimicrobial stewardship among the standards of primary health care service delivery in conflict settings. We believe that expanding the focus of humanitarian interventions in unstable and fragile contexts to include antimicrobial stewardship in primary care will strengthen the global response to antimicrobial resistance and will decrease its burden where it is posing the highest toll in terms of mortality.

    Outcomes of multisite antimicrobial stewardship programme implementation with a shared clinical decision support system

    Background: Studies evaluating antimicrobial stewardship programmes (ASPs) supported by computerized clinical decision support systems (CDSSs) have predominantly been conducted in single-site metropolitan hospitals. Objectives: To examine outcomes of multisite ASP implementation supported by a centrally deployed CDSS. Methods: An interrupted time series study was conducted across five hospitals in New South Wales, Australia, from 2010 to 2014. Outcomes analysed were: effect of the intervention on targeted antimicrobial use, antimicrobial costs and healthcare-associated Clostridium difficile infection (HCA-CDI) rates. Infection-related length of stay (LOS) and standardized mortality ratios (SMRs) were also assessed. Results: Post-intervention, antimicrobials targeted for increased use rose from 223 to 293 defined daily doses (DDDs)/1000 occupied bed days (OBDs)/month (+32%; P < 0.01). Conversely, antimicrobials targeted for decreased use fell from 254 to 196 DDDs/1000 OBDs/month (−23%; P < 0.01). These effects diminished over time. Antimicrobial costs decreased initially (−AUD 64,551/month; P < 0.01), then increased (+AUD 7,273/month; P < 0.01). HCA-CDI rates decreased post-intervention (−0.2 cases/10 000 OBDs/month; P < 0.01). Proportional LOS reductions for key infections (respiratory from 4.8 to 4.3 days, P < 0.01; septicaemia from 6.8 to 6.1 days, P < 0.01) were similar to background LOS reductions (2.1 to 1.9 days). Similarly, infection-related SMRs (observed/expected deaths) decreased (respiratory from 1.1 to 0.75; septicaemia from 1.25 to 0.8; background rate from 1.19 to 0.90). Conclusions: Implementation of a collaborative multisite ASP supported by a centrally deployed CDSS was associated with changes in targeted antimicrobial use, decreased antimicrobial costs, decreased HCA-CDI rates, and no observable increase in LOS or mortality. Ongoing targeted interventions are suggested to promote sustainability.
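
    The interrupted time series design above is commonly analysed with segmented regression: terms for the underlying trend, a level change at implementation, and a slope change afterwards. A minimal sketch on simulated monthly data (column names are assumptions):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Sketch of a segmented (interrupted time series) regression of the kind used
# above: level and slope change in monthly antimicrobial use (DDDs/1000 OBDs)
# after ASP implementation. The series is simulated, not the study data.

rng = np.random.default_rng(2)
months = np.arange(48)
intervention = (months >= 24).astype(int)          # ASP starts at month 24
ddd = 250 - 0.5 * months + 40 * intervention + rng.normal(0, 5, 48)

df = pd.DataFrame({
    "ddd": ddd,
    "time": months,                                 # underlying trend
    "post": intervention,                           # level change at start
    "time_since": np.clip(months - 24, 0, None),    # slope change after start
})
fit = smf.ols("ddd ~ time + post + time_since", data=df).fit()
print(fit.params)  # 'post' = level change; 'time_since' = post-intervention slope change
```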