13 research outputs found

    Antimicrobial resistance among migrants in Europe: a systematic review and meta-analysis

    BACKGROUND: Rates of antimicrobial resistance (AMR) are rising globally and there is concern that increased migration is contributing to the burden of antibiotic resistance in Europe. However, the effect of migration on the burden of AMR in Europe has not yet been comprehensively examined. Therefore, we did a systematic review and meta-analysis to identify and synthesise data for AMR carriage or infection in migrants to Europe to examine differences in patterns of AMR across migrant groups and in different settings. METHODS: For this systematic review and meta-analysis, we searched MEDLINE, Embase, PubMed, and Scopus with no language restrictions from Jan 1, 2000, to Jan 18, 2017, for primary data from observational studies reporting antibacterial resistance in common bacterial pathogens among migrants to 21 European Union-15 and European Economic Area countries. To be eligible for inclusion, studies had to report data on carriage or infection with laboratory-confirmed antibiotic-resistant organisms in migrant populations. We extracted data from eligible studies and assessed quality using piloted, standardised forms. We did not examine drug resistance in tuberculosis and excluded articles solely reporting on this parameter. We also excluded articles in which migrant status was determined by ethnicity, country of birth of participants' parents, or was not defined, and articles in which data were not disaggregated by migrant status. Outcomes were carriage of or infection with antibiotic-resistant organisms. We used random-effects models to calculate the pooled prevalence of each outcome. The study protocol is registered with PROSPERO, number CRD42016043681. FINDINGS: We identified 2274 articles, of which 23 observational studies reporting on antibiotic resistance in 2319 migrants were included. The pooled prevalence of any AMR carriage or AMR infection in migrants was 25·4% (95% CI 19·1-31·8; I²=98%), including meticillin-resistant Staphylococcus aureus (7·8%, 4·8-10·7; I²=92%) and antibiotic-resistant Gram-negative bacteria (27·2%, 17·6-36·8; I²=94%). The pooled prevalence of any AMR carriage or infection was higher in refugees and asylum seekers (33·0%, 18·3-47·6; I²=98%) than in other migrant groups (6·6%, 1·8-11·3; I²=92%). The pooled prevalence of antibiotic-resistant organisms was slightly higher in high-migrant community settings (33·1%, 11·1-55·1; I²=96%) than in migrants in hospitals (24·3%, 16·1-32·6; I²=98%). We did not find evidence of high rates of transmission of AMR from migrant to host populations. INTERPRETATION: Migrants are exposed to conditions favouring the emergence of drug resistance during transit and in host countries in Europe. Increased antibiotic resistance among refugees and asylum seekers and in high-migrant community settings (such as refugee camps and detention facilities) highlights the need for improved living conditions, access to health care, and initiatives to facilitate detection of and appropriate high-quality treatment for antibiotic-resistant infections during transit and in host countries. Protocols for the prevention and control of infection and for antibiotic surveillance need to be integrated in all aspects of health care, which should be accessible for all migrant groups, and should target determinants of AMR before, during, and after migration.
FUNDING: UK National Institute for Health Research Imperial Biomedical Research Centre, Imperial College Healthcare Charity, the Wellcome Trust, and UK National Institute for Health Research Health Protection Research Unit in Healthcare-associated Infections and Antimicrobial Resistance at Imperial College London.
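
    As a rough illustration of the random-effects pooling described above: the sketch below implements the classic DerSimonian-Laird estimator for a pooled prevalence with a 95% CI and the I² heterogeneity statistic. The abstract does not state which random-effects estimator the authors used, and the per-study counts here are invented for demonstration, so treat this as a minimal sketch of the general technique rather than a reproduction of the analysis.

```python
import numpy as np

def pooled_prevalence_dl(events, totals):
    """DerSimonian-Laird random-effects pooled prevalence.

    events/totals: per-study counts (assumes 0 < events < totals).
    Returns the pooled prevalence, its 95% CI, and I^2 heterogeneity (%).
    """
    events, totals = np.asarray(events, float), np.asarray(totals, float)
    p = events / totals                        # per-study prevalence
    v = p * (1 - p) / totals                   # within-study variance
    w = 1 / v                                  # fixed-effect weights
    p_fixed = np.sum(w * p) / np.sum(w)
    q = np.sum(w * (p - p_fixed) ** 2)         # Cochran's Q
    k = len(p)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)         # between-study variance
    w_star = 1 / (v + tau2)                    # random-effects weights
    pooled = np.sum(w_star * p) / np.sum(w_star)
    se = np.sqrt(1 / np.sum(w_star))
    i2 = max(0.0, (q - (k - 1)) / q) * 100 if q > 0 else 0.0
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), i2

# Hypothetical per-study counts, for illustration only
pooled, ci, i2 = pooled_prevalence_dl(events=[30, 12, 55], totals=[90, 100, 150])
print(f"pooled prevalence {pooled:.1%}, 95% CI {ci[0]:.1%}-{ci[1]:.1%}, I2 = {i2:.0f}%")
```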

    Surgical site infection after gastrointestinal surgery in high-income, middle-income, and low-income countries: a prospective, international, multicentre cohort study

    Background: Surgical site infection (SSI) is one of the most common infections associated with health care, but its importance as a global health priority is not fully understood. We quantified the burden of SSI after gastrointestinal surgery in countries in all parts of the world. Methods: This international, prospective, multicentre cohort study included consecutive patients undergoing elective or emergency gastrointestinal resection within 2-week time periods at any health-care facility in any country. Countries with participating centres were stratified into high-income, middle-income, and low-income groups according to the UN's Human Development Index (HDI). Data variables from the GlobalSurg 1 study and other studies that have been found to affect the likelihood of SSI were entered into risk adjustment models. The primary outcome measure was the 30-day SSI incidence (defined by US Centers for Disease Control and Prevention criteria for superficial and deep incisional SSI). Relationships with explanatory variables were examined using Bayesian multilevel logistic regression models. This trial is registered with ClinicalTrials.gov, number NCT02662231. Findings: Between Jan 4, 2016, and July 31, 2016, 13 265 records were submitted for analysis. 12 539 patients from 343 hospitals in 66 countries were included. 7339 (58·5%) patients were from high-HDI countries (193 hospitals in 30 countries), 3918 (31·2%) patients were from middle-HDI countries (82 hospitals in 18 countries), and 1282 (10·2%) patients were from low-HDI countries (68 hospitals in 18 countries). In total, 1538 (12·3%) patients had SSI within 30 days of surgery. The incidence of SSI varied between countries with high (691 [9·4%] of 7339 patients), middle (549 [14·0%] of 3918 patients), and low (298 [23·2%] of 1282 patients) HDI (p < 0·001). The highest SSI incidence in each HDI group was after dirty surgery (102 [17·8%] of 574 patients in high-HDI countries; 74 [31·4%] of 236 patients in middle-HDI countries; 72 [39·8%] of 181 patients in low-HDI countries). Following risk factor adjustment, patients in low-HDI countries were at greatest risk of SSI (adjusted odds ratio 1·60, 95% credible interval 1·05–2·37; p=0·030). 132 (21·6%) of 610 patients with an SSI and a microbiology culture result had an infection that was resistant to the prophylactic antibiotic used. Resistant infections were detected in 49 (16·6%) of 295 patients in high-HDI countries, in 37 (19·8%) of 187 patients in middle-HDI countries, and in 46 (35·9%) of 128 patients in low-HDI countries (p < 0·001). Interpretation: Countries with a low HDI carry a disproportionately greater burden of SSI than countries with a middle or high HDI and might have higher rates of antibiotic resistance. In view of WHO recommendations on SSI prevention that highlight the absence of high-quality interventional research, urgent, pragmatic, randomised trials based in LMICs are needed to assess measures aiming to reduce this preventable complication.
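
    The adjusted odds ratio above comes from Bayesian multilevel models that cannot be reproduced from the abstract alone, but the crude low- vs high-HDI contrast can be checked directly from the reported counts. A minimal sketch using only the abstract's 2x2 counts (low-HDI 298/1282 vs high-HDI 691/7339) and a standard Wald interval on the log odds ratio:

```python
from math import exp, log, sqrt

def crude_or(events_a, n_a, events_b, n_b):
    """Crude odds ratio (group A vs group B) with a 95% Wald CI."""
    a, b = events_a, n_a - events_a   # group A: events, non-events
    c, d = events_b, n_b - events_b   # group B: events, non-events
    or_ = (a / b) / (c / d)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    return or_, exp(log(or_) - 1.96 * se), exp(log(or_) + 1.96 * se)

# Counts reported in the abstract: SSI in low- vs high-HDI countries
print("SSI incidence, low HDI: %.1f%%" % (100 * 298 / 1282))    # 23.2%
print("SSI incidence, high HDI: %.1f%%" % (100 * 691 / 7339))   # 9.4%
print("crude OR (low vs high): %.2f (95%% CI %.2f-%.2f)" % crude_or(298, 1282, 691, 7339))
```

    The crude odds ratio (about 2·9) is far larger than the adjusted 1·60, which shows how much of the unadjusted gap is absorbed by the patient- and procedure-level covariates in the risk-adjustment models.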

    Longitudinal Shedding Patterns and Characterization of Antibiotic Resistant E. coli in Pastured Goats Using a Cohort Study

    There is a scarcity of information on antibiotic resistance in goats. To understand shedding of resistant Escherichia coli in pastured goats, we collected fecal samples from a mixed-age cohort over a one-year period. No antibiotic had been used on the study animals one year prior to or during the study period. Resistant isolates were detected in all age groups, and prevalence in goat kids was significantly higher than in adults (43–48% vs. 8–25%, respectively). The proportion of resistant isolates was higher when animals were congregated near the handling facility than when on pasture. Most isolates were resistant to tetracycline (51%) and streptomycin (30%), but some were also resistant to antibiotics that had never been used on the farm, such as ampicillin (19%). The tetB and bla-TEM genes were detected in 70% of tetracycline-resistant and 43% of ampicillin-resistant isolates, respectively, and the aadA and strpA/strpB genes in 44% and 24% of streptomycin-resistant isolates. Resistant isolates also harbored virulence genes, and some belonged to the D and B2 phylogenetic groups. Thus, pastured goats, despite minimal exposure to antibiotics, are reservoirs of resistant E. coli that may contaminate the environment and the food chain and spread resistance genes to pathogenic bacteria; some of these isolates are themselves potential animal and human pathogens. Environmental sources may play a role in the acquisition of resistant bacteria by pastured goats.

    Shiga Toxin Subtypes, Serogroups, Phylogroups, RAPD Genotypic Diversity, and Select Virulence Markers of Shiga-Toxigenic Escherichia coli Strains from Goats in Mid-Atlantic US

    Understanding Shiga toxin subtypes in E. coli from reservoir hosts may give insight into their significance as human pathogens. The data also serve as an epidemiological tool for source tracking. We characterized Shiga toxin subtypes in 491 Shiga-toxigenic E. coli (STEC) isolates from goats in the mid-Atlantic US region (stx1 = 278, stx2 = 213, and stx1/stx2 = 95). Their serogroups, phylogroups, M13-RAPD genotypes, and eae (intimin) and hly (hemolysin) genes were also evaluated. STEC positive for stx1 harbored stx1c (79%), stx1a (21%), and stx1a/stx1c (4%). Those positive for stx2 harbored stx2a (55%) and stx2b (32%), while stx2a/stx2d and stx2a/stx2b were each found in 2%. Among the 343 STEC that were serogrouped, 46% (n = 158) belonged to O8, 20% (n = 67) to O76, 12% (n = 42) to O91, 5% (n = 17) to O5, and 5% (n = 18) to O26. Less than 5% belonged to O78, O87, O146, and O103. The hly and eae genes were detected in 48% and 14% of STEC, respectively. Most belonged to phylogroup B1 (73%), followed by D (10%), E (8%), A (4%), B2 (4%), and F (1%). M13-RAPD genotyping revealed clonality of the O91, O5, O87, O103, and O78 serogroups but higher diversity in the O8, O76, and O26 serogroups. These results indicate that goat STEC belonged to important non-O157 STEC serogroups, were genomically diverse, and harbored Shiga toxin subtypes associated with severe human disease.

    Influence of prior pH and thermal stresses on thermal tolerance of foodborne pathogens

    Improper food processing is one of the major causes of foodborne illness. Accurate prediction of the thermal destruction rate of foodborne pathogens is therefore vital to ensure proper processing and food safety. When bacteria are subjected to pH and thermal stresses during growth, sublethal injury can occur that may alter their subsequent tolerance to thermal treatment. As a preliminary study to test this concept, the current study evaluated the effect of prior pH and thermal stresses on the thermal tolerance of Salmonella and Staphylococcus grown in tryptic soy broth supplemented with yeast extract. Bacteria incubated at three pH values (6.0, 7.4, and 9.0) and four temperatures (15, 25, 35, and 45°C) for 24 hr were subjected to thermal treatments at 55, 60, and 65°C. At the end of each treatment time, bacterial suspensions were surface-plated on standard methods agar to quantify bacterial survival and to calculate the decimal reduction time (D-value) and thermal destruction temperature (z-value). The effect of pH stress alone during incubation on the thermal tolerance of both bacteria was generally insignificant. D-values increased with increasing thermal stress (incubation temperature). Bacteria incubated at 35°C had the highest z-values, that is, they required the largest temperature increase to achieve a 90% reduction in D-value. Staphylococcus mostly displayed higher tolerance to thermal treatment than Salmonella. Although further research is needed to validate the current findings on food matrices, the findings of this study clearly affirm that adaptation of bacteria to certain stresses may reduce the effectiveness of preservation procedures applied during later stages of food processing and storage.
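
    To make the D-value concrete: it is the heating time at a fixed temperature that produces a one-log10 (90%) reduction in viable counts, estimated from the slope of a log-linear survivor curve. A minimal sketch with invented survivor counts (not data from this study):

```python
import numpy as np

def d_value(times_min, log10_counts):
    """D-value (min): time for a 1-log10 reduction in viable counts,
    from the slope of log10(CFU/ml) vs. heating time."""
    slope, _intercept = np.polyfit(times_min, log10_counts, 1)
    return -1.0 / slope

# Hypothetical survivor curve at 60 degC, for illustration only
times = [0, 2, 4, 6, 8]             # heating time, min
logs = [7.0, 6.1, 5.2, 4.3, 3.4]    # log10 CFU/ml
print(f"D60 = {d_value(times, logs):.2f} min")   # ~2.2 min here
```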

    Influence of growth temperature on thermal tolerance of leading foodborne pathogens

    Accurate prediction of the thermal destruction rate of foodborne pathogens is important for food processors to ensure proper food safety. When bacteria are subjected to thermal stress during storage, sublethal injury and/or thermal acclimation may lead to differences in their subsequent tolerance to thermal treatment. The aim of the current study was to evaluate the thermal tolerance of Escherichia coli O157:H7, Listeria monocytogenes, Salmonella enterica, and Staphylococcus aureus after overnight growth in tryptic soy broth at four temperatures (15, 25, 35, and 45°C). Following incubation, the bacteria were subjected to thermal treatments at 55, 60, and 65°C. At the end of each treatment time, bacterial survival was quantified and used to calculate the decimal reduction time (D-value) and thermal destruction temperature (z-value) for each bacterium, using linear models of thermal treatment time (min) vs. microbial population (log10 CFU/ml) and of thermal treatment temperature (°C) vs. D-value, respectively. Among the four bacterial species, E. coli generally had longer D-values and lower z-values than the other bacteria. D- and z-values of Listeria increased with the increment of incubation temperature from 15 to 45°C. The highest z-values at incubation temperatures of 15, 25, 35, and 45°C were those of Staphylococcus (6.19°C), Salmonella (6.73°C), and Listeria (7.10°C and 7.26°C), respectively. Although further research is needed to validate the findings on food matrices, the findings of this study clearly affirm that adaptation of bacteria to certain stresses may reduce the effectiveness of preservation hurdles applied during later stages of food processing and storage.
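
    The z-value is then obtained exactly as the abstract describes: regress log10(D-value) on treatment temperature and take the negative reciprocal of the slope, giving the temperature rise needed for a ten-fold drop in D-value. A minimal sketch with illustrative D-values (not data from this study), reusing the same kind of linear fit as above:

```python
import numpy as np

def z_value(temps_c, d_values_min):
    """z-value (degC): temperature increase that reduces the D-value
    ten-fold, from the slope of log10(D) vs. treatment temperature."""
    slope, _intercept = np.polyfit(temps_c, np.log10(d_values_min), 1)
    return -1.0 / slope

# Hypothetical D-values at the study's three treatment temperatures
temps = [55, 60, 65]          # degC
d_vals = [12.0, 2.2, 0.4]     # min, illustrative only
print(f"z = {z_value(temps, d_vals):.1f} degC")   # ~6.8 degC here
```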

    Examining Antimicrobial Resistance in Escherichia coli: A Case Study in Central Virginia’s Environment

    While environmental factors may contribute to antimicrobial resistance (AMR) in bacteria, many aspects of environmental antibiotic pollution and resistance remain unknown. The level of AMR in Escherichia coli is considered a reliable indicator of the selection pressure exerted by antimicrobial use in the environment. This study aimed to assess AMR variance in E. coli isolated from diverse environmental samples, such as animal feces and water from wastewater treatment plants (WWTPs) and drainage areas of different land use systems in Central Virginia. In total, 450 E. coli isolates obtained between August 2020 and February 2021 were subjected to susceptibility testing against 12 antimicrobial agents approved for clinical use by the U.S. Food and Drug Administration. Of the tested isolates, 87.8% were resistant to at least one antimicrobial agent, and 3.1% showed multi-drug resistance. Streptomycin resistance was the most common (73.1%), while susceptibility to chloramphenicol was the highest (97.6%). One isolate obtained from a WWTP exhibited resistance to seven antimicrobials. AMR prevalence was highest in WWTP isolates, followed by isolates from drainage areas, wild avians, and livestock. Among livestock, horses had the highest AMR prevalence, while cattle had the lowest. No significant AMR difference was found across land use systems. This study identifies potential AMR hotspots, emphasizing the environmental risk posed by antimicrobial-resistant E. coli. The findings will aid policymakers and researchers by highlighting knowledge gaps in AMR–environment links, and this nationally relevant research offers a scalable model for understanding the ecology of AMR in E. coli. Further large-scale research is crucial to confirm the environmental impacts on AMR prevalence in bacteria.

    Evaluation of the point-of-care Becton Dickinson Veritor™ Rapid influenza diagnostic test in Kenya, 2013–2014

    Background: We evaluated the performance of the Becton Dickinson Veritor™ System Flu A + B rapid influenza diagnostic test (RIDT) to detect influenza viruses in respiratory specimens from patients enrolled at five surveillance sites in Kenya, a tropical country where influenza seasonality is variable. Methods: Nasal swab (NS) and nasopharyngeal (NP)/oropharyngeal (OP) swabs were collected from patients with influenza-like illness and/or severe acute respiratory infection. The sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) of the RIDT using NS specimens were evaluated against nasal swabs tested by real-time reverse transcription polymerase chain reaction (rRT-PCR). The performance parameters were reported with 95% confidence intervals (CI) calculated using binomial exact methods, with P < 0.05 considered significant. Two-sample Z tests were used to test for differences in sample proportions. Analysis was performed using SAS software version 9.3. Results: From July 2013 to July 2014, 3,569 patients were recruited, of whom 78.7% were aged <5 years. Overall, 14.4% of NS specimens were influenza-positive by RIDT. Compared to rRT-PCR on NS specimens, overall RIDT sensitivity was 77.1% (95% CI 72.8–81.0%) and specificity was 94.9% (95% CI 94.0–95.7%). RIDT sensitivity for influenza A virus was 71.8% (95% CI 66.7–76.4%), significantly higher than for influenza B virus, which was 43.8% (95% CI 33.8–54.2%). PPV ranged from 30% to 80% depending on the background prevalence of influenza. Conclusion: Although the variable seasonality of influenza in tropical Africa presents unique challenges, RIDTs may have a role in making influenza surveillance sustainable in more remote areas of Africa, where laboratory capacity is limited.
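
    For context on how the performance parameters above are computed: sensitivity, specificity, PPV, and NPV are proportions from the 2x2 table of RIDT results against the rRT-PCR reference, and the binomial exact (Clopper-Pearson) intervals the authors cite come from the beta distribution. The sketch below uses hypothetical 2x2 counts chosen for illustration, not the study's actual table:

```python
from scipy.stats import beta

def exact_ci(successes, n, alpha=0.05):
    """Clopper-Pearson (binomial exact) CI for a proportion."""
    lo = beta.ppf(alpha / 2, successes, n - successes + 1) if successes > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, successes + 1, n - successes) if successes < n else 1.0
    return lo, hi

def diagnostic_performance(tp, fp, fn, tn):
    """Sensitivity/specificity/PPV/NPV of an index test vs. a reference."""
    results = {}
    for name, x, n in [("sensitivity", tp, tp + fn), ("specificity", tn, tn + fp),
                       ("PPV", tp, tp + fp), ("NPV", tn, tn + fn)]:
        lo, hi = exact_ci(x, n)
        results[name] = (x / n, lo, hi)
    return results

# Hypothetical RIDT-vs-rRT-PCR counts, for illustration only
for name, (est, lo, hi) in diagnostic_performance(tp=370, fp=45, fn=110, tn=840).items():
    print(f"{name}: {est:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```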