26 research outputs found

    Antimicrobial resistance among migrants in Europe: a systematic review and meta-analysis

    BACKGROUND: Rates of antimicrobial resistance (AMR) are rising globally and there is concern that increased migration is contributing to the burden of antibiotic resistance in Europe. However, the effect of migration on the burden of AMR in Europe has not yet been comprehensively examined. Therefore, we did a systematic review and meta-analysis to identify and synthesise data for AMR carriage or infection in migrants to Europe to examine differences in patterns of AMR across migrant groups and in different settings. METHODS: For this systematic review and meta-analysis, we searched MEDLINE, Embase, PubMed, and Scopus with no language restrictions from Jan 1, 2000, to Jan 18, 2017, for primary data from observational studies reporting antibacterial resistance in common bacterial pathogens among migrants to 21 European Union-15 and European Economic Area countries. To be eligible for inclusion, studies had to report data on carriage or infection with laboratory-confirmed antibiotic-resistant organisms in migrant populations. We extracted data from eligible studies and assessed quality using piloted, standardised forms. We did not examine drug resistance in tuberculosis and excluded articles solely reporting on this parameter. We also excluded articles in which migrant status was determined by ethnicity, country of birth of participants' parents, or was not defined, and articles in which data were not disaggregated by migrant status. Outcomes were carriage of or infection with antibiotic-resistant organisms. We used random-effects models to calculate the pooled prevalence of each outcome. The study protocol is registered with PROSPERO, number CRD42016043681. FINDINGS: We identified 2274 articles, of which 23 observational studies reporting on antibiotic resistance in 2319 migrants were included. 
The pooled prevalence of any AMR carriage or AMR infection in migrants was 25·4% (95% CI 19·1-31·8; I2=98%), including meticillin-resistant Staphylococcus aureus (7·8%, 4·8-10·7; I2=92%) and antibiotic-resistant Gram-negative bacteria (27·2%, 17·6-36·8; I2=94%). The pooled prevalence of any AMR carriage or infection was higher in refugees and asylum seekers (33·0%, 18·3-47·6; I2=98%) than in other migrant groups (6·6%, 1·8-11·3; I2=92%). The pooled prevalence of antibiotic-resistant organisms was slightly higher in high-migrant community settings (33·1%, 11·1-55·1; I2=96%) than in migrants in hospitals (24·3%, 16·1-32·6; I2=98%). We did not find evidence of high rates of transmission of AMR from migrant to host populations. INTERPRETATION: Migrants are exposed to conditions favouring the emergence of drug resistance during transit and in host countries in Europe. Increased antibiotic resistance among refugees and asylum seekers and in high-migrant community settings (such as refugee camps and detention facilities) highlights the need for improved living conditions, access to health care, and initiatives to facilitate detection of and appropriate high-quality treatment for antibiotic-resistant infections during transit and in host countries. Protocols for the prevention and control of infection and for antibiotic surveillance need to be integrated in all aspects of health care, which should be accessible for all migrant groups, and should target determinants of AMR before, during, and after migration. FUNDING: UK National Institute for Health Research Imperial Biomedical Research Centre, Imperial College Healthcare Charity, the Wellcome Trust, and UK National Institute for Health Research Health Protection Research Unit in Healthcare-associated Infections and Antimicrobial Resistance at Imperial College London.
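The random-effects pooling and I² heterogeneity statistics reported above can be sketched with a minimal DerSimonian-Laird implementation. This is an illustration with made-up study counts, not the authors' analysis code, and it uses a simple normal approximation for the variance of each proportion:

```python
import math

def pooled_prevalence(events, totals):
    """DerSimonian-Laird random-effects pooling of proportions.

    Returns (pooled %, 95% CI lower %, 95% CI upper %, I^2 %).
    """
    # Per-study prevalence and variance (normal approximation)
    p = [e / n for e, n in zip(events, totals)]
    v = [max(pi * (1 - pi) / n, 1e-8) for pi, n in zip(p, totals)]

    # Fixed-effect weights and Cochran's Q
    w = [1 / vi for vi in v]
    p_fixed = sum(wi * pi for wi, pi in zip(w, p)) / sum(w)
    q = sum(wi * (pi - p_fixed) ** 2 for wi, pi in zip(w, p))
    df = len(p) - 1

    # Between-study variance tau^2 and the I^2 heterogeneity statistic
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0

    # Random-effects weights and pooled estimate
    w_re = [1 / (vi + tau2) for vi in v]
    pooled = sum(wi * pi for wi, pi in zip(w_re, p)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return (100 * pooled,
            100 * (pooled - 1.96 * se),
            100 * (pooled + 1.96 * se),
            i2)

# Hypothetical example: three studies with differing carriage rates
est, lo, hi, i2 = pooled_prevalence([30, 10, 50], [100, 120, 150])
```

In practice, meta-analyses of proportions often pool on a transformed scale (logit or double-arcsine) to stabilise variances; the raw-proportion version here keeps the arithmetic easy to follow.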

    Global economic burden of unmet surgical need for appendicitis

Background: There is a substantial gap in provision of adequate surgical care in many low- and middle-income countries. This study aimed to identify the economic burden of unmet surgical need for the common condition of appendicitis. Methods: Data on the incidence of appendicitis from 170 countries and two different approaches were used to estimate numbers of patients who do not receive surgery: as a fixed proportion of the total unmet surgical need per country (approach 1); and based on country income status (approach 2). Indirect costs with current levels of access and local quality, and those if quality were at the standards of high-income countries, were estimated. A human capital approach was applied, focusing on the economic burden resulting from premature death and absenteeism. Results: Excess mortality was 4185 per 100 000 cases of appendicitis using approach 1 and 3448 per 100 000 using approach 2. The economic burden of continuing current levels of access and local quality was US $92 492 million using approach 1 and US $73 141 million using approach 2. The economic burden of not providing surgical care to the standards of high-income countries was US $95 004 million using approach 1 and US $75 666 million using approach 2. The largest share of these costs resulted from premature death (97.7 per cent) and lack of access (97.0 per cent) in contrast to lack of quality. Conclusion: For a comparatively non-complex emergency condition such as appendicitis, increasing access to care should be prioritized. Although improving quality of care should not be neglected, increasing provision of care at current standards could reduce societal costs substantially.
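The human capital approach described in the Methods can be illustrated with a small sketch: losses from premature death are valued as discounted future earnings, and absenteeism as lost working days. Every parameter below (earnings, discount rate, absentee days, daily wage) is a hypothetical placeholder, not a figure from the study:

```python
def human_capital_burden(cases, excess_deaths_per_100k, annual_earnings,
                         years_lost, discount_rate, absentee_days, daily_wage):
    """Illustrative human-capital burden estimate.

    Sums (a) the present value of earnings lost to premature deaths and
    (b) wages lost to absenteeism among survivors. All inputs hypothetical.
    """
    deaths = cases * excess_deaths_per_100k / 100_000

    # Present value of future earnings lost per premature death
    pv_per_death = sum(annual_earnings / (1 + discount_rate) ** t
                       for t in range(1, years_lost + 1))
    death_cost = deaths * pv_per_death

    # Wages lost to time off work while ill
    absentee_cost = cases * absentee_days * daily_wage
    return death_cost + absentee_cost

# Hypothetical example: 100 000 cases, excess mortality as in approach 1
burden = human_capital_burden(100_000, 4185, 10_000, 30, 0.03, 10, 50)
```

Note how the structure matches the abstract's finding: with plausible inputs, the premature-death term dominates the total, because each death removes decades of discounted earnings while absenteeism costs only days.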

    Effects of hospital facilities on patient outcomes after cancer surgery: an international, prospective, observational study

Background Early death after cancer surgery is higher in low-income and middle-income countries (LMICs) than in high-income countries, yet the impact of facility characteristics on early postoperative outcomes is unknown. The aim of this study was to examine the association between hospital infrastructure, resource availability, and processes and early outcomes after cancer surgery worldwide. Methods A multimethods analysis was performed as part of the GlobalSurg 3 study, a multicentre, international, prospective cohort study of patients who had surgery for breast, colorectal, or gastric cancer. The primary outcomes were 30-day mortality and 30-day major complication rates. Potentially beneficial hospital facilities were identified by variable selection to select those associated with 30-day mortality. Adjusted outcomes were determined using generalised estimating equations to account for patient characteristics and country income group, with population stratification by hospital. Findings Between April 1, 2018, and April 23, 2019, facility-level data were collected for 9685 patients across 238 hospitals in 66 countries (91 hospitals in 20 high-income countries; 57 hospitals in 19 upper-middle-income countries; and 90 hospitals in 27 low-income to lower-middle-income countries). The availability of five hospital facilities was inversely associated with mortality: ultrasound, CT scanner, critical care unit, opioid analgesia, and oncologist. After adjustment for case-mix and country income group, hospitals with three or fewer of these facilities (62 hospitals, 1294 patients) had higher mortality compared with those with four or five (adjusted odds ratio [OR] 3.85 [95% CI 2.58-5.75]; p<0.0001), with excess mortality predominantly explained by a limited capacity to rescue following the development of major complications (63.0% vs 82.7%; OR 0.35 [0.23-0.53]; p<0.0001).
Across LMICs, improvements in hospital facilities would prevent one to three deaths for every 100 patients undergoing surgery for cancer. Interpretation Hospitals with higher levels of infrastructure and resources have better outcomes after cancer surgery, independent of country income. Without urgent strengthening of hospital infrastructure and resources, the reductions in cancer-associated mortality associated with improved access will not be realised.

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

Background The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) compared with that in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
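As a rough check on the headline comparison, a crude (unadjusted) odds ratio can be computed directly from the counts in the abstract. The study's OR of 0.17 is adjusted for patient and disease factors, so the crude value only approximates it; the sketch below uses the Woolf method for the confidence interval:

```python
import math

def odds_ratio(a_events, a_total, b_events, b_total):
    """Crude odds ratio of group A vs group B, with a Woolf 95% CI."""
    a_no = a_total - a_events
    b_no = b_total - b_events
    or_ = (a_events / a_no) / (b_events / b_no)
    # Woolf (log-scale) standard error from the four cell counts
    se = math.sqrt(1 / a_events + 1 / a_no + 1 / b_events + 1 / b_no)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Middle-HDI checklist use (753 of 1242) vs high-HDI (2455 of 2741):
# the crude OR lands near the adjusted 0.17 reported in the abstract.
or_mid, lo, hi = odds_ratio(753, 1242, 2455, 2741)
```

That the crude and adjusted estimates nearly coincide here suggests the HDI gradient in checklist use is not driven mainly by case mix.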

    Surgical site infection after gastrointestinal surgery in high-income, middle-income, and low-income countries: a prospective, international, multicentre cohort study

Background: Surgical site infection (SSI) is one of the most common infections associated with health care, but its importance as a global health priority is not fully understood. We quantified the burden of SSI after gastrointestinal surgery in countries in all parts of the world. Methods: This international, prospective, multicentre cohort study included consecutive patients undergoing elective or emergency gastrointestinal resection within 2-week time periods at any health-care facility in any country. Countries with participating centres were stratified into high-income, middle-income, and low-income groups according to the UN's Human Development Index (HDI). Data variables from the GlobalSurg 1 study and other studies that have been found to affect the likelihood of SSI were entered into risk adjustment models. The primary outcome measure was the 30-day SSI incidence (defined by US Centers for Disease Control and Prevention criteria for superficial and deep incisional SSI). Relationships with explanatory variables were examined using Bayesian multilevel logistic regression models. This trial is registered with ClinicalTrials.gov, number NCT02662231. Findings: Between Jan 4, 2016, and July 31, 2016, 13 265 records were submitted for analysis. 12 539 patients from 343 hospitals in 66 countries were included. 7339 (58·5%) patients were from high-HDI countries (193 hospitals in 30 countries), 3918 (31·2%) patients were from middle-HDI countries (82 hospitals in 18 countries), and 1282 (10·2%) patients were from low-HDI countries (68 hospitals in 18 countries). In total, 1538 (12·3%) patients had SSI within 30 days of surgery. The incidence of SSI varied between countries with high (691 [9·4%] of 7339 patients), middle (549 [14·0%] of 3918 patients), and low (298 [23·2%] of 1282 patients) HDI (p < 0·001).
The highest SSI incidence in each HDI group was after dirty surgery (102 [17·8%] of 574 patients in high-HDI countries; 74 [31·4%] of 236 patients in middle-HDI countries; 72 [39·8%] of 181 patients in low-HDI countries). Following risk factor adjustment, patients in low-HDI countries were at greatest risk of SSI (adjusted odds ratio 1·60, 95% credible interval 1·05–2·37; p=0·030). 132 (21·6%) of 610 patients with an SSI and a microbiology culture result had an infection that was resistant to the prophylactic antibiotic used. Resistant infections were detected in 49 (16·6%) of 295 patients in high-HDI countries, in 37 (19·8%) of 187 patients in middle-HDI countries, and in 46 (35·9%) of 128 patients in low-HDI countries (p < 0·001). Interpretation: Countries with a low HDI carry a disproportionately greater burden of SSI than countries with a middle or high HDI and might have higher rates of antibiotic resistance. In view of WHO recommendations on SSI prevention that highlight the absence of high-quality interventional research, urgent, pragmatic, randomised trials based in LMICs are needed to assess measures aiming to reduce this preventable complication

    Global variation in anastomosis and end colostomy formation following left-sided colorectal resection

    Background End colostomy rates following colorectal resection vary across institutions in high-income settings, being influenced by patient, disease, surgeon and system factors. This study aimed to assess global variation in end colostomy rates after left-sided colorectal resection. Methods This study comprised an analysis of GlobalSurg-1 and -2 international, prospective, observational cohort studies (2014, 2016), including consecutive adult patients undergoing elective or emergency left-sided colorectal resection within discrete 2-week windows. Countries were grouped into high-, middle- and low-income tertiles according to the United Nations Human Development Index (HDI). Factors associated with colostomy formation versus primary anastomosis were explored using a multilevel, multivariable logistic regression model. Results In total, 1635 patients from 242 hospitals in 57 countries undergoing left-sided colorectal resection were included: 113 (6·9 per cent) from low-HDI, 254 (15·5 per cent) from middle-HDI and 1268 (77·6 per cent) from high-HDI countries. There was a higher proportion of patients with perforated disease (57·5, 40·9 and 35·4 per cent; P < 0·001) and subsequent use of end colostomy (52·2, 24·8 and 18·9 per cent; P < 0·001) in low- compared with middle- and high-HDI settings. The association with colostomy use in low-HDI settings persisted (odds ratio (OR) 3·20, 95 per cent c.i. 1·35 to 7·57; P = 0·008) after risk adjustment for malignant disease (OR 2·34, 1·65 to 3·32; P < 0·001), emergency surgery (OR 4·08, 2·73 to 6·10; P < 0·001), time to operation at least 48 h (OR 1·99, 1·28 to 3·09; P = 0·002) and disease perforation (OR 4·00, 2·81 to 5·69; P < 0·001). Conclusion Global differences existed in the proportion of patients receiving end stomas after left-sided colorectal resection based on income, which went beyond case mix alone

    Increase maize productivity and water use efficiency through application of potassium silicate under water stress

Abstract In Egypt, water shortage has become a key limiting factor for agriculture. Water-deficit stress causes various morphological, physiological, and biochemical impacts on plants. Two field experiments were carried out at Etay El-Baroud Station, El-Beheira Governorate, Agriculture Research Center (ARC), Egypt, to evaluate the effect of potassium silicate (K-silicate) on maize productivity and water use efficiency (WUE). A split-plot design with four replications was used under three irrigation intervals during the 2017 and 2018 seasons. Irrigation intervals of 10, 15, and 20 days were allocated to the main plots, while three foliar application treatments of K-silicate (one spray at 40 days after sowing; two sprays at 40 and 60 days; three sprays at 40, 60, and 80 days) and a control (water spray) were distributed in the subplots. The results indicated that irrigation every 15 days gave the highest yield, yield components, and quality, whereas WUE was highest under irrigation every 20 days. Foliar spraying of K-silicate three times resulted in the highest yield. Even under water-deficit stress, irrigation every 15 days combined with foliar application of K-silicate three times achieved the highest values of grain yield and its components. These results show that K-silicate treatment can increase WUE and produce high grain yield with less irrigation water.
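The apparent tension in the results above (15-day intervals maximise yield, 20-day intervals maximise WUE) falls out of the definition of WUE as yield per unit of water applied: stretching the interval cuts water use faster than it cuts yield. A minimal sketch with hypothetical numbers, not figures from the study:

```python
def water_use_efficiency(grain_yield_kg_ha, water_applied_m3_ha):
    """Water use efficiency: kg of grain per cubic metre of water applied."""
    return grain_yield_kg_ha / water_applied_m3_ha

# Hypothetical illustration (not the study's data): the 20-day interval
# yields less but uses proportionally less water, so its WUE is higher.
wue_15 = water_use_efficiency(9000, 6000)   # 15-day interval
wue_20 = water_use_efficiency(7500, 4500)   # 20-day interval
```

The choice between the two regimes therefore depends on whether the farmer's binding constraint is land (maximise yield) or water (maximise WUE).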

    Soil Application of Nano Silica on Maize Yield and Its Insecticidal Activity Against Some Stored Insects After the Post-Harvest

Maize is considered one of the most important cereal crops worldwide. In this work, silica nanoparticles (SiO2-NPs) were prepared via the sol-gel technique. SiO2-NPs were obtained in powder form and fully characterised using advanced tools (UV-vis, HR-TEM, SEM, XRD and zeta potential). SiO2-NPs were then applied both as a nanofertilizer and as a pesticide against four common pests that infest stored maize and cause severe damage to crops. As a nanofertilizer, the response of a maize hybrid to mineral NPK, "Nitrogen (N), Phosphorus (P), and Potassium (K)" (0% = untreated, 50% of the recommended dose, and 100%), combined with different rates of SiO2-NPs (0, 2.5, 5, 10 g/kg soil), was evaluated. After harvest, grains were stored and fumigated with different concentrations of SiO2-NPs (0.0031, 0.0063, 0.25, 0.5, 1.0, 2.0, 2.5, 5, 10 g/kg) to determine the LC50 and mortality (%) of four common insects, namely Sitophilus oryzae, Rhizopertha dominica, Tribolium castaneum, and Oryzaephilus surinamensis. The results revealed that, at the 100% recommended dose, mineral NPK showed the greatest mean values of plant height, chlorophyll content, yield, yield components, and protein (%). Feeding the soil with SiO2-NPs up to 10 g/kg gave the best growth and yield enhancement of the maize crop. Mineral NPK interacted with SiO2-NPs: the application of mineral NPK at the 50% rate with 10 g/kg SiO2-NPs recorded the highest mean values of agronomic characters. Therefore, SiO2-NPs can be applied as a growth promoter and, at the same time, as a strong unconventional pesticide for crops during storage, at a very small and safe dose.
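LC50 estimation from dose-mortality data of the kind described above can be sketched by interpolating mortality against log dose. Bioassays of this type are usually analysed with probit or logit regression, so this linear-interpolation version is only an illustrative approximation, and the doses and mortalities below are made up, not taken from the study:

```python
import math

def lc50(doses, mortality_pct):
    """Estimate LC50 by linear interpolation of mortality vs log10(dose).

    `doses` must be sorted ascending and `mortality_pct` must increase
    with dose; raises if 50% mortality is not bracketed by the data.
    """
    logd = [math.log10(d) for d in doses]
    for i in range(len(doses) - 1):
        m0, m1 = mortality_pct[i], mortality_pct[i + 1]
        if m0 <= 50 <= m1:
            # Linear interpolation on the log-dose scale
            frac = (50 - m0) / (m1 - m0)
            return 10 ** (logd[i] + frac * (logd[i + 1] - logd[i]))
    raise ValueError("50% mortality not bracketed by the data")

# Hypothetical dose-mortality series (g/kg vs % mortality)
estimate = lc50([0.25, 0.5, 1.0, 2.0], [10, 30, 60, 90])
```

Interpolating on the log scale rather than the raw dose scale matters because dose-response curves are typically sigmoidal in log dose.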

    Table_1_Determination of morpho-physiological and yield traits of maize inbred lines (Zea mays L.) under optimal and drought stress conditions.DOCX

    Globally, climate change could hinder future food security that concurrently implies the importance of investigating drought stress and genotype screening under stressed environments. Hence, the current study was performed to screen 45 diverse maize inbred lines for 18 studied traits comprising phenological, physiological, morphological, and yield characters under optimum and water stress conditions for two successive growing seasons (2018 and 2019). The results showed that growing seasons and water regimes significantly influenced (p < 0.01) most of the studied traits, while inbred lines had a significant effect (p < 0.01) on all of the studied traits. The findings also showed a significant increase in all studied characters under normal conditions compared to drought conditions, except chlorophyll content, transpiration rate, and proline content which exhibited higher levels under water stress conditions. Furthermore, the results of the principal component analysis indicated a notable distinction between the performance of the 45 maize inbred lines under normal and drought conditions. In terms of grain yield, the drought tolerance index (DTI) showed that Nub60 (1.56), followed by Nub32 (1.46), Nub66 (1.45), and GZ603 (1.44) were the highest drought-tolerant inbred lines, whereas Nub46 (0.38) was the lowest drought-tolerant inbred line. These drought-tolerant inbred lines were able to maintain a relatively high grain yield under normal and stress conditions, whereas those drought-sensitive inbred lines showed a decline in grain yield when exposed to drought conditions. The hierarchical clustering analysis based on DTI classified the forty-five maize inbred lines and eighteen measured traits into three column- and row-clusters, as inbred lines in cluster-3 followed by those in cluster-2 exhibited greater drought tolerance in most of the studied traits. 
Utilizing the multi-trait stability index (MTSI) criterion in this study identified nine inbred lines, including GZ603, as stable genotypes in terms of the eighteen studied traits across four environments. The findings of the current investigation motivate plant breeders to explore the genetic potential of the current maize germplasm, especially in water-stressed environments.
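The abstract does not state the formula behind its drought tolerance index (DTI). One widely used formulation that produces values on the scale reported above is Fernandez's stress tolerance index, sketched here as an assumption rather than the authors' confirmed method:

```python
def drought_tolerance_index(yield_normal, yield_stress, mean_yield_normal):
    """Fernandez-style stress tolerance index (assumed formulation):
    DTI = (Yp * Ys) / Ybar_p^2, where Yp and Ys are a line's yields under
    normal and stress conditions and Ybar_p is the trial mean under
    normal conditions. Higher values indicate lines that combine high
    yield potential with yield stability under stress.
    """
    return (yield_normal * yield_stress) / mean_yield_normal ** 2

# Hypothetical line yielding 8 t/ha normally and 6 t/ha under stress,
# in a trial averaging 6 t/ha under normal conditions
dti = drought_tolerance_index(8.0, 6.0, 6.0)
```

Because the index multiplies the two yields, it rewards lines like Nub60 that stay productive in both environments, rather than lines that merely lose little in relative terms from a low base.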

    Table_3_Determination of morpho-physiological and yield traits of maize inbred lines (Zea mays L.) under optimal and drought stress conditions.XLSX

    Globally, climate change could hinder future food security that concurrently implies the importance of investigating drought stress and genotype screening under stressed environments. Hence, the current study was performed to screen 45 diverse maize inbred lines for 18 studied traits comprising phenological, physiological, morphological, and yield characters under optimum and water stress conditions for two successive growing seasons (2018 and 2019). The results showed that growing seasons and water regimes significantly influenced (p < 0.01) most of the studied traits, while inbred lines had a significant effect (p < 0.01) on all of the studied traits. The findings also showed a significant increase in all studied characters under normal conditions compared to drought conditions, except chlorophyll content, transpiration rate, and proline content which exhibited higher levels under water stress conditions. Furthermore, the results of the principal component analysis indicated a notable distinction between the performance of the 45 maize inbred lines under normal and drought conditions. In terms of grain yield, the drought tolerance index (DTI) showed that Nub60 (1.56), followed by Nub32 (1.46), Nub66 (1.45), and GZ603 (1.44) were the highest drought-tolerant inbred lines, whereas Nub46 (0.38) was the lowest drought-tolerant inbred line. These drought-tolerant inbred lines were able to maintain a relatively high grain yield under normal and stress conditions, whereas those drought-sensitive inbred lines showed a decline in grain yield when exposed to drought conditions. The hierarchical clustering analysis based on DTI classified the forty-five maize inbred lines and eighteen measured traits into three column- and row-clusters, as inbred lines in cluster-3 followed by those in cluster-2 exhibited greater drought tolerance in most of the studied traits. 
Utilizing the multi-trait stability index (MTSI) criterion in this study identified nine inbred lines, including GZ603, as stable genotypes in terms of the eighteen studied traits across four environments. The findings of the current investigation motivate plant breeders to explore the genetic potential of the current maize germplasm, especially in water-stressed environments.