
    Electrification of Sub-Saharan Africa through PV/hybrid mini-grids: Reducing the gap between current business models and on-site experience

    The absence of publicly available, up-to-date cost breakdown data on photovoltaic (PV)/hybrid mini-grids in Sub-Saharan Africa (SSA) is a barrier that must be resolved to overcome challenges in rural electrification planning, regulation, life-cycle operation, financing, and funding. The primary aim of this research is to provide a better understanding of the cost structures of PV/hybrid mini-grid projects in SSA. A review of the existing literature reveals a significant lack of transparency and consistency in PV/hybrid mini-grid costs. This paper supports the view that a strong need remains to reduce the gap between current business model concepts and successfully implemented scale-up electrification models. Based on the experience of PV/hybrid mini-grid projects implemented in various rural communities of SSA, we propose a multi-dimensional cost analysis with a standardised breakdown of the real costs of installed projects. Subsequently, we assess the main social and environmental implications and identify barriers that appear to hinder successful PV mini-grid planning and subsequent implementation in SSA. Africa has a unique opportunity to use renewable energy as a primary energy source; in particular, the continent has the potential to bring electricity to its rural population by means of PV/hybrid mini-grids. However, the capability of public and private sector investors to pre-evaluate projects is limited by the lack of locally available information on PV/hybrid mini-grid costs and by the unreliability of such data where it is available. The multi-dimensional cost analysis of social and environmental impacts in this study highlights that PV/hybrid mini-grids offer a unique opportunity to create a standardised framework for quantifying their costs in SSA, one that can support decision-making processes for designing viable business models. Findings show a strong need to minimise the data quality gap between current business models and those of successfully implemented PV/hybrid mini-grid electrification projects. This gap could be mitigated by studying the issues that influence mini-grid costs (both hardware and software), and by understanding other factors that can influence project costs, such as market maturity, remoteness of the site, organisational capability, development approach, and level of community involvement. Regarding policy, stronger political will coupled with proactive rural electrification strategies and a targeted renewable energy regulatory framework would be essential to establish a viable, dynamic domestic market for off-grid renewables. In the presented benchmarking analysis, the experiences of public and private development organisations are combined to facilitate the assessment to the furthest extent possible; this includes disaggregating component costs according to their unit, to make comparisons more accurate, and including site-specific parameters in the discussion of costs.
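
    As an illustration of the standardised breakdown the paper argues for, the sketch below disaggregates component costs by their natural unit and computes a simple levelised cost of electricity (LCOE). All component names and figures are hypothetical placeholders; this is a minimal sketch of the benchmarking idea, not the authors' methodology.

```python
# Illustrative sketch: unit-normalised mini-grid cost breakdown plus a
# simple LCOE. All component names and figures below are hypothetical.

# Hypothetical cost breakdown: (total cost in USD, quantity, unit).
components = {
    "pv_modules":   (60_000, 50.0, "kWp"),
    "battery_bank": (45_000, 180.0, "kWh"),
    "inverters":    (15_000, 40.0, "kW"),
    "distribution": (30_000, 5.0, "km"),
}

# Unit costs make sites of different sizes comparable.
for name, (cost, qty, unit) in components.items():
    print(f"{name}: {cost / qty:,.0f} USD per {unit}")

def lcoe(capex, annual_opex, annual_kwh, years=20, rate=0.08):
    """Levelised cost of electricity: discounted lifetime costs
    divided by discounted lifetime energy delivered."""
    disc = [(1 + rate) ** -t for t in range(1, years + 1)]
    costs = capex + sum(annual_opex * d for d in disc)
    energy = sum(annual_kwh * d for d in disc)
    return costs / energy

capex = sum(cost for cost, _, _ in components.values())
print(f"LCOE: {lcoe(capex, annual_opex=8_000, annual_kwh=90_000):.2f} USD/kWh")
```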

    Antimicrobial resistance among migrants in Europe: a systematic review and meta-analysis

    BACKGROUND: Rates of antimicrobial resistance (AMR) are rising globally and there is concern that increased migration is contributing to the burden of antibiotic resistance in Europe. However, the effect of migration on the burden of AMR in Europe has not yet been comprehensively examined. Therefore, we did a systematic review and meta-analysis to identify and synthesise data for AMR carriage or infection in migrants to Europe to examine differences in patterns of AMR across migrant groups and in different settings. METHODS: For this systematic review and meta-analysis, we searched MEDLINE, Embase, PubMed, and Scopus with no language restrictions from Jan 1, 2000, to Jan 18, 2017, for primary data from observational studies reporting antibacterial resistance in common bacterial pathogens among migrants to 21 European Union-15 and European Economic Area countries. To be eligible for inclusion, studies had to report data on carriage or infection with laboratory-confirmed antibiotic-resistant organisms in migrant populations. We extracted data from eligible studies and assessed quality using piloted, standardised forms. We did not examine drug resistance in tuberculosis and excluded articles solely reporting on this parameter. We also excluded articles in which migrant status was determined by ethnicity, country of birth of participants' parents, or was not defined, and articles in which data were not disaggregated by migrant status. Outcomes were carriage of or infection with antibiotic-resistant organisms. We used random-effects models to calculate the pooled prevalence of each outcome. The study protocol is registered with PROSPERO, number CRD42016043681. FINDINGS: We identified 2274 articles, of which 23 observational studies reporting on antibiotic resistance in 2319 migrants were included. The pooled prevalence of any AMR carriage or AMR infection in migrants was 25·4% (95% CI 19·1-31·8; I²=98%), including meticillin-resistant Staphylococcus aureus (7·8%, 4·8-10·7; I²=92%) and antibiotic-resistant Gram-negative bacteria (27·2%, 17·6-36·8; I²=94%). The pooled prevalence of any AMR carriage or infection was higher in refugees and asylum seekers (33·0%, 18·3-47·6; I²=98%) than in other migrant groups (6·6%, 1·8-11·3; I²=92%). The pooled prevalence of antibiotic-resistant organisms was slightly higher in high-migrant community settings (33·1%, 11·1-55·1; I²=96%) than in migrants in hospitals (24·3%, 16·1-32·6; I²=98%). We did not find evidence of high rates of transmission of AMR from migrant to host populations. INTERPRETATION: Migrants are exposed to conditions favouring the emergence of drug resistance during transit and in host countries in Europe. Increased antibiotic resistance among refugees and asylum seekers and in high-migrant community settings (such as refugee camps and detention facilities) highlights the need for improved living conditions, access to health care, and initiatives to facilitate detection of and appropriate high-quality treatment for antibiotic-resistant infections during transit and in host countries. Protocols for the prevention and control of infection and for antibiotic surveillance need to be integrated in all aspects of health care, which should be accessible for all migrant groups, and should target determinants of AMR before, during, and after migration.
FUNDING: UK National Institute for Health Research Imperial Biomedical Research Centre, Imperial College Healthcare Charity, the Wellcome Trust, and UK National Institute for Health Research Health Protection Research Unit in Healthcare-associated Infections and Antimicrobial Resistance at Imperial College London.
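
    The pooled prevalences above come from random-effects models. A minimal sketch of one common estimator for such pooling, the DerSimonian-Laird method, is shown below with invented study counts; the published analysis may have transformed the proportions or used different software, so treat this only as an illustration of the technique.

```python
import numpy as np

def pooled_prevalence_dl(events, totals):
    """DerSimonian-Laird random-effects pooled prevalence.

    Textbook sketch: real meta-analyses usually transform the
    proportions first (logit or double-arcsine) to stabilise
    variances before pooling."""
    p = events / totals
    v = p * (1 - p) / totals                 # within-study variance
    w = 1 / v                                # fixed-effect weights
    p_fixed = np.sum(w * p) / np.sum(w)
    q = np.sum(w * (p - p_fixed) ** 2)       # Cochran's Q
    df = len(p) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)            # between-study variance
    w_re = 1 / (v + tau2)                    # random-effects weights
    p_re = np.sum(w_re * p) / np.sum(w_re)
    se = np.sqrt(1 / np.sum(w_re))
    i2 = 100 * max(0.0, (q - df) / q) if q > 0 else 0.0
    return p_re, p_re - 1.96 * se, p_re + 1.96 * se, i2

# Invented study data: resistant-organism carriage events / migrants screened.
events = np.array([12, 30, 8, 55, 20])
totals = np.array([60, 100, 45, 150, 90])
est, lo, hi, i2 = pooled_prevalence_dl(events, totals)
print(f"pooled prevalence {est:.1%} (95% CI {lo:.1%} to {hi:.1%}), I2 = {i2:.0f}%")
```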

    Global economic burden of unmet surgical need for appendicitis

    Background: There is a substantial gap in provision of adequate surgical care in many low- and middle-income countries. This study aimed to identify the economic burden of unmet surgical need for the common condition of appendicitis. Methods: Data on the incidence of appendicitis from 170 countries and two different approaches were used to estimate numbers of patients who do not receive surgery: as a fixed proportion of the total unmet surgical need per country (approach 1); and based on country income status (approach 2). Indirect costs with current levels of access and local quality, and those if quality were at the standards of high-income countries, were estimated. A human capital approach was applied, focusing on the economic burden resulting from premature death and absenteeism. Results: Excess mortality was 4185 per 100 000 cases of appendicitis using approach 1 and 3448 per 100 000 using approach 2. The economic burden of continuing current levels of access and local quality was US$92 492 million using approach 1 and US$73 141 million using approach 2. The economic burden of not providing surgical care to the standards of high-income countries was US$95 004 million using approach 1 and US$75 666 million using approach 2. The largest share of these costs resulted from premature death (97.7 per cent) and lack of access (97.0 per cent), in contrast to lack of quality. Conclusion: For a comparatively non-complex emergency condition such as appendicitis, increasing access to care should be prioritized. Although improving quality of care should not be neglected, increasing provision of care at current standards could reduce societal costs substantially.
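
    A rough illustration of the human capital approach described in the Methods: the only input below taken from the abstract is the approach 1 excess-mortality rate; the case count, productive years lost, annual output, and discount rate are hypothetical placeholders, so the resulting figure is illustrative only.

```python
# Toy human-capital calculation. Only the excess-mortality rate comes
# from the abstract; every other input is a hypothetical placeholder.

excess_mortality_per_100k = 4185  # approach 1, per 100 000 appendicitis cases
cases = 500_000                   # hypothetical annual cases without surgery
years_lost_per_death = 30         # hypothetical average productive years lost
annual_output_usd = 5_000         # hypothetical output per worker-year
discount_rate = 0.03

deaths = excess_mortality_per_100k / 100_000 * cases

# Present value of one person's stream of lost annual output.
pv_per_death = sum(annual_output_usd / (1 + discount_rate) ** t
                   for t in range(1, years_lost_per_death + 1))

print(f"excess deaths: {deaths:,.0f}")
print(f"burden from premature death: {deaths * pv_per_death / 1e6:,.0f} million USD")
```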

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89·6 per cent) than in countries with a middle (753 of 1242, 60·6 per cent; odds ratio (OR) 0·17, 95 per cent c.i. 0·14 to 0·21, P < 0·001) or low (363 of 860, 42·2 per cent; OR 0·08, 0·07 to 0·10, P < 0·001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference -9·4 (95 per cent c.i. -11·9 to -6·9) per cent; P < 0·001), but the relationship was reversed in low-HDI countries (+12·1 (+7·0 to +17·3) per cent; P < 0·001). In multivariable models, checklist use was associated with lower 30-day perioperative mortality (OR 0·60, 0·50 to 0·73; P < 0·001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
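
    To make the modelling strategy concrete, here is a minimal sketch of a multivariable logistic regression with a bootstrapped confidence interval for the checklist odds ratio. The data are synthetic and the coefficients invented; this does not reproduce the study's adjustment set or its simulation design.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2_000

# Synthetic stand-in data: checklist use, one crude severity covariate,
# and 30-day mortality. All coefficients below are invented.
checklist = rng.integers(0, 2, n)
severity = rng.normal(0.0, 1.0, n)
logit = -3.0 - 0.5 * checklist + 1.0 * severity
death = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

X = sm.add_constant(np.column_stack([checklist, severity]))
fit = sm.Logit(death, X).fit(disp=0)
print(f"adjusted OR for checklist use: {np.exp(fit.params[1]):.2f}")

# Nonparametric bootstrap for a 95% CI around the odds ratio.
ors = []
for _ in range(500):
    idx = rng.integers(0, n, n)              # resample rows with replacement
    b = sm.Logit(death[idx], X[idx]).fit(disp=0).params[1]
    ors.append(np.exp(b))
lo, hi = np.percentile(ors, [2.5, 97.5])
print(f"bootstrap 95% CI: {lo:.2f} to {hi:.2f}")
```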

    A Mixed Methods Study of a Health Worker Training Intervention to Increase Syndromic Referral for Gambiense Human African Trypanosomiasis in South Sudan

    BACKGROUND: Active screening by mobile teams is considered the most effective method for detecting gambiense-type human African trypanosomiasis (HAT), but constrained funding in many post-conflict countries limits this approach. Non-specialist health care workers (HCWs) in peripheral health facilities could be trained to identify potential cases for testing based on symptoms. We tested a training intervention for HCWs in peripheral facilities in Nimule, South Sudan to increase knowledge of HAT symptomatology and the rate of syndromic referrals to a central screening and treatment centre. METHODOLOGY/PRINCIPAL FINDINGS: We trained 108 HCWs from 61/74 of the public, private and military peripheral health facilities in the county during six one-day workshops and assessed behaviour change using quantitative and qualitative methods. In the four months prior to training, only 2/562 people passively screened for HAT had been referred by a peripheral HCW (0 cases detected), compared with 13/352 (2 cases detected) in the four months after, a 6.5-fold increase in referrals observed by the hospital. Modest increases in absolute referrals received, however, concealed higher levels of referral activity in the periphery. HCWs in 71.4% of facilities followed up had made referrals, incorporating new and pre-existing ideas about HAT case detection into referral practice. HCW knowledge scores of HAT symptoms improved across all demographic sub-groups. Of 71 HAT referrals made, two-thirds were from new referrers. Only 11 patients completed the referral, largely because of difficulties patients in remote areas faced accessing transportation. CONCLUSIONS/SIGNIFICANCE: The training increased knowledge, and this led to more widespread appropriate HAT referrals from a low base. Many referrals were not completed, however. Increasing access to screening and/or diagnostic tests in the periphery will be needed for greater impact on case detection in this context. These data suggest it may be possible for peripheral HCWs to target the use of rapid diagnostic tests for HAT.
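
    A quick arithmetic check of these figures: the 6.5-fold change is the ratio of referral counts (13 versus 2), while the proportion of passively screened people who had been referred rose roughly tenfold.

```python
# Worked check of the referral figures reported in the abstract.
before_referred, before_screened = 2, 562
after_referred,  after_screened  = 13, 352

rate_before = before_referred / before_screened   # ~0.36%
rate_after  = after_referred / after_screened     # ~3.69%

print(f"referral rate before: {rate_before:.2%}")
print(f"referral rate after:  {rate_after:.2%}")
print(f"fold change in referral counts: {after_referred / before_referred:.1f}")  # 6.5
print(f"fold change in referral rate:   {rate_after / rate_before:.1f}")          # ~10.4
```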

    Surgical site infection after gastrointestinal surgery in high-income, middle-income, and low-income countries: a prospective, international, multicentre cohort study

    Background: Surgical site infection (SSI) is one of the most common infections associated with health care, but its importance as a global health priority is not fully understood. We quantified the burden of SSI after gastrointestinal surgery in countries in all parts of the world. Methods: This international, prospective, multicentre cohort study included consecutive patients undergoing elective or emergency gastrointestinal resection within 2-week time periods at any health-care facility in any country. Countries with participating centres were stratified into high-income, middle-income, and low-income groups according to the UN's Human Development Index (HDI). Data variables from the GlobalSurg 1 study and other studies that have been found to affect the likelihood of SSI were entered into risk adjustment models. The primary outcome measure was the 30-day SSI incidence (defined by US Centers for Disease Control and Prevention criteria for superficial and deep incisional SSI). Relationships with explanatory variables were examined using Bayesian multilevel logistic regression models. This trial is registered with ClinicalTrials.gov, number NCT02662231. Findings: Between Jan 4, 2016, and July 31, 2016, 13 265 records were submitted for analysis. 12 539 patients from 343 hospitals in 66 countries were included. 7339 (58·5%) patients were from high-HDI countries (193 hospitals in 30 countries), 3918 (31·2%) patients were from middle-HDI countries (82 hospitals in 18 countries), and 1282 (10·2%) patients were from low-HDI countries (68 hospitals in 18 countries). In total, 1538 (12·3%) patients had SSI within 30 days of surgery. The incidence of SSI varied between countries with high (691 [9·4%] of 7339 patients), middle (549 [14·0%] of 3918 patients), and low (298 [23·2%] of 1282 patients) HDI (p < 0·001). The highest SSI incidence in each HDI group was after dirty surgery (102 [17·8%] of 574 patients in high-HDI countries; 74 [31·4%] of 236 patients in middle-HDI countries; 72 [39·8%] of 181 patients in low-HDI countries). Following risk factor adjustment, patients in low-HDI countries were at greatest risk of SSI (adjusted odds ratio 1·60, 95% credible interval 1·05–2·37; p=0·030). 132 (21·6%) of 610 patients with an SSI and a microbiology culture result had an infection that was resistant to the prophylactic antibiotic used. Resistant infections were detected in 49 (16·6%) of 295 patients in high-HDI countries, in 37 (19·8%) of 187 patients in middle-HDI countries, and in 46 (35·9%) of 128 patients in low-HDI countries (p < 0·001). Interpretation: Countries with a low HDI carry a disproportionately greater burden of SSI than countries with a middle or high HDI and might have higher rates of antibiotic resistance. In view of WHO recommendations on SSI prevention that highlight the absence of high-quality interventional research, urgent, pragmatic, randomised trials based in LMICs are needed to assess measures aiming to reduce this preventable complication.
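
    The adjusted comparison used Bayesian multilevel logistic regression; the crude incidence gradient, however, can be reconstructed directly from the counts in the abstract, as in this sketch. The chi-square test here is only an unadjusted stand-in for the study's model.

```python
from scipy.stats import chi2_contingency

# Crude 30-day SSI incidence by HDI group, using counts reported in the
# abstract (the adjusted analysis was a Bayesian multilevel model).
groups = {"high HDI": (691, 7339), "middle HDI": (549, 3918), "low HDI": (298, 1282)}

table = []
for name, (ssi, total) in groups.items():
    print(f"{name}: {ssi}/{total} = {ssi / total:.1%}")
    table.append([ssi, total - ssi])          # [events, non-events]

chi2, p, dof, _ = chi2_contingency(table)
print(f"chi-square test across groups: p = {p:.2g}")  # consistent with p < 0.001
```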

    Global variation in anastomosis and end colostomy formation following left-sided colorectal resection

    Background End colostomy rates following colorectal resection vary across institutions in high-income settings, being influenced by patient, disease, surgeon and system factors. This study aimed to assess global variation in end colostomy rates after left-sided colorectal resection. Methods This study comprised an analysis of the GlobalSurg-1 and -2 international, prospective, observational cohort studies (2014, 2016), including consecutive adult patients undergoing elective or emergency left-sided colorectal resection within discrete 2-week windows. Countries were grouped into high-, middle- and low-income tertiles according to the United Nations Human Development Index (HDI). Factors associated with colostomy formation versus primary anastomosis were explored using a multilevel, multivariable logistic regression model. Results In total, 1635 patients from 242 hospitals in 57 countries undergoing left-sided colorectal resection were included: 113 (6·9 per cent) from low-HDI, 254 (15·5 per cent) from middle-HDI and 1268 (77·6 per cent) from high-HDI countries. There was a higher proportion of patients with perforated disease (57·5, 40·9 and 35·4 per cent; P < 0·001) and subsequent use of end colostomy (52·2, 24·8 and 18·9 per cent; P < 0·001) in low- compared with middle- and high-HDI settings. The association with colostomy use in low-HDI settings persisted (odds ratio (OR) 3·20, 95 per cent c.i. 1·35 to 7·57; P = 0·008) after risk adjustment for malignant disease (OR 2·34, 1·65 to 3·32; P < 0·001), emergency surgery (OR 4·08, 2·73 to 6·10; P < 0·001), time to operation at least 48 h (OR 1·99, 1·28 to 3·09; P = 0·002) and disease perforation (OR 4·00, 2·81 to 5·69; P < 0·001). Conclusion Global differences existed in the proportion of patients receiving end stomas after left-sided colorectal resection based on income, and these differences went beyond case mix alone.
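
    The abstract reports an adjusted odds ratio of 3·20 for low- versus high-HDI settings; a crude odds ratio reconstructed from the reported proportions is noticeably larger, which illustrates why the risk adjustment matters. The counts below are back-calculated (and rounded) from the published percentages.

```python
# Unadjusted comparison reconstructed from the abstract's figures:
# end colostomy in 52.2% of 113 low-HDI patients versus 18.9% of
# 1268 high-HDI patients. The crude odds ratio exceeds the adjusted
# estimate (OR 3.20) because case mix differs by setting.
low_n, low_pct = 113, 0.522
high_n, high_pct = 1268, 0.189

low_colo = round(low_n * low_pct)       # ~59 colostomies
high_colo = round(high_n * high_pct)    # ~240 colostomies

crude_or = (low_colo / (low_n - low_colo)) / (high_colo / (high_n - high_colo))
print(f"crude OR (low vs high HDI): {crude_or:.2f}")   # ~4.7, vs adjusted 3.20
```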

    Managing Carbon

    Storing carbon (C) and offsetting carbon dioxide (CO2) emissions with the use of wood for energy, both of which slow emissions of CO2 into the atmosphere, present significant challenges for forest management (IPCC 2001). In the United States, there has been a net increase in C in forests and in harvested wood products stocks (Tables 7.1 and 7.2), a result of historical and recent ecological conditions, management practices, and use of forest products (Birdsey et al. 2006). However, recent projections for the forest sector suggest that annual C storage could begin to decline, and U.S. forests could become a net C emitter of tens to hundreds of Tg C year⁻¹ within a few decades (USDA FS 2012a). It is therefore urgent to identify effective C management strategies, given the complexity of factors that drive the forest C cycle and the multiple objectives for which forests are managed. An ideal C management activity contributes benefits beyond increasing C storage by achieving other management objectives and providing ecosystem services in a sustainable manner. Strategies for effectively managing forest C stocks and offsetting C emissions require a thorough understanding of biophysical and social influences on the forest C cycle (Birdsey et al. 1993). Successful policies and incentives may be chosen to support strategies if sufficient knowledge of social processes (e.g., landowner or wood-user response to incentives and markets) is available. For example, if C stocks are expected to decrease owing to decreasing forest land area caused by exurban development, policies or incentives to avoid deforestation in those areas may be effective. If C stocks are expected to decrease owing to the effects of a warmer climate, reducing stand densities may retain C over the long term by increasing resilience to drought and other stressors and by reducing crown fire hazard (Jackson et al. 2005; Reinhardt et al. 2008). Protecting old forests and other forests that have high C stocks may be more effective than seeking C offsets associated with wood use, especially if those forests would recover C more slowly in an altered climate. If climate change increases productivity in a given area over a long period of time, increasing forest C stocks through intensive management and forest products, including biomass energy, may be especially effective. It is equally important to know which strategies might make some management practices unacceptable (e.g., reducing biodiversity). However, no standard evaluation framework exists to aid decision making on alternative management strategies for maximizing C storage while minimizing risks and tradeoffs. Here we discuss (1) where forest C is stored in the United States, (2) how to measure forest C through space and time, (3) the effectiveness of various management strategies in reducing atmospheric greenhouse gases (GHG), and (4) the effectiveness of incentives, regulations, and institutional arrangements for implementing C management.
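
    The sink-to-source projection cited above rests on simple stock-and-flow accounting. The toy balance below, with entirely hypothetical values, shows the bookkeeping: net flux is growth minus harvest and disturbance losses, plus carbon retained in harvested wood products.

```python
# Toy annual forest-sector carbon balance (all numbers hypothetical),
# of the kind underlying projections of U.S. forests flipping from a
# net C sink to a net C source. Units: Tg C per year.

growth        = 250.0   # uptake by live biomass
harvest       = 180.0   # removals leaving the forest
product_store =  60.0   # harvested C retained in wood products
disturbance   =  90.0   # fire, insects, land-use conversion

net_flux = growth - harvest - disturbance + product_store
label = "sink" if net_flux > 0 else "source"
print(f"net forest-sector flux: {net_flux:+.0f} Tg C/yr ({label})")
```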