
    Laparoscopy in management of appendicitis in high-, middle-, and low-income countries: a multicenter, prospective, cohort study.

    BACKGROUND: Appendicitis is the most common abdominal surgical emergency worldwide. Differences between high- and low-income settings in the availability of laparoscopic appendectomy, alternative management choices, and outcomes are poorly described. The aim was to identify variation in surgical management and outcomes of appendicitis within low-, middle-, and high-Human Development Index (HDI) countries worldwide. METHODS: This is a multicenter, international prospective cohort study. Consecutive sampling of patients undergoing emergency appendectomy over 6 months was conducted. Follow-up lasted 30 days. RESULTS: 4546 patients from 52 countries underwent appendectomy (2499 in high-, 1540 in middle-, and 507 in low-HDI countries). Surgical site infection (SSI) rates were higher in low-HDI (OR 2.57, 95% CI 1.33-4.99, p = 0.005) but not middle-HDI countries (OR 1.38, 95% CI 0.76-2.52, p = 0.291), compared with high-HDI countries after adjustment. A laparoscopic approach was common in high-HDI countries (1693/2499, 67.7%), but infrequent in low-HDI (41/507, 8.1%) and middle-HDI (132/1540, 8.6%) groups. After accounting for case mix, laparoscopy was still associated with fewer overall complications (OR 0.55, 95% CI 0.42-0.71, p < 0.001) and SSIs (OR 0.22, 95% CI 0.14-0.33, p < 0.001). In propensity-score matched groups within low-/middle-HDI countries, laparoscopy was still associated with fewer overall complications (OR 0.23, 95% CI 0.11-0.44) and SSIs (OR 0.21, 95% CI 0.09-0.45). CONCLUSION: A laparoscopic approach is associated with better outcomes, and its availability appears to differ by country HDI. Despite the profound clinical, operational, and financial barriers to its widespread introduction, laparoscopy could significantly improve outcomes for patients in low-resource environments. TRIAL REGISTRATION: NCT02179112
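The adjusted odds ratios above come from regression modelling, but the basic calculation behind an odds ratio and its Wald 95% confidence interval can be sketched from a 2×2 table. The counts below are hypothetical, not taken from the study:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed with outcome,   b = exposed without,
    c = unexposed with outcome, d = unexposed without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: SSI vs no SSI in open vs laparoscopic groups.
or_, lo, hi = odds_ratio_ci(40, 360, 20, 580)
```

A CI that excludes 1, as in the study's SSI comparison, is what makes the association statistically significant at the 5% level.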

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) compared with that in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
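The "bootstrapped simulation" mentioned in the methods resamples patients with replacement to attach an interval to an estimate. A minimal percentile-bootstrap sketch, with made-up binary mortality data rather than the study's, might look like:

```python
import random

def bootstrap_ci(outcomes, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap CI for a mortality proportion:
    resample the cohort with replacement n_boot times and take
    the alpha/2 and 1 - alpha/2 quantiles of the statistic."""
    rng = random.Random(seed)
    n = len(outcomes)
    stats = sorted(
        sum(rng.choices(outcomes, k=n)) / n for _ in range(n_boot)
    )
    lo = stats[int(alpha / 2 * n_boot)]
    hi = stats[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# Hypothetical cohort: 1 = died within 30 days, 0 = survived.
deaths = [1] * 30 + [0] * 270          # 10% observed mortality
lo, hi = bootstrap_ci(deaths)
```

The same resampling loop works for any statistic (an adjusted OR, a risk difference), which is why it pairs naturally with the regression models used in the study.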

    Global variation in anastomosis and end colostomy formation following left-sided colorectal resection

    Background End colostomy rates following colorectal resection vary across institutions in high-income settings, being influenced by patient, disease, surgeon and system factors. This study aimed to assess global variation in end colostomy rates after left-sided colorectal resection. Methods This study comprised an analysis of GlobalSurg-1 and -2 international, prospective, observational cohort studies (2014, 2016), including consecutive adult patients undergoing elective or emergency left-sided colorectal resection within discrete 2-week windows. Countries were grouped into high-, middle- and low-income tertiles according to the United Nations Human Development Index (HDI). Factors associated with colostomy formation versus primary anastomosis were explored using a multilevel, multivariable logistic regression model. Results In total, 1635 patients from 242 hospitals in 57 countries undergoing left-sided colorectal resection were included: 113 (6·9 per cent) from low-HDI, 254 (15·5 per cent) from middle-HDI and 1268 (77·6 per cent) from high-HDI countries. There was a higher proportion of patients with perforated disease (57·5, 40·9 and 35·4 per cent; P < 0·001) and subsequent use of end colostomy (52·2, 24·8 and 18·9 per cent; P < 0·001) in low- compared with middle- and high-HDI settings. The association with colostomy use in low-HDI settings persisted (odds ratio (OR) 3·20, 95 per cent c.i. 1·35 to 7·57; P = 0·008) after risk adjustment for malignant disease (OR 2·34, 1·65 to 3·32; P < 0·001), emergency surgery (OR 4·08, 2·73 to 6·10; P < 0·001), time to operation at least 48 h (OR 1·99, 1·28 to 3·09; P = 0·002) and disease perforation (OR 4·00, 2·81 to 5·69; P < 0·001). Conclusion Global differences existed in the proportion of patients receiving end stomas after left-sided colorectal resection based on income, which went beyond case mix alone.
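The risk adjustment here uses a multilevel multivariable logistic regression. As a much-simplified illustration, a single-covariate logistic fit already shows how exp(slope) recovers an odds ratio; the toy data below are invented, and a real analysis would add covariates and hospital/country random effects:

```python
import math

def fit_logistic(xs, ys, lr=1.0, iters=5000):
    """Single-covariate logistic regression by gradient descent;
    exp(slope) is the (unadjusted) odds ratio for the covariate."""
    b0 = b1 = 0.0
    n = len(xs)
    for _ in range(iters):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1 / (1 + math.exp(-(b0 + b1 * x)))  # predicted probability
            g0 += p - y
            g1 += (p - y) * x
        b0 -= lr * g0 / n
        b1 -= lr * g1 / n
    return b0, b1

# Hypothetical data: x = 1 for low-HDI setting, y = 1 for end colostomy.
xs = [1] * 100 + [0] * 100
ys = [1] * 40 + [0] * 60 + [1] * 20 + [0] * 80
b0, b1 = fit_logistic(xs, ys)
odds_ratio = math.exp(b1)
```

With a single binary covariate the model is saturated, so exp(b1) converges to the empirical odds ratio (40/60)/(20/80) ≈ 2.67; adding covariates is what turns this into an *adjusted* OR.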

    Persistence and Leaching Potential of Microorganisms and Mineral N in Animal Manure Applied to Intact Soil Columns

    Pathogens may reach agricultural soils through application of animal manure and thereby pose a risk of contaminating crops as well as surface and groundwater. Treatment and handling of manure for improved nutrient and odor management may also influence the amount and fate of manure-borne pathogens in the soil. A study was conducted to investigate the leaching potentials of a phage (Salmonella enterica serovar Typhimurium bacteriophage 28B) and two bacteria, Escherichia coli and Enterococcus species, in a liquid fraction of raw pig slurry obtained by solid-liquid separation of this slurry and in this liquid fraction after ozonation, when applied to intact soil columns by subsurface injection. We also compared leaching potentials of surface-applied and subsurface-injected raw slurry. The columns were exposed to irrigation events (3.5 h at 10 mm h−1) after 1, 2, 3, and 4 weeks of incubation with collection of leachate. By the end of incubation, the distribution and survival of microorganisms in the soil of each treatment and in nonirrigated columns with injected raw slurry or liquid fraction were determined. E. coli in the leachates was quantified by both plate counts and quantitative PCR (qPCR) to assess the proportions of culturable and nonculturable (viable and nonviable) cells. Solid-liquid separation of slurry increased the redistribution in soil of contaminants in the liquid fraction compared to raw slurry, and the percent recovery of E. coli and Enterococcus species was higher for the liquid fraction than for raw slurry after the four leaching events. The liquid fraction also resulted in more leaching of all contaminants except Enterococcus species than did raw slurry. Ozonation reduced E. coli leaching only. Injection enhanced the leaching potential of the microorganisms investigated compared to surface application, probably because of better survival with subsurface injection and a shorter leaching path.
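The percent-recovery and culturable-fraction comparisons (plate counts versus qPCR) reduce to simple ratios. A small sketch with hypothetical numbers, not the study's measurements:

```python
def percent_recovery(applied_cfu, leached_cfu):
    """Fraction of applied organisms recovered in leachate, as a percentage."""
    return 100 * leached_cfu / applied_cfu

def culturable_fraction(plate_count, qpcr_count):
    """Plate counts capture culturable cells only, while qPCR counts all
    (viable and nonviable) cells, so the ratio bounds culturability."""
    return plate_count / qpcr_count

# Hypothetical values: organisms applied to a column vs recovered below it.
rec = percent_recovery(applied_cfu=5e8, leached_cfu=2e6)          # 0.4 %
frac = culturable_fraction(plate_count=1.2e4, qpcr_count=9.6e4)   # 0.125
```

A plate/qPCR ratio well below 1, as in this hypothetical case, indicates that most cells detected molecularly were nonculturable.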

    A review of food security and the potentials to develop spatially informed food policies in Bangladesh

    Background: Food security globally depends primarily on three components: food availability, food access, and food utilization. Regional variations in these components may affect food security via spatial differences in natural, social, or economic conditions and their interactions within a complex environmental system. Purpose: It is important to understand the regional variation of food security, particularly where and under what natural and socio-economic circumstances people become vulnerable to low food security in a country. Methods: This article provides an overview of food security in Bangladesh in terms of the three main components, identifies knowledge gaps in present food security research, reviews possible impacts of climate change on food security, and sources a wide range of spatio-temporal data relevant to food security. Results: The study highlights potentials and indicates different processes to develop spatially informed food policies in a country, with a particular focus on Bangladesh. This will contribute to improved food security by considering regional food security conditions, region-specific deficits, climate change, and other future risks, and by devising actions related to the respective components. Conclusion: The study concludes that these processes can provide a foundation for policy development and will advance the research-policy linkage toward improved food security.