
    Laparoscopy in management of appendicitis in high-, middle-, and low-income countries: a multicenter, prospective, cohort study.

    BACKGROUND: Appendicitis is the most common abdominal surgical emergency worldwide. Differences between high- and low-income settings in the availability of laparoscopic appendectomy, alternative management choices, and outcomes are poorly described. The aim was to identify variation in the surgical management and outcomes of appendicitis across low-, middle-, and high-Human Development Index (HDI) countries worldwide. METHODS: This was a multicenter, international prospective cohort study. Consecutive sampling of patients undergoing emergency appendectomy over 6 months was conducted, with 30-day follow-up. RESULTS: 4546 patients from 52 countries underwent appendectomy (2499 in high-, 1540 in middle-, and 507 in low-HDI groups). After adjustment, surgical site infection (SSI) rates were higher in low-HDI (OR 2.57, 95% CI 1.33-4.99, p = 0.005) but not middle-HDI countries (OR 1.38, 95% CI 0.76-2.52, p = 0.291), compared with high-HDI countries. A laparoscopic approach was common in high-HDI countries (1693/2499, 67.7%) but infrequent in low-HDI (41/507, 8.1%) and middle-HDI (132/1540, 8.6%) groups. After accounting for case-mix, laparoscopy remained associated with fewer overall complications (OR 0.55, 95% CI 0.42-0.71, p < 0.001) and SSIs (OR 0.22, 95% CI 0.14-0.33, p < 0.001). In propensity-score-matched groups within low-/middle-HDI countries, laparoscopy was still associated with fewer overall complications (OR 0.23, 95% CI 0.11-0.44) and SSIs (OR 0.21, 95% CI 0.09-0.45). CONCLUSION: A laparoscopic approach is associated with better outcomes, and its availability appears to differ by country HDI. Despite the profound clinical, operational, and financial barriers to its widespread introduction, laparoscopy could significantly improve outcomes for patients in low-resource environments. TRIAL REGISTRATION: NCT02179112.
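
    The abstract above reports propensity-score-matched comparisons within low-/middle-HDI countries. As a rough, hedged illustration only, the sketch below shows one common way such an analysis can be set up (a logistic propensity model, greedy 1:1 nearest-neighbour matching, and a crude odds ratio in the matched cohort); the data, column names, and covariates are invented and do not reproduce the study's actual methods.

```python
# Illustrative 1:1 propensity-score matching for a binary exposure (laparoscopy)
# and a binary outcome (SSI). All data and column names below are hypothetical.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "laparoscopy": (rng.random(n) < 0.4).astype(int),
    "age": rng.normal(35, 15, n),
    "perforated": rng.integers(0, 2, n),
    "ssi": rng.integers(0, 2, n),
})

# 1. Model the probability of receiving laparoscopy from case-mix covariates.
X = df[["age", "perforated"]]
df["ps"] = LogisticRegression(max_iter=1000).fit(X, df["laparoscopy"]).predict_proba(X)[:, 1]

# 2. Greedy 1:1 nearest-neighbour matching on the propensity score.
treated = df[df["laparoscopy"] == 1]
available = df[df["laparoscopy"] == 0].copy()
matched_controls = []
for _, row in treated.iterrows():
    j = (available["ps"] - row["ps"]).abs().idxmin()   # closest unmatched control
    matched_controls.append(j)
    available = available.drop(j)
matched = pd.concat([treated, df.loc[matched_controls]])

# 3. Crude odds ratio for SSI within the matched cohort.
tab = pd.crosstab(matched["laparoscopy"], matched["ssi"])
odds_ratio = (tab.loc[1, 1] * tab.loc[0, 0]) / (tab.loc[1, 0] * tab.loc[0, 1])
print(f"Matched odds ratio for SSI: {odds_ratio:.2f}")
```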

    Impact of opioid-free analgesia on pain severity and patient satisfaction after discharge from surgery: multispecialty, prospective cohort study in 25 countries

    Background: Balancing opioid stewardship and the need for adequate analgesia following discharge after surgery is challenging. This study aimed to compare outcomes for patients discharged with opioid versus opioid-free analgesia after common surgical procedures. Methods: This international, multicentre, prospective cohort study collected data from patients undergoing common acute and elective general surgical, urological, gynaecological, and orthopaedic procedures. The primary outcomes were patient-reported time in severe pain, measured on a numerical analogue scale from 0 to 100%, and patient-reported satisfaction with pain relief during the first week following discharge. Data were collected by in-hospital chart review and patient telephone interview 1 week after discharge. Results: The study recruited 4273 patients from 144 centres in 25 countries; 1311 patients (30.7%) were prescribed opioid analgesia at discharge. Patients reported being in severe pain for 10 (i.q.r. 1-30)% of the first week after discharge and rated satisfaction with analgesia as 90 (i.q.r. 80-100) of 100. After adjustment for confounders, opioid analgesia on discharge was independently associated with increased pain severity (risk ratio 1.52, 95% c.i. 1.31 to 1.76; P < 0.001) and re-presentation to healthcare providers owing to side-effects of medication (OR 2.38, 95% c.i. 1.36 to 4.17; P = 0.004), but not with satisfaction with analgesia (beta coefficient 0.92, 95% c.i. -1.52 to 3.36; P = 0.468), compared with opioid-free analgesia. Although opioid prescribing varied greatly between high-income and low- and middle-income countries, patient-reported outcomes did not. Conclusion: Opioid analgesia prescription on surgical discharge is associated with a higher risk of re-presentation owing to side-effects of medication and increased patient-reported pain, but not with changes in patient-reported satisfaction. Opioid-free discharge analgesia should be adopted routinely.
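
    The adjusted risk ratio and odds ratio quoted above come from multivariable models. As a hedged sketch only (not the study's code), the snippet below shows one standard way to estimate an adjusted risk ratio for a binary outcome such as re-presentation: a Poisson GLM with robust standard errors ("modified Poisson" regression) on hypothetical data with hypothetical column names.

```python
# Illustrative adjusted risk ratio via modified Poisson regression.
# Data, column names, and effect sizes are invented for demonstration.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1500
df = pd.DataFrame({
    "opioid_rx": rng.integers(0, 2, n),      # opioid prescribed at discharge
    "major_surgery": rng.integers(0, 2, n),  # confounder
    "age": rng.normal(50, 16, n),            # confounder
})
# Simulate re-presentation with a higher baseline risk when opioids are prescribed.
p = 0.05 + 0.04 * df["opioid_rx"] + 0.03 * df["major_surgery"]
df["represented"] = (rng.random(n) < p).astype(int)

# Poisson regression with robust (HC0) errors estimates the risk ratio directly,
# avoiding the non-collapsibility issues of odds ratios for common outcomes.
model = smf.glm("represented ~ opioid_rx + major_surgery + age",
                data=df, family=sm.families.Poisson())
fit = model.fit(cov_type="HC0")
rr = np.exp(fit.params["opioid_rx"])
ci = np.exp(fit.conf_int().loc["opioid_rx"])
print(f"Adjusted risk ratio: {rr:.2f} (95% CI {ci[0]:.2f} to {ci[1]:.2f})")
```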

    Global economic burden of unmet surgical need for appendicitis

    Background: There is a substantial gap in the provision of adequate surgical care in many low- and middle-income countries. This study aimed to identify the economic burden of unmet surgical need for the common condition of appendicitis. Methods: Data on the incidence of appendicitis from 170 countries were combined with two different approaches to estimate the number of patients who do not receive surgery: as a fixed proportion of the total unmet surgical need per country (approach 1); and based on country income status (approach 2). Indirect costs with current levels of access and local quality, and those if quality were at the standards of high-income countries, were estimated. A human capital approach was applied, focusing on the economic burden resulting from premature death and absenteeism. Results: Excess mortality was 4185 per 100 000 cases of appendicitis using approach 1 and 3448 per 100 000 using approach 2. The economic burden of continuing current levels of access and local quality was US $92 492 million using approach 1 and US $73 141 million using approach 2. The economic burden of not providing surgical care to the standards of high-income countries was US $95 004 million using approach 1 and US $75 666 million using approach 2. The largest share of these costs resulted from premature death (97.7 per cent) and lack of access (97.0 per cent), rather than lack of quality. Conclusion: For a comparatively non-complex emergency condition such as appendicitis, increasing access to care should be prioritized. Although improving quality of care should not be neglected, increasing provision of care at current standards could reduce societal costs substantially.
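
    The "human capital approach" mentioned in the methods values lost output from premature death and absenteeism. The toy calculation below illustrates the general shape of such an estimate; every number, rate, and parameter in it is invented for demonstration and none comes from the study.

```python
# Toy illustration of a human capital approach to the cost of unmet surgical
# need; all figures below are made up for demonstration, not the study's data.

def discounted_lifetime_earnings(annual_income, years_lost, discount_rate=0.03):
    """Present value of earnings lost to one premature death."""
    return sum(annual_income / (1 + discount_rate) ** t
               for t in range(1, years_lost + 1))

def country_burden(cases_without_surgery, excess_mortality_per_case,
                   annual_income, years_lost, absentee_days, daily_wage):
    deaths = cases_without_surgery * excess_mortality_per_case
    premature_death_cost = deaths * discounted_lifetime_earnings(annual_income, years_lost)
    absenteeism_cost = cases_without_surgery * absentee_days * daily_wage
    return premature_death_cost + absenteeism_cost

# Hypothetical country: 50 000 untreated appendicitis cases, 4% excess mortality,
# USD 2 000 annual income, 30 productive years lost, 14 days off work per case.
burden = country_burden(50_000, 0.04, 2_000, 30, 14, 2_000 / 250)
print(f"Estimated annual burden: US${burden / 1e6:.0f} million")
```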

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) compared with that in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than in emergency laparotomy in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
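
    The mortality association above was estimated with multivariable logistic regression and bootstrapped simulation. As a rough, hedged sketch (not the published analysis), the code below fits a logistic model on hypothetical data and bootstraps a confidence interval for the adjusted odds ratio of checklist use.

```python
# Illustrative adjusted odds ratio with a bootstrap confidence interval.
# The cohort, covariates, and effect sizes are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 3000
df = pd.DataFrame({
    "checklist": rng.integers(0, 2, n),
    "asa_high": rng.integers(0, 2, n),   # simplified patient-risk covariate
    "age": rng.normal(55, 18, n),
})
p = 0.08 - 0.03 * df["checklist"] + 0.06 * df["asa_high"]
df["died_30d"] = (rng.random(n) < p).astype(int)

def adjusted_or(data):
    fit = smf.logit("died_30d ~ checklist + asa_high + age", data=data).fit(disp=0)
    return np.exp(fit.params["checklist"])

point = adjusted_or(df)
boot = [adjusted_or(df.sample(n, replace=True, random_state=i)) for i in range(200)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"Adjusted OR {point:.2f} (bootstrap 95% CI {lo:.2f} to {hi:.2f})")
```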

    Mortality of emergency abdominal surgery in high-, middle- and low-income countries

    Background: Surgical mortality data are collected routinely in high-income countries, yet virtually no low- or middle-income countries have outcome surveillance in place. The aim was prospectively to collect worldwide mortality data following emergency abdominal surgery, comparing findings across countries with a low, middle or high Human Development Index (HDI). Methods: This was a prospective, multicentre, cohort study. Self-selected hospitals performing emergency surgery submitted prespecified data for consecutive patients from at least one 2-week interval during July to December 2014. Postoperative mortality was analysed by hierarchical multivariable logistic regression. Results: Data were obtained for 10 745 patients from 357 centres in 58 countries; 6538 were from high-, 2889 from middle- and 1318 from low-HDI settings. The overall mortality rate was 1·6 per cent at 24 h (high 1·1 per cent, middle 1·9 per cent, low 3·4 per cent; P < 0·001), increasing to 5·4 per cent by 30 days (high 4·5 per cent, middle 6·0 per cent, low 8·6 per cent; P < 0·001). Of the 578 patients who died, 404 (69·9 per cent) did so between 24 h and 30 days following surgery (high 74·2 per cent, middle 68·8 per cent, low 60·5 per cent). After adjustment, 30-day mortality remained higher in middle-income (odds ratio (OR) 2·78, 95 per cent c.i. 1·84 to 4·20) and low-income (OR 2·97, 1·84 to 4·81) countries. Surgical safety checklist use was less frequent in low- and middle-income countries, but when used was associated with reduced mortality at 30 days. Conclusion: Mortality is three times higher in low- compared with high-HDI countries even when adjusted for prognostic factors. Patient safety factors may have an important role. Registration number: NCT02179112 (http://www.clinicaltrials.gov).
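
    The hierarchical multivariable logistic regression referred to above accounts for patients being clustered within centres. The sketch below shows one way such a random-intercept logistic model can be fitted in Python (here with statsmodels' variational Bayes mixed GLM); the simulated data, variable names, and model specification are illustrative assumptions, not the study's analysis.

```python
# Illustrative random-intercept ("hierarchical") logistic model with a
# centre-level intercept. Data and column names are hypothetical.
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

rng = np.random.default_rng(3)
n, n_centres = 4000, 40
df = pd.DataFrame({
    "centre": rng.integers(0, n_centres, n),
    "low_hdi": rng.integers(0, 2, n),
    "age": rng.normal(55, 18, n),
})
# Simulate a centre-level random effect plus fixed effects on the logit scale.
centre_effect = rng.normal(0, 0.5, n_centres)[df["centre"]]
logit = -3.0 + 1.0 * df["low_hdi"] + 0.02 * (df["age"] - 55) + centre_effect
df["died_30d"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

model = BinomialBayesMixedGLM.from_formula(
    "died_30d ~ low_hdi + age",       # fixed effects
    {"centre": "0 + C(centre)"},      # random intercept per centre
    df)
result = model.fit_vb()               # variational Bayes fit
print(result.summary())
```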

    The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance

    INTRODUCTION Investment in Africa over the past year with regard to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing has led to a massive increase in the number of sequences, which, to date, exceeds 100,000 sequences generated to track the pandemic on the continent. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic. RATIONALE We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we also examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs). RESULTS Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing and that this is coupled with a shorter turnaround time from the time of sampling to sequence submission. Ongoing evolution necessitated the continual updating of primer sets, and, as a result, eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection that were each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, when considering both Africa versus the rest of the world and viral dissemination within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, for instance, recognizing the limitations of low testing proportions. We also highlight the early warning capacity that genomic surveillance in Africa has had for the rest of the world with the detection of new lineages and variants, the most recent being the characterization of various Omicron subvariants. CONCLUSION Sustained investment for diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent but also because it can be used as a platform to help address the many emerging and reemerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized because this is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing for the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century
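
    Two of the surveillance metrics discussed above, turnaround time from sample collection to sequence submission and lineage composition per wave, reduce to simple aggregations over sequence metadata. The sketch below illustrates this on a tiny invented table; the fields mimic typical public-repository metadata and the wave cut-off dates are purely illustrative.

```python
# Toy sketch: turnaround time and lineage composition per wave from invented
# sequence metadata (fields mimic typical public-repository metadata).
import pandas as pd

meta = pd.DataFrame({
    "country": ["South Africa", "South Africa", "Kenya", "Nigeria"],
    "collection_date": pd.to_datetime(["2021-11-05", "2021-11-20", "2021-06-02", "2020-07-15"]),
    "submission_date": pd.to_datetime(["2021-11-23", "2021-12-01", "2021-07-20", "2020-10-01"]),
    "lineage": ["BA.1", "BA.1", "B.1.617.2", "B.1"],
})

# Median days from sampling to sequence submission, per country.
meta["turnaround_days"] = (meta["submission_date"] - meta["collection_date"]).dt.days
print(meta.groupby("country")["turnaround_days"].median())

# Lineage counts per epidemic wave; the wave boundaries here are illustrative only.
wave = pd.cut(meta["collection_date"],
              bins=pd.to_datetime(["2020-01-01", "2020-12-31", "2021-05-31",
                                   "2021-10-31", "2022-03-31"]),
              labels=["wave 1", "wave 2", "wave 3", "wave 4"])
print(meta.groupby(wave)["lineage"].value_counts())
```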

    Global variation in anastomosis and end colostomy formation following left-sided colorectal resection

    Background End colostomy rates following colorectal resection vary across institutions in high-income settings, being influenced by patient, disease, surgeon and system factors. This study aimed to assess global variation in end colostomy rates after left-sided colorectal resection. Methods This study comprised an analysis of GlobalSurg-1 and -2 international, prospective, observational cohort studies (2014, 2016), including consecutive adult patients undergoing elective or emergency left-sided colorectal resection within discrete 2-week windows. Countries were grouped into high-, middle- and low-income tertiles according to the United Nations Human Development Index (HDI). Factors associated with colostomy formation versus primary anastomosis were explored using a multilevel, multivariable logistic regression model. Results In total, 1635 patients from 242 hospitals in 57 countries undergoing left-sided colorectal resection were included: 113 (6·9 per cent) from low-HDI, 254 (15·5 per cent) from middle-HDI and 1268 (77·6 per cent) from high-HDI countries. There was a higher proportion of patients with perforated disease (57·5, 40·9 and 35·4 per cent; P < 0·001) and subsequent use of end colostomy (52·2, 24·8 and 18·9 per cent; P < 0·001) in low- compared with middle- and high-HDI settings. The association with colostomy use in low-HDI settings persisted (odds ratio (OR) 3·20, 95 per cent c.i. 1·35 to 7·57; P = 0·008) after risk adjustment for malignant disease (OR 2·34, 1·65 to 3·32; P < 0·001), emergency surgery (OR 4·08, 2·73 to 6·10; P < 0·001), time to operation at least 48 h (OR 1·99, 1·28 to 3·09; P = 0·002) and disease perforation (OR 4·00, 2·81 to 5·69; P < 0·001). Conclusion Global differences existed in the proportion of patients receiving end stomas after left-sided colorectal resection based on income, which went beyond case mix alone

    Nurses' perceptions of aids and obstacles to the provision of optimal end of life care in ICU


    Seed Quality and Protein Classification of Some Quinoa Varieties

    Quinoa, which originates from the Andean mountains of South America, shows wide biological diversity. Alongside its favorable cultivation characteristics, quinoa offers superior nutritional value: compared with cereal crops such as rice, maize, and wheat, quinoa seeds contain substantial quantities of protein of remarkable quality. The current study compared four quinoa cultivars of different origins in terms of protein composition and germinability. In addition, it examined the effect of different geographical cultivation areas on the protein composition of wild Egyptian quinoa seeds and three other cultivars of varying origin. Significant differences were observed among the quinoa varieties in germination percentage (GP), shoot length (SL), and root length (RL). Using near-infrared spectroscopy, the highest protein value was recorded for the American variety (18.39%), followed by the Wild Egyptian variety (17.16%). Among the essential amino acids, the aromatic amino acid phenylalanine showed the highest concentration: the Rainbow variety contained 12.7 g-aa/kg protein, followed by the Wild Egyptian variety with 4.9 g-aa/kg protein. Glutamic acid was the most abundant non-essential amino acid, at 10.1, 4, 23.4, and 4 g-aa/kg protein for the Wild Egyptian, American, Rainbow, and Black varieties, respectively. SDS-PAGE was used to identify allelic variation in the seed storage protein profiles of the studied varieties, which showed 23.81% polymorphism in the protein bands, with a mean band frequency of 0.881. The resulting protein bands ranged from 16 to 115.02 kDa. With a similarity of 90%, the Wild Egyptian and Rainbow quinoa varieties can be classified in one clade.
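
    The SDS-PAGE figures above (percentage polymorphism, mean band frequency, pairwise similarity) are all derived from a binary band-scoring matrix. The sketch below shows how such values can be computed; the matrix is invented for illustration and does not reproduce the study's banding data.

```python
# Polymorphism percentage, mean band frequency, and pairwise similarity from a
# binary SDS-PAGE band-scoring matrix. The matrix below is invented.
import pandas as pd

bands = pd.DataFrame(
    {   # 1 = band present, 0 = absent; rows are protein bands
        "WildEgyptian": [1, 1, 0, 1, 1, 1],
        "American":     [1, 1, 1, 1, 0, 1],
        "Rainbow":      [1, 1, 0, 1, 1, 1],
        "Black":        [1, 0, 1, 1, 1, 1],
    },
    index=[f"band_{k}" for k in range(1, 7)])

# A band is polymorphic if it is present in some, but not all, varieties.
counts = bands.sum(axis=1)
polymorphic = (counts > 0) & (counts < bands.shape[1])
print(f"Polymorphism: {100 * polymorphic.mean():.2f}%")

# Band frequency = share of varieties showing each band; report the mean.
print(f"Mean band frequency: {bands.mean(axis=1).mean():.3f}")

# Simple matching similarity between two varieties (basis for clustering into clades).
def similarity(a, b):
    return (bands[a] == bands[b]).mean()

print(f"WildEgyptian vs Rainbow: {100 * similarity('WildEgyptian', 'Rainbow'):.0f}%")
```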

    Growth, Yield, Quality, and Phytochemical Behavior of Three Cultivars of Quinoa in Response to Moringa and Azolla Extracts under Organic Farming Conditions

    Increased demand for quinoa as a functional food has resulted in more quinoa-growing areas and initiatives to increase grain production, particularly in organic agriculture. Quinoa seeds are regarded as a superfood with exceptional nutritional benefits and are abundant in secondary metabolites with significant medicinal activity. This study was therefore performed to investigate whether foliar sprays of Azolla filiculoides extract (AE) or moringa leaf extract (MLE) can be supplemented as organic extracts to enhance quinoa growth and productivity under organic farming. Three quinoa cultivars, KVL–SRA2 (C1), Chipaya (C2), and Q–37 (C3), were grown organically and subjected to foliar spraying with AE or MLE at a 20% ratio, as well as their combination (AE+MLE). Plant performance of the three cultivars was significantly enhanced by MLE or AE applications compared with control plants. The highest outputs were obtained with the AE+MLE treatment, which significantly increased seed yield by about 29% compared with untreated plants. Seed quality also improved markedly in response to AE+MLE, which was superior in this regard, giving higher protein, carbohydrate, saponin, tannin, phenolic, and flavonoid contents. The C3 cultivar showed the highest productivity and the highest saponin and flavonoid levels compared with the other cultivars. Overall, the current study indicated that foliar spray with AE+MLE could enhance the growth and productivity, as well as the quality and pharmaceutically active ingredients, of quinoa cultivars grown under organic farming conditions.