
    Epidemiology of surgery associated acute kidney injury (EPIS-AKI): a prospective international observational multi-center clinical study

    Purpose: The incidence, patient features, risk factors, and outcomes of surgery-associated postoperative acute kidney injury (PO-AKI) across different countries and health care systems are unclear. Methods: We conducted an international, prospective, observational, multi-center study in 30 countries in patients undergoing major surgery (> 2-h duration and postoperative intensive care unit (ICU) or high dependency unit admission). The primary endpoint was the occurrence of PO-AKI within 72 h of surgery, defined by the Kidney Disease: Improving Global Outcomes (KDIGO) criteria. Secondary endpoints included PO-AKI severity and duration, use of renal replacement therapy (RRT), mortality, and ICU and hospital length of stay. Results: We studied 10,568 patients, of whom 1945 (18.4%) developed PO-AKI (1236 (63.5%) KDIGO stage 1, 500 (25.7%) KDIGO stage 2, and 209 (10.7%) KDIGO stage 3). PO-AKI was persistent in 33.8% of cases, and 170/1945 (8.7%) of patients with PO-AKI received RRT in the ICU. Patients with PO-AKI had greater ICU (6.3% vs. 0.7%) and hospital (8.6% vs. 1.4%) mortality, and longer ICU (median 3 (Q1-Q3, 1-6) days vs. 2 (Q1-Q3, 1-3) days) and hospital (median 14 (Q1-Q3, 9-24) days vs. 10 (Q1-Q3, 7-17) days) length of stay. Risk factors for PO-AKI included older age; comorbidities (hypertension, diabetes, chronic kidney disease); type, duration, and urgency of surgery; and intraoperative vasopressor and aminoglycoside administration. Conclusion: In a comprehensive multinational study, approximately one in five patients developed PO-AKI after major surgery. Increasing severity of PO-AKI was associated with a progressive increase in adverse outcomes. Our findings indicate that PO-AKI represents a significant burden for health care worldwide.
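    The headline proportions in this abstract can be cross-checked with simple arithmetic. A minimal Python sketch, using only the counts reported above (variable names are mine), confirming that the stage counts sum to the PO-AKI total and reproduce the quoted percentages:

```python
# Illustrative arithmetic check of the EPIS-AKI proportions quoted above.
total_patients = 10568
po_aki = 1945
stage_counts = {"KDIGO stage 1": 1236, "KDIGO stage 2": 500, "KDIGO stage 3": 209}

assert sum(stage_counts.values()) == po_aki           # 1236 + 500 + 209 = 1945
print(f"PO-AKI incidence: {po_aki / total_patients:.1%}")  # -> 18.4%
for stage, n in stage_counts.items():
    print(f"{stage}: {n / po_aki:.1%} of PO-AKI cases")    # -> 63.5%, 25.7%, 10.7%
```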

    Impact of opioid-free analgesia on pain severity and patient satisfaction after discharge from surgery: multispecialty, prospective cohort study in 25 countries

    Background: Balancing opioid stewardship and the need for adequate analgesia following discharge after surgery is challenging. This study aimed to compare the outcomes of patients discharged with opioid versus opioid-free analgesia after common surgical procedures. Methods: This international, multicentre, prospective cohort study collected data from patients undergoing common acute and elective general surgical, urological, gynaecological, and orthopaedic procedures. The primary outcomes were patient-reported time in severe pain, measured on a numerical analogue scale from 0 to 100%, and patient-reported satisfaction with pain relief during the first week following discharge. Data were collected by in-hospital chart review and patient telephone interview 1 week after discharge. Results: The study recruited 4273 patients from 144 centres in 25 countries; 1311 patients (30.7%) were prescribed opioid analgesia at discharge. Patients reported being in severe pain for 10 (i.q.r. 1-30)% of the first week after discharge and rated satisfaction with analgesia as 90 (i.q.r. 80-100) of 100. After adjustment for confounders, opioid analgesia on discharge was independently associated with increased pain severity (risk ratio 1.52, 95% c.i. 1.31 to 1.76; P < 0.001) and re-presentation to healthcare providers owing to side-effects of medication (OR 2.38, 95% c.i. 1.36 to 4.17; P = 0.004), but not with satisfaction with analgesia (beta coefficient 0.92, 95% c.i. -1.52 to 3.36; P = 0.468), compared with opioid-free analgesia. Although opioid prescribing varied greatly between high-income and low- and middle-income countries, patient-reported outcomes did not. Conclusion: Opioid analgesia prescription on surgical discharge is associated with a higher risk of re-presentation owing to side-effects of medication and increased patient-reported pain, but not with changes in patient-reported satisfaction. Opioid-free discharge analgesia should be adopted routinely.
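    The risk ratio of 1.52 above comes from the authors' adjusted regression model. The mechanics of a crude risk ratio and its 95% confidence interval can nonetheless be sketched from a 2x2 table; the counts below are hypothetical, chosen only to land near the reported effect size, and this is not the study's adjusted analysis:

```python
import math

# Hypothetical 2x2 counts (NOT the study's data): events = severe-pain outcome.
events_opioid, n_opioid = 300, 1311   # opioid analgesia at discharge
events_free, n_free = 450, 2962      # opioid-free analgesia at discharge

rr = (events_opioid / n_opioid) / (events_free / n_free)
# Standard error of log(RR) for a crude 2x2 comparison
se = math.sqrt(1/events_opioid - 1/n_opioid + 1/events_free - 1/n_free)
lo, hi = (math.exp(math.log(rr) + z * se) for z in (-1.96, 1.96))
print(f"crude RR {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```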

    Global overview of the management of acute cholecystitis during the COVID-19 pandemic (CHOLECOVID study)

    Background: This study provides a global overview of the management of patients with acute cholecystitis during the initial phase of the COVID-19 pandemic. Methods: CHOLECOVID is an international, multicentre, observational comparative study of patients admitted to hospital with acute cholecystitis during the COVID-19 pandemic. Data on management were collected for a 2-month study interval coincident with the WHO declaration of the SARS-CoV-2 pandemic and compared with an equivalent pre-pandemic interval. Mediation analysis examined the influence of SARS-CoV-2 infection on 30-day mortality. Results: This study collected data on 9783 patients with acute cholecystitis admitted to 247 hospitals across the world. The pandemic was associated with reduced availability of surgical workforce and operating facilities globally, a significant shift to worse severity of disease, and increased use of conservative management. There was a reduction (both absolute and proportionate) in the number of patients undergoing cholecystectomy, from 3095 patients (56.2 per cent) pre-pandemic to 1998 patients (46.2 per cent) during the pandemic, but there was no difference in 30-day all-cause mortality after cholecystectomy between the pre-pandemic and pandemic intervals (13 patients (0.4 per cent) versus 13 patients (0.6 per cent); P = 0.355). In mediation analysis, admission with acute cholecystitis during the pandemic was associated with a non-significant increase in the risk of death (OR 1.29, 95 per cent c.i. 0.93 to 1.79; P = 0.121). Conclusion: CHOLECOVID provides a unique overview of the treatment of patients with cholecystitis across the globe during the first months of the SARS-CoV-2 pandemic. The study highlights the need for system resilience in the retention of elective surgical activity. Cholecystectomy was associated with a low risk of mortality, and deferral of treatment resulted in an increase in avoidable morbidity that represents the non-COVID cost of this pandemic.
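    The abstract does not name the test behind P = 0.355; a plausible reconstruction for comparing two small mortality proportions is Fisher's exact test on the reported counts. A sketch under that assumption, not the authors' stated method:

```python
from scipy.stats import fisher_exact

# 30-day deaths after cholecystectomy, from the counts quoted above
deaths_pre, ops_pre = 13, 3095   # pre-pandemic interval
deaths_pan, ops_pan = 13, 1998   # pandemic interval

table = [[deaths_pre, ops_pre - deaths_pre],
         [deaths_pan, ops_pan - deaths_pan]]
odds_ratio, p_value = fisher_exact(table)
print(f"{deaths_pre / ops_pre:.2%} vs {deaths_pan / ops_pan:.2%}, p = {p_value:.3f}")
```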

    The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance

    Introduction: Investment in Africa over the past year with regard to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing has led to a massive increase in the number of sequences, which, to date, exceeds 100,000 sequences generated to track the pandemic on the continent. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic. Rationale: We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we also examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs). Results: Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing, and that this is coupled with a shorter turnaround time from sampling to sequence submission. Ongoing evolution necessitated continual updating of primer sets; as a result, eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection, each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, when considering both Africa versus the rest of the world and viral dissemination within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, for instance, recognizing the limitations of low testing proportions. We also highlight the early warning capacity that genomic surveillance in Africa has had for the rest of the world, with the detection of new lineages and variants, the most recent being the characterization of various Omicron subvariants. Conclusion: Sustained investment in diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent but also because it can serve as a platform to help address the many emerging and reemerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized, because this is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century.

    Reducing the environmental impact of surgery on a global scale: systematic review and co-prioritization with healthcare workers in 132 countries

    Background: Healthcare cannot achieve net-zero carbon without addressing operating theatres. The aim of this study was to prioritize feasible interventions to reduce the environmental impact of operating theatres. Methods: This study adopted a four-phase Delphi consensus co-prioritization methodology. In phase 1, a systematic review of published interventions and a global consultation of perioperative healthcare professionals were used to longlist interventions. In phase 2, iterative thematic analysis consolidated comparable interventions into a shortlist. In phase 3, the shortlist was co-prioritized based on patient and clinician views on acceptability, feasibility, and safety. In phase 4, ranked lists of interventions were presented by their relevance to high-income countries and low- and middle-income countries. Results: In phase 1, 43 interventions were identified, which had low uptake in practice according to 3042 professionals globally. In phase 2, a shortlist of 15 intervention domains was generated. In phase 3, interventions were deemed acceptable to more than 90 per cent of patients, except for reducing general anaesthesia (84 per cent) and re-sterilization of ‘single-use’ consumables (86 per cent). In phase 4, the top three shortlisted interventions for high-income countries were: introducing recycling; reducing use of anaesthetic gases; and appropriate clinical waste processing. The top three shortlisted interventions for low- and middle-income countries were: introducing reusable surgical devices; reducing use of consumables; and reducing the use of general anaesthesia. Conclusion: This is a step toward environmentally sustainable operating environments, with actionable interventions applicable to both high-income and low- and middle-income countries.

    Adding 6 months of androgen deprivation therapy to postoperative radiotherapy for prostate cancer: a comparison of short-course versus no androgen deprivation therapy in the RADICALS-HD randomised controlled trial

    Background: Previous evidence indicates that adjuvant, short-course androgen deprivation therapy (ADT) improves metastasis-free survival when given with primary radiotherapy for intermediate-risk and high-risk localised prostate cancer. However, the value of ADT with postoperative radiotherapy after radical prostatectomy is unclear. Methods: RADICALS-HD was an international randomised controlled trial to test the efficacy of ADT used in combination with postoperative radiotherapy for prostate cancer. Key eligibility criteria were an indication for radiotherapy after radical prostatectomy for prostate cancer, prostate-specific antigen less than 5 ng/mL, absence of metastatic disease, and written consent. Participants were randomly assigned (1:1) to radiotherapy alone (no ADT) or radiotherapy with 6 months of ADT (short-course ADT), using monthly subcutaneous gonadotropin-releasing hormone analogue injections, daily oral bicalutamide monotherapy 150 mg, or monthly subcutaneous degarelix. Randomisation was done centrally through minimisation with a random element, stratified by Gleason score, positive margins, radiotherapy timing, planned radiotherapy schedule, and planned type of ADT, in a computerised system. The allocated treatment was not masked. The primary outcome measure was metastasis-free survival, defined as distant metastasis arising from prostate cancer or death from any cause. Standard survival analysis methods were used, accounting for randomisation stratification factors. The trial had 80% power with two-sided α of 5% to detect an absolute increase in 10-year metastasis-free survival from 80% to 86% (hazard ratio [HR] 0·67). Analyses followed the intention-to-treat principle. The trial is registered with the ISRCTN registry, ISRCTN40814031, and ClinicalTrials.gov, NCT00541047. Findings: Between Nov 22, 2007, and June 29, 2015, 1480 patients (median age 66 years [IQR 61–69]) were randomly assigned to receive no ADT (n=737) or short-course ADT (n=743) in addition to postoperative radiotherapy at 121 centres in Canada, Denmark, Ireland, and the UK. With a median follow-up of 9·0 years (IQR 7·1–10·1), metastasis-free survival events were reported for 268 participants (142 in the no ADT group and 126 in the short-course ADT group; HR 0·886 [95% CI 0·688–1·140], p=0·35). 10-year metastasis-free survival was 79·2% (95% CI 75·4–82·5) in the no ADT group and 80·4% (76·6–83·6) in the short-course ADT group. Toxicity of grade 3 or higher was reported for 121 (17%) of 737 participants in the no ADT group and 100 (14%) of 743 in the short-course ADT group (p=0·15), with no treatment-related deaths. Interpretation: Metastatic disease is uncommon following postoperative prostate bed radiotherapy after radical prostatectomy. Adding 6 months of ADT to this radiotherapy did not improve metastasis-free survival compared with no ADT. These findings do not support the use of short-course ADT with postoperative radiotherapy in this patient population.
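    The design hazard ratio of 0·67 follows directly from the two 10-year survival targets under the proportional-hazards assumption, since HR = ln S₁(t) / ln S₀(t):

```latex
\[
\mathrm{HR}
= \frac{\ln S_{\text{short-course ADT}}(10)}{\ln S_{\text{no ADT}}(10)}
= \frac{\ln 0.86}{\ln 0.80}
= \frac{-0.151}{-0.223}
\approx 0.67
\]
```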

    Duration of androgen deprivation therapy with postoperative radiotherapy for prostate cancer: a comparison of long-course versus short-course androgen deprivation therapy in the RADICALS-HD randomised trial

    Background: Previous evidence supports androgen deprivation therapy (ADT) with primary radiotherapy as initial treatment for intermediate-risk and high-risk localised prostate cancer. However, the use and optimal duration of ADT with postoperative radiotherapy after radical prostatectomy remain uncertain. Methods: RADICALS-HD was a randomised controlled trial of ADT duration within the RADICALS protocol. Here, we report on the comparison of short-course versus long-course ADT. Key eligibility criteria were an indication for radiotherapy after previous radical prostatectomy for prostate cancer, prostate-specific antigen less than 5 ng/mL, absence of metastatic disease, and written consent. Participants were randomly assigned (1:1) to add 6 months of ADT (short-course ADT) or 24 months of ADT (long-course ADT) to radiotherapy, using subcutaneous gonadotrophin-releasing hormone analogue (monthly in the short-course ADT group and 3-monthly in the long-course ADT group), daily oral bicalutamide monotherapy 150 mg, or monthly subcutaneous degarelix. Randomisation was done centrally through minimisation with a random element, stratified by Gleason score, positive margins, radiotherapy timing, planned radiotherapy schedule, and planned type of ADT, in a computerised system. The allocated treatment was not masked. The primary outcome measure was metastasis-free survival, defined as metastasis arising from prostate cancer or death from any cause. The comparison had more than 80% power with two-sided α of 5% to detect an absolute increase in 10-year metastasis-free survival from 75% to 81% (hazard ratio [HR] 0·72). Standard time-to-event analyses were used. Analyses followed the intention-to-treat principle. The trial is registered with the ISRCTN registry, ISRCTN40814031, and ClinicalTrials.gov, NCT00541047. Findings: Between Jan 30, 2008, and July 7, 2015, 1523 patients (median age 65 years [IQR 60–69]) were randomly assigned to receive short-course ADT (n=761) or long-course ADT (n=762) in addition to postoperative radiotherapy at 138 centres in Canada, Denmark, Ireland, and the UK. With a median follow-up of 8·9 years (IQR 7·0–10·0), 313 metastasis-free survival events were reported overall (174 in the short-course ADT group and 139 in the long-course ADT group; HR 0·773 [95% CI 0·612–0·975]; p=0·029). 10-year metastasis-free survival was 71·9% (95% CI 67·6–75·7) in the short-course ADT group and 78·1% (74·2–81·5) in the long-course ADT group. Toxicity of grade 3 or higher was reported for 105 (14%) of 753 participants in the short-course ADT group and 142 (19%) of 757 participants in the long-course ADT group (p=0·025), with no treatment-related deaths. Interpretation: Compared with adding 6 months of ADT, adding 24 months of ADT improved metastasis-free survival in people receiving postoperative radiotherapy. For individuals who can accept the additional duration of adverse effects, long-course ADT should be offered with postoperative radiotherapy. Funding: Cancer Research UK, UK Research and Innovation (formerly Medical Research Council), and Canadian Cancer Society.
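    As in the companion comparison above, the design hazard ratio can be recovered from the survival targets under proportional hazards; with the rounded targets quoted here, the ratio comes out near the protocol's stated 0·72:

```latex
\[
\mathrm{HR}
= \frac{\ln S_{\text{long-course ADT}}(10)}{\ln S_{\text{short-course ADT}}(10)}
= \frac{\ln 0.81}{\ln 0.75}
= \frac{-0.211}{-0.288}
\approx 0.73
\]
```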

    Investigation of the In-Vivo Cytotoxicity and the In Silico-Prediction of MDM2-p53 Inhibitor Potential of Euphorbia peplus Methanolic Extract in Rats

    This study explored the probable in vivo cardiac and renal toxicities, together with in silico approaches for predicting the apoptogenic potential, of Euphorbia peplus methanolic extract (EPME) in rats. Cardiac and renal injury biomarkers were estimated, with histopathological and immunohistochemical evaluations of both kidney and heart. The probable underlying mechanism by which E. peplus compounds potentiate p53 activity was examined using Molecular Operating Environment (MOE) docking software and validated experimentally by immunohistochemical localization of p53 protein in kidney and heart tissues. Gas chromatography/mass spectrometry analysis of E. peplus revealed the presence of nine different compounds, dominated by di-(2-ethylhexyl) phthalate (DEHP). Significant elevations of troponin, creatine phosphokinase, creatine kinase-myocardium bound, lactate dehydrogenase, aspartate transaminase, alkaline phosphatase, urea, creatinine, and uric acid were evident in the EPME-treated rats, which also showed strong renal and cardiac p53 expression and moderate cardiac TNF-α expression. Furthermore, our in silico results predicted higher affinity and good inhibition of DEHP, glyceryl linolenate, and lucenin-2 at the MDM2-p53 interface compared with the standard reference compound 15a. In conclusion, long-term EPME exposure could adversely affect cardiac and renal tissues, probably owing to its inflammatory and apoptotic activity. Moreover, the in silico study suggests that EPME inhibits MDM2-mediated degradation of p53, indicating possible anticancer potential, which was supported experimentally by strong p53 expression in renal and cardiac tissues.