
    Oral mucosal lesions in skin diseased patients attending a dermatologic clinic: a cross-sectional study in Sudan

    Background: So far there have been no studies focusing on the prevalence of a wide spectrum of oral mucosal lesions (OML) in patients with dermatologic diseases. This is noteworthy because skin lesions are strongly associated with oral lesions, which could easily be neglected by dentists. This study aimed to estimate the frequency and socio-behavioural correlates of OML in skin diseased patients attending the outpatient facility of Khartoum Teaching Hospital - Dermatology Clinic, Sudan.
    Methods: A cross-sectional hospital-based study was conducted in Khartoum from October 2008 to January 2009. A total of 588 patients (mean age 37.2 ± 16 years, 50.3% females) completed an oral examination and a personal interview, of whom 544 patients (mean age 37.1 ± 15.9 years, 50% females) with a confirmed skin disease diagnosis were included for further analyses. OML were recorded using World Health Organization (WHO) criteria. Biopsy and smear were used as adjuvant techniques for confirmation. Data were analysed using the Statistical Package for Social Sciences (version 15.0.1). Cross tabulation and the chi-square test with Fisher's exact test were used.
    Results: A total of 438 OML were registered in 315 (57.9%; males: 54.6% versus females: 45.6%, p < 0.05) skin diseased patients; thus, a certain number of patients had more than one type of OML. Tongue lesions were the most frequently diagnosed OML (23.3%), followed in descending order by white lesions (19.1%), red and blue lesions (11%) and vesiculobullous diseases (6%). The frequency of OML in the various skin disease categories was: vesiculobullous reaction pattern (72.2%), lichenoid reaction pattern (60.5%), infectious lesions (56.5%), psoriasiform reaction pattern (56.7%) and spongiotic reaction pattern (46.8%). Presence of OML in skin diseased patients was most frequent in older age groups (62.4% older versus 52.7% younger, p < 0.05), in males (63.2% males versus 52.6% females, p < 0.05), in patients with a systemic disease (65.2% with versus 51.9% without systemic disease, p < 0.05) and among current users of smokeless tobacco (toombak) (77% current use versus 54.8% no use, p < 0.001).
    Conclusions: OML were frequently diagnosed in skin diseased patients and varied systematically with age, gender, systemic condition and use of toombak. The high prevalence of OML emphasizes the importance of routine examination of the oral mucosa in a dermatology clinic.
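
    The gender comparison reported above can be reproduced approximately from the abstract's own figures: roughly 272 patients of each sex, with 63.2% of males and 52.6% of females having OML, giving about 172 and 143 affected patients respectively. Below is a minimal sketch of the cross-tabulation and chi-square/Fisher's exact analysis named in the Methods, written in Python with SciPy and using these reconstructed counts; the exact counts in the study may differ slightly.

        import numpy as np
        from scipy.stats import chi2_contingency, fisher_exact

        # 2x2 table reconstructed from the reported percentages (approximate):
        # rows = males, females; columns = OML present, OML absent
        table = np.array([[172, 100],
                          [143, 129]])

        chi2, p_chi2, dof, expected = chi2_contingency(table)
        odds_ratio, p_fisher = fisher_exact(table)

        print(f"chi-square = {chi2:.2f}, p = {p_chi2:.3f}")            # p < 0.05, consistent with the abstract
        print(f"Fisher's exact OR = {odds_ratio:.2f}, p = {p_fisher:.3f}")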

    Systemic chemotherapy with or without cetuximab in patients with resectable colorectal liver metastasis (New EPOC): long-term results of a multicentre, randomised, controlled, phase 3 trial.

    BACKGROUND: The interim analysis of the multicentre New EPOC trial in patients with resectable colorectal liver metastasis showed a significant reduction in progression-free survival in patients allocated to cetuximab plus chemotherapy compared with those given chemotherapy alone. The focus of the present analysis was to assess the effect on overall survival. METHODS: New EPOC was a multicentre, open-label, randomised, controlled, phase 3 trial. Adult patients (aged ≥18 years) with KRAS wild-type (codons 12, 13, and 61) resectable or suboptimally resectable colorectal liver metastases and a WHO performance status of 0-2 were randomly assigned (1:1) to receive chemotherapy with or without cetuximab before and after liver resection. Randomisation was done centrally with minimisation factors of surgical centre, poor prognosis cancer, and previous adjuvant treatment with oxaliplatin. Chemotherapy consisted of oxaliplatin 85 mg/m2 administered intravenously over 2 h, l-folinic acid (175 mg flat dose administered intravenously over 2 h) or d,l-folinic acid (350 mg flat dose administered intravenously over 2 h), and fluorouracil bolus 400 mg/m2 administered intravenously over 5 min, followed by a 46 h infusion of fluorouracil 2400 mg/m2 repeated every 2 weeks (regimen one), or oxaliplatin 130 mg/m2 administered intravenously over 2 h and oral capecitabine 1000 mg/m2 twice daily on days 1-14 repeated every 3 weeks (regimen two). Patients who had received adjuvant oxaliplatin could receive irinotecan 180 mg/m2 intravenously over 30 min with fluorouracil instead of oxaliplatin (regimen three). Cetuximab was given intravenously, 500 mg/m2 every 2 weeks with regimens one and three, or as a loading dose of 400 mg/m2 followed by a weekly infusion of 250 mg/m2 with regimen two. The primary endpoint of progression-free survival was published previously. Secondary endpoints were overall survival, preoperative response, pathological resection status, and safety. Trial recruitment was halted prematurely on the advice of the Trial Steering Committee on Nov 1, 2012. All analyses (except safety) were done on the intention-to-treat population. Safety analyses included all randomly assigned patients. This trial is registered with ISRCTN, number 22944367. FINDINGS: Between Feb 26, 2007, and Oct 12, 2012, 257 eligible patients were randomly assigned to chemotherapy with cetuximab (n=129) or without cetuximab (n=128). This analysis was carried out 5 years after the last patient was recruited, as defined in the protocol, at a median follow-up of 66·7 months (IQR 58·0-77·5). Median progression-free survival was 22·2 months (95% CI 18·3-26·8) in the chemotherapy alone group and 15·5 months (13·8-19·0) in the chemotherapy plus cetuximab group (hazard ratio [HR] 1·17, 95% CI 0·87-1·56; p=0·304). Median overall survival was 81·0 months (59·6 to not reached) in the chemotherapy alone group and 55·4 months (43·5-71·5) in the chemotherapy plus cetuximab group (HR 1·45, 1·02-2·05; p=0·036). There was no significant difference in the secondary outcomes of preoperative response or pathological resection status between groups. Five deaths might have been treatment-related (one in the chemotherapy alone group and four in the chemotherapy plus cetuximab group). 
The most common grade 3-4 adverse events reported were: neutrophil count decreased (26 [19%] of 134 in the chemotherapy alone group vs 21 [15%] of 137 in the chemotherapy plus cetuximab group), diarrhoea (13 [10%] vs 14 [10%]), skin rash (one [1%] vs 22 [16%]), thromboembolic events (ten [7%] vs 11 [8%]), lethargy (ten [7%] vs nine [7%]), oral mucositis (three [2%] vs 14 [10%]), vomiting (seven [5%] vs seven [5%]), peripheral neuropathy (eight [6%] vs five [4%]), and pain (six [4%] vs six [4%]). INTERPRETATION: Although the addition of cetuximab to chemotherapy improves the overall survival in some studies in patients with advanced, inoperable metastatic disease, its use in the perioperative setting in patients with operable disease confers a significant disadvantage in terms of overall survival. Cetuximab should not be used in this setting. FUNDING: Cancer Research UK
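
    The medians and hazard ratios above come from standard time-to-event analyses. Below is a minimal sketch, assuming nothing beyond the Kaplan-Meier definition, of how a median survival time is read off the survival curve; the follow-up data are synthetic, and the trial's own modelling and intention-to-treat analysis are not reproduced here.

        import numpy as np

        def km_median(times, events):
            """Kaplan-Meier estimate of median survival time.
            times  : follow-up time (months); events : 1 = event observed, 0 = censored.
            Returns the earliest time at which the survival curve falls to 0.5 or below."""
            order = np.argsort(times)
            times, events = np.asarray(times)[order], np.asarray(events)[order]
            n_at_risk, surv = len(times), 1.0
            for t, e in zip(times, events):
                if e == 1:
                    surv *= 1 - 1 / n_at_risk
                n_at_risk -= 1
                if surv <= 0.5:
                    return t
            return float("inf")  # median not reached within follow-up

        # Synthetic follow-up data (months, event indicator) -- illustrative only
        times  = [6, 9, 12, 15, 18, 22, 24, 30, 36, 40]
        events = [1, 1,  0,  1,  1,  1,  0,  1,  0,  1]
        print("median survival (months):", km_median(times, events))

    A return value of infinity corresponds to the "not reached" medians quoted in the findings above.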

    Global economic burden of unmet surgical need for appendicitis

    Background: There is a substantial gap in provision of adequate surgical care in many low- and middle-income countries. This study aimed to identify the economic burden of unmet surgical need for the common condition of appendicitis. Methods: Data on the incidence of appendicitis from 170 countries and two different approaches were used to estimate numbers of patients who do not receive surgery: as a fixed proportion of the total unmet surgical need per country (approach 1); and based on country income status (approach 2). Indirect costs with current levels of access and local quality, and those if quality were at the standards of high-income countries, were estimated. A human capital approach was applied, focusing on the economic burden resulting from premature death and absenteeism. Results: Excess mortality was 4185 per 100 000 cases of appendicitis using approach 1 and 3448 per 100 000 using approach 2. The economic burden of continuing current levels of access and local quality was US $92 492 million using approach 1 and US $73 141 million using approach 2. The economic burden of not providing surgical care to the standards of high-income countries was US $95 004 million using approach 1 and US $75 666 million using approach 2. The largest share of these costs resulted from premature death (97.7 per cent) and lack of access (97.0 per cent), in contrast to lack of quality. Conclusion: For a comparatively non-complex emergency condition such as appendicitis, increasing access to care should be prioritized. Although improving quality of care should not be neglected, increasing provision of care at current standards could reduce societal costs substantially.
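
    The human capital approach mentioned in the Methods values premature deaths by the economic output those individuals would otherwise have produced. The sketch below is purely illustrative arithmetic: only the excess-mortality rates are taken from the abstract, while the case volume, productive years lost per death and annual output per person are hypothetical placeholders.

        # Hedged illustration of a human capital estimate of the burden from premature death.
        excess_mortality_per_100k = {"approach 1": 4185, "approach 2": 3448}  # from the abstract

        untreated_cases = 1_500_000           # hypothetical annual appendicitis cases not receiving surgery
        productive_years_lost_per_death = 30  # hypothetical average
        annual_output_usd = 5_000             # hypothetical annual economic output per person (USD)

        for approach, rate in excess_mortality_per_100k.items():
            deaths = untreated_cases * rate / 100_000
            burden = deaths * productive_years_lost_per_death * annual_output_usd
            print(f"{approach}: {deaths:,.0f} excess deaths, ~${burden / 1e6:,.0f} million in lost output")

    The study's estimates also account for absenteeism, which this sketch omits.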

    Surgical site infection after gastrointestinal surgery in high-income, middle-income, and low-income countries: a prospective, international, multicentre cohort study

    Background: Surgical site infection (SSI) is one of the most common infections associated with health care, but its importance as a global health priority is not fully understood. We quantified the burden of SSI after gastrointestinal surgery in countries in all parts of the world. Methods: This international, prospective, multicentre cohort study included consecutive patients undergoing elective or emergency gastrointestinal resection within 2-week time periods at any health-care facility in any country. Countries with participating centres were stratified into high-income, middle-income, and low-income groups according to the UN's Human Development Index (HDI). Data variables from the GlobalSurg 1 study and other studies that have been found to affect the likelihood of SSI were entered into risk adjustment models. The primary outcome measure was the 30-day SSI incidence (defined by US Centers for Disease Control and Prevention criteria for superficial and deep incisional SSI). Relationships with explanatory variables were examined using Bayesian multilevel logistic regression models. This trial is registered with ClinicalTrials.gov, number NCT02662231. Findings: Between Jan 4, 2016, and July 31, 2016, 13 265 records were submitted for analysis. 12 539 patients from 343 hospitals in 66 countries were included. 7339 (58·5%) patients were from high-HDI countries (193 hospitals in 30 countries), 3918 (31·2%) patients were from middle-HDI countries (82 hospitals in 18 countries), and 1282 (10·2%) patients were from low-HDI countries (68 hospitals in 18 countries). In total, 1538 (12·3%) patients had SSI within 30 days of surgery. The incidence of SSI varied between countries with high (691 [9·4%] of 7339 patients), middle (549 [14·0%] of 3918 patients), and low (298 [23·2%] of 1282 patients) HDI (p < 0·001). The highest SSI incidence in each HDI group was after dirty surgery (102 [17·8%] of 574 patients in high-HDI countries; 74 [31·4%] of 236 patients in middle-HDI countries; 72 [39·8%] of 181 patients in low-HDI countries). Following risk factor adjustment, patients in low-HDI countries were at greatest risk of SSI (adjusted odds ratio 1·60, 95% credible interval 1·05–2·37; p=0·030). 132 (21·6%) of 610 patients with an SSI and a microbiology culture result had an infection that was resistant to the prophylactic antibiotic used. Resistant infections were detected in 49 (16·6%) of 295 patients in high-HDI countries, in 37 (19·8%) of 187 patients in middle-HDI countries, and in 46 (35·9%) of 128 patients in low-HDI countries (p < 0·001). Interpretation: Countries with a low HDI carry a disproportionately greater burden of SSI than countries with a middle or high HDI and might have higher rates of antibiotic resistance. In view of WHO recommendations on SSI prevention that highlight the absence of high-quality interventional research, urgent, pragmatic, randomised trials based in LMICs are needed to assess measures aiming to reduce this preventable complication.
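
    The adjusted odds ratio above comes from Bayesian multilevel logistic regression, which is more than a short sketch can show, but the Bayesian flavour can be illustrated with a much simpler conjugate model: a Beta posterior for the crude SSI rate in each HDI group, using the counts reported in the abstract. This is an unadjusted simplification for illustration, not the study's model.

        import numpy as np
        from scipy.stats import beta

        # SSI events / patients per HDI group, as reported in the abstract
        groups = {"high HDI": (691, 7339), "middle HDI": (549, 3918), "low HDI": (298, 1282)}

        rng = np.random.default_rng(0)
        samples = {}
        for name, (events, n) in groups.items():
            # Beta(1, 1) prior + binomial likelihood -> Beta(1 + events, 1 + n - events) posterior
            post = beta(1 + events, 1 + n - events)
            lo, hi = post.ppf([0.025, 0.975])
            print(f"{name}: SSI rate {events / n:.1%} (95% credible interval {lo:.1%}-{hi:.1%})")
            samples[name] = post.rvs(100_000, random_state=rng)

        # Crude (unadjusted) posterior odds ratio, low- versus high-HDI countries
        odds = {k: s / (1 - s) for k, s in samples.items()}
        or_samples = odds["low HDI"] / odds["high HDI"]
        print(f"crude OR, low vs high HDI: {np.median(or_samples):.2f} "
              f"(95% CrI {np.percentile(or_samples, 2.5):.2f}-{np.percentile(or_samples, 97.5):.2f})")

    The crude odds ratio is larger than the adjusted estimate of 1·60 reported above, as expected when case mix is not accounted for.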

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) compared with that in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
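
    The mortality association above was estimated with multivariable logistic regression and bootstrapped simulation. The sketch below shows only the bootstrap idea: resampling a binary outcome in two groups to place a percentile confidence interval around a crude odds ratio. The counts are hypothetical, and the study's models additionally adjusted for patient and disease factors.

        import numpy as np

        rng = np.random.default_rng(42)

        # Hypothetical 30-day deaths / patients with and without checklist use (not study data)
        deaths_checklist, n_checklist = 120, 3000
        deaths_no_checklist, n_no_checklist = 170, 2500

        def crude_or(d1, n1, d0, n0):
            return (d1 / (n1 - d1)) / (d0 / (n0 - d0))

        # Represent each group as a 0/1 outcome vector and bootstrap the odds ratio
        grp1 = np.repeat([1, 0], [deaths_checklist, n_checklist - deaths_checklist])
        grp0 = np.repeat([1, 0], [deaths_no_checklist, n_no_checklist - deaths_no_checklist])

        boot = []
        for _ in range(5000):
            s1 = rng.choice(grp1, size=grp1.size, replace=True)
            s0 = rng.choice(grp0, size=grp0.size, replace=True)
            boot.append(crude_or(s1.sum(), s1.size, s0.sum(), s0.size))

        point = crude_or(deaths_checklist, n_checklist, deaths_no_checklist, n_no_checklist)
        lo, hi = np.percentile(boot, [2.5, 97.5])
        print(f"crude OR {point:.2f} (95% bootstrap CI {lo:.2f}-{hi:.2f})")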

    Risk of Bowel Obstruction in Patients Undergoing Neoadjuvant Chemotherapy for High-risk Colon Cancer

    Objective: This study aimed to identify risk criteria available before the point of treatment initiation that can be used to stratify the risk of obstruction in patients undergoing neoadjuvant chemotherapy (NAC) for high-risk colon cancer. Background: Global implementation of NAC for colon cancer, informed by the FOxTROT trial, may increase the risk of bowel obstruction. Methods: A case-control study nested within an international randomized controlled trial (FOxTROT; ClinicalTrials.gov: NCT00647530). Patients with high-risk operable colon cancer (radiologically staged T3-4 N0-2 M0) who were randomized to NAC and developed large bowel obstruction were identified. First, clinical outcomes were compared between patients receiving NAC in FOxTROT who did and did not develop obstruction. Second, obstructed patients (cases) were age-matched and sex-matched with patients who did not develop obstruction (controls) in a 1:3 ratio using random sampling. Bayesian conditional mixed-effects logistic regression modeling was used to explore clinical, radiologic, and pathologic features associated with obstruction. The absolute risk of obstruction based on the presence or absence of risk criteria was estimated for all patients receiving NAC. Results: Of 1053 patients randomized in FOxTROT, 699 received NAC, of whom 30 (4.3%) developed obstruction. Patients underwent care in European hospitals, including 88 UK, 7 Danish, and 3 Swedish centers. There was more open surgery (65.4% vs 38.0%, P=0.01) and a higher pR1 rate in obstructed patients (12.0% vs 3.8%, P=0.004), but otherwise comparable postoperative outcomes. In the case-control–matched Bayesian model, 2 independent risk criteria were identified: (1) obstructing disease on endoscopy and/or being unable to pass through the tumor [adjusted odds ratio: 9.09, 95% credible interval: 2.34–39.66] and (2) stricturing disease on radiology or endoscopy (odds ratio: 7.18, 95% CI: 1.84–32.34). Three risk groups were defined according to the presence or absence of these criteria: 63.4% (443/698) of patients were at very low risk (10%). Conclusions: Safe selection for NAC for colon cancer can be informed by 2 features that are available before treatment initiation, which identify a small number of patients with a high risk of preoperative obstruction.
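
    Cases in this analysis were age- and sex-matched to controls in a 1:3 ratio by random sampling. The sketch below shows one way such matching could be done with pandas on a hypothetical patient table; the column names, age banding and exact-matching rule are assumptions for illustration, and the Bayesian conditional regression that followed is not shown.

        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(1)

        # Hypothetical NAC population: one row per patient
        n = 699
        patients = pd.DataFrame({
            "patient_id": range(n),
            "sex": rng.choice(["M", "F"], size=n),
            "age_band": rng.choice(["<50", "50-59", "60-69", "70+"], size=n),
            "obstructed": False,
        })
        patients.loc[rng.choice(n, size=30, replace=False), "obstructed"] = True  # 30 cases

        matched = []
        for _, case in patients[patients["obstructed"]].iterrows():
            pool = patients[(~patients["obstructed"])
                            & (patients["sex"] == case["sex"])
                            & (patients["age_band"] == case["age_band"])]
            take = pool.sample(n=min(3, len(pool)))  # draw up to 3 controls at random (1:3 matching)
            matched.append(take.assign(matched_case=case["patient_id"]))

        controls = pd.concat(matched)
        print(patients["obstructed"].sum(), "cases matched to", len(controls), "controls")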

    Global variation in anastomosis and end colostomy formation following left-sided colorectal resection

    Background End colostomy rates following colorectal resection vary across institutions in high-income settings, being influenced by patient, disease, surgeon and system factors. This study aimed to assess global variation in end colostomy rates after left-sided colorectal resection. Methods This study comprised an analysis of GlobalSurg-1 and -2 international, prospective, observational cohort studies (2014, 2016), including consecutive adult patients undergoing elective or emergency left-sided colorectal resection within discrete 2-week windows. Countries were grouped into high-, middle- and low-income tertiles according to the United Nations Human Development Index (HDI). Factors associated with colostomy formation versus primary anastomosis were explored using a multilevel, multivariable logistic regression model. Results In total, 1635 patients from 242 hospitals in 57 countries undergoing left-sided colorectal resection were included: 113 (6·9 per cent) from low-HDI, 254 (15·5 per cent) from middle-HDI and 1268 (77·6 per cent) from high-HDI countries. There was a higher proportion of patients with perforated disease (57·5, 40·9 and 35·4 per cent; P < 0·001) and subsequent use of end colostomy (52·2, 24·8 and 18·9 per cent; P < 0·001) in low- compared with middle- and high-HDI settings. The association with colostomy use in low-HDI settings persisted (odds ratio (OR) 3·20, 95 per cent c.i. 1·35 to 7·57; P = 0·008) after risk adjustment for malignant disease (OR 2·34, 1·65 to 3·32; P < 0·001), emergency surgery (OR 4·08, 2·73 to 6·10; P < 0·001), time to operation at least 48 h (OR 1·99, 1·28 to 3·09; P = 0·002) and disease perforation (OR 4·00, 2·81 to 5·69; P < 0·001). Conclusion Global differences existed in the proportion of patients receiving end stomas after left-sided colorectal resection based on income, which went beyond case mix alone
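
    The crude contrast behind the adjusted odds ratio above can be reconstructed approximately from the reported proportions: end colostomy in about 59 of 113 low-HDI patients (52·2 per cent) versus about 240 of 1268 high-HDI patients (18·9 per cent). The sketch below computes that crude odds ratio with a Woolf (log-normal) 95 per cent confidence interval; the counts are reconstructed rather than taken from the dataset, and the study's multilevel model additionally adjusted for malignancy, emergency surgery, delay to operation and perforation.

        import math

        def odds_ratio_woolf(a, b, c, d):
            """Crude odds ratio and Woolf 95% CI for a 2x2 table:
            exposed: a events, b non-events; unexposed: c events, d non-events."""
            or_ = (a * d) / (b * c)
            se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # standard error of log(OR)
            return or_, math.exp(math.log(or_) - 1.96 * se), math.exp(math.log(or_) + 1.96 * se)

        # Counts reconstructed (approximately) from the abstract's percentages
        low_colostomy, low_anastomosis = 59, 54        # 52.2% of 113 low-HDI patients
        high_colostomy, high_anastomosis = 240, 1028   # 18.9% of 1268 high-HDI patients

        or_, lo, hi = odds_ratio_woolf(low_colostomy, low_anastomosis, high_colostomy, high_anastomosis)
        print(f"crude OR (low vs high HDI) = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")

    The crude estimate is larger than the adjusted odds ratio of 3·20 reported above, illustrating how much of the difference is absorbed by case-mix adjustment.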

    The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance

    INTRODUCTION Investment in Africa over the past year with regard to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing has led to a massive increase in the number of sequences, which, to date, exceeds 100,000 sequences generated to track the pandemic on the continent. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic. RATIONALE We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we also examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs). RESULTS Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing and that this is coupled with a shorter turnaround time from the time of sampling to sequence submission. Ongoing evolution necessitated the continual updating of primer sets, and, as a result, eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection that were each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, when considering both Africa versus the rest of the world and viral dissemination within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, for instance, recognizing the limitations of low testing proportions. We also highlight the early warning capacity that genomic surveillance in Africa has had for the rest of the world with the detection of new lineages and variants, the most recent being the characterization of various Omicron subvariants. CONCLUSION Sustained investment for diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent but also because it can be used as a platform to help address the many emerging and reemerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized because this is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing for the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century