
    Genetic and Morphological Diversity Assessment of Five Kalanchoe Genotypes by SCoT, ISSR and RAPD-PCR Markers

    Determining the appropriate parents for breeding programs is the most important decision plant breeders must make to maximize genetic variability and produce excellent recombinant genotypes. Several methods are used to identify genotypes with desirable phenotypic features for breeding experiments. In this study, five kalanchoe genotypes were characterized morphologically by assessing plant height, number of inflorescences, number of flowers, flower length, flower diameter and number of petals. The analysis showed that yellow kalanchoe was distinguished by plant height, orange kalanchoe by number of inflorescences, number of flowers and flower length, and violet kalanchoe by the largest flower diameter and the highest number of petals. Molecular profiling was performed with random amplified polymorphic DNA (RAPD), inter-simple sequence repeat (ISSR) and start codon targeted (SCoT) polymerase chain reaction (PCR) markers. Genomic DNA was extracted from young leaves, and the PCR reactions were performed using ten primers for each of the SCoT, ISSR and RAPD markers. Only four of the ten primers produced amplicon profiles in all PCR marker systems. A total of 70 bands were generated by SCoT, ISSR and RAPD-PCR, of which 35 were polymorphic and 35 monomorphic. The total numbers of bands for RAPD, ISSR and SCoT were 15, 17 and 38, respectively, and the polymorphism percentages achieved were 60.25%, 15% and 57%, respectively. Cluster analysis based on morphological data revealed two clusters: cluster I consisted of violet and orange kalanchoe, and cluster II comprised red, yellow and purple kalanchoe. Cluster analysis based on molecular data revealed three clusters: cluster I included only yellow kalanchoe, cluster II comprised orange and violet kalanchoe, and cluster III comprised red and purple kalanchoe. The study concluded that orange, violet and yellow kalanchoe are promising parents for breeding economically valuable traits in kalanchoe, and that SCoT and RAPD markers produced reliable banding patterns for assessing genetic polymorphism among kalanchoe genotypes, which forms the cornerstone of genetic improvement in ornamental plants.
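The polymorphism percentages reported above are simple band-count ratios. A minimal sketch of the calculation (the function name is illustrative, not from the study):

```python
def polymorphism_pct(polymorphic_bands: int, total_bands: int) -> float:
    """Percentage of polymorphic bands: (polymorphic / total) * 100."""
    if total_bands <= 0:
        raise ValueError("total_bands must be positive")
    return 100.0 * polymorphic_bands / total_bands

# Overall figures reported above: 35 polymorphic bands out of 70 total.
print(polymorphism_pct(35, 70))  # → 50.0
```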

    Global overview of the management of acute cholecystitis during the COVID-19 pandemic (CHOLECOVID study)

    Background: This study provides a global overview of the management of patients with acute cholecystitis during the initial phase of the COVID-19 pandemic. Methods: CHOLECOVID is an international, multicentre, observational comparative study of patients admitted to hospital with acute cholecystitis during the COVID-19 pandemic. Data on management were collected for a 2-month study interval coincident with the WHO declaration of the SARS-CoV-2 pandemic and compared with an equivalent pre-pandemic interval. Mediation analysis examined the influence of SARS-CoV-2 infection on 30-day mortality. Results: This study collected data on 9783 patients with acute cholecystitis admitted to 247 hospitals across the world. The pandemic was associated with reduced availability of surgical workforce and operating facilities globally, a significant shift towards more severe disease, and increased use of conservative management. There was a reduction (both absolute and proportionate) in the number of patients undergoing cholecystectomy, from 3095 patients (56.2 per cent) pre-pandemic to 1998 patients (46.2 per cent) during the pandemic, but there was no difference in 30-day all-cause mortality after cholecystectomy between the pre-pandemic and pandemic intervals (13 patients (0.4 per cent) pre-pandemic versus 13 patients (0.6 per cent) during the pandemic; P = 0.355). In mediation analysis, admission with acute cholecystitis during the pandemic was associated with a non-significant increase in the risk of death (OR 1.29, 95 per cent c.i. 0.93 to 1.79, P = 0.121). Conclusion: CHOLECOVID provides a unique overview of the treatment of patients with cholecystitis across the globe during the first months of the SARS-CoV-2 pandemic. The study highlights the need for system resilience in the retention of elective surgical activity. Cholecystectomy was associated with a low risk of mortality, and deferral of treatment resulted in an increase in avoidable morbidity that represents the non-COVID cost of this pandemic.

    Global economic burden of unmet surgical need for appendicitis

    Background: There is a substantial gap in the provision of adequate surgical care in many low- and middle-income countries. This study aimed to identify the economic burden of unmet surgical need for the common condition of appendicitis. Methods: Data on the incidence of appendicitis from 170 countries and two different approaches were used to estimate the number of patients who do not receive surgery: as a fixed proportion of the total unmet surgical need per country (approach 1); and based on country income status (approach 2). Indirect costs with current levels of access and local quality, and those if quality were at the standards of high-income countries, were estimated. A human capital approach was applied, focusing on the economic burden resulting from premature death and absenteeism. Results: Excess mortality was 4185 per 100 000 cases of appendicitis using approach 1 and 3448 per 100 000 using approach 2. The economic burden of continuing current levels of access and local quality was US $92 492 million using approach 1 and US $73 141 million using approach 2. The economic burden of not providing surgical care to the standards of high-income countries was US $95 004 million using approach 1 and US $75 666 million using approach 2. The largest share of these costs resulted from premature death (97.7 per cent) and lack of access (97.0 per cent), in contrast to lack of quality. Conclusion: For a comparatively non-complex emergency condition such as appendicitis, increasing access to care should be prioritized. Although improving quality of care should not be neglected, increasing provision of care at current standards could reduce societal costs substantially.
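The excess-mortality figures above are rates per 100 000 cases. The normalization can be sketched as follows (the helper name is an illustrative assumption; the only figure taken from the abstract is the approach-1 rate of 4185):

```python
def excess_per_100k(excess_deaths: float, cases: float) -> float:
    """Express excess deaths as a rate per 100 000 cases."""
    if cases <= 0:
        raise ValueError("cases must be positive")
    return excess_deaths / cases * 100_000

# By construction, 4185 excess deaths in 100 000 cases reproduces
# the approach-1 rate reported above.
print(excess_per_100k(4185, 100_000))  # → 4185.0
```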

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background: The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods: In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results: Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) than in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference −9.4 (95 per cent c.i. −11.9 to −6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion: Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
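For readers unfamiliar with the odds-ratio scale used above, a crude (unadjusted) OR can be computed directly from a 2×2 table of checklist use. Note that the published ORs are risk-adjusted, so the crude value differs slightly; the counts below are taken from the abstract:

```python
def crude_odds_ratio(a: int, b: int, c: int, d: int) -> float:
    """Crude odds ratio for a 2x2 table:
        comparison group: a events, b non-events
        reference group:  c events, d non-events
    OR = (a / b) / (c / d) = (a * d) / (b * c).
    """
    return (a * d) / (b * c)

# Checklist use, middle-HDI versus high-HDI (reference), from the abstract:
# middle-HDI: 753 used, 489 did not; high-HDI: 2455 used, 286 did not.
print(round(crude_odds_ratio(753, 489, 2455, 286), 2))  # → 0.18 (adjusted OR: 0.17)
```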

    Burnout among surgeons before and during the SARS-CoV-2 pandemic: an international survey

    Background: The SARS-CoV-2 pandemic has had many significant impacts within the surgical realm, and surgeons have been obliged to reconsider almost every aspect of daily clinical practice. Methods: This is a cross-sectional study reported in compliance with the CHERRIES guidelines and conducted through an online platform from June 14th to July 15th, 2020. The primary outcome was the burden of burnout during the pandemic, indicated by the validated Shirom-Melamed Burnout Measure. Results: Nine hundred and fifty-four surgeons completed the survey. The median length of practice was 10 years; 78.2% of those included were male, with a median age of 37 years; 39.5% were consultants, 68.9% were general surgeons, and 55.7% were affiliated with an academic institution. Overall, there was a significant increase in the mean burnout score during the pandemic; longer years of practice and older age were significantly associated with less burnout. There were significant reductions in the median number of outpatient visits, operated cases, on-call hours, emergency visits, and research work, and 48.2% of respondents felt that the training resources were insufficient. The majority (81.3%) of respondents reported that their hospitals were involved in the management of COVID-19; 66.5% felt their roles had been minimized, 41% were asked to assist in non-surgical medical practices, and 37.6% of respondents were included in COVID-19 management. Conclusions: There was significant burnout among trainees. Almost all aspects of clinical and research activities were affected, with a significant reduction in the volume of research, outpatient clinic visits, surgical procedures, on-call hours, and emergency cases, hindering training. Trial registration: The study was registered on clinicaltrials.gov "NCT04433286" on 16/06/2020.

    Global variation in anastomosis and end colostomy formation following left-sided colorectal resection

    Background: End colostomy rates following colorectal resection vary across institutions in high-income settings, being influenced by patient, disease, surgeon and system factors. This study aimed to assess global variation in end colostomy rates after left-sided colorectal resection. Methods: This study comprised an analysis of the GlobalSurg-1 and -2 international, prospective, observational cohort studies (2014, 2016), including consecutive adult patients undergoing elective or emergency left-sided colorectal resection within discrete 2-week windows. Countries were grouped into high-, middle- and low-income tertiles according to the United Nations Human Development Index (HDI). Factors associated with colostomy formation versus primary anastomosis were explored using a multilevel, multivariable logistic regression model. Results: In total, 1635 patients from 242 hospitals in 57 countries undergoing left-sided colorectal resection were included: 113 (6·9 per cent) from low-HDI, 254 (15·5 per cent) from middle-HDI and 1268 (77·6 per cent) from high-HDI countries. There was a higher proportion of patients with perforated disease (57·5, 40·9 and 35·4 per cent; P < 0·001) and subsequent use of end colostomy (52·2, 24·8 and 18·9 per cent; P < 0·001) in low- compared with middle- and high-HDI settings. The association with colostomy use in low-HDI settings persisted (odds ratio (OR) 3·20, 95 per cent c.i. 1·35 to 7·57; P = 0·008) after risk adjustment for malignant disease (OR 2·34, 1·65 to 3·32; P < 0·001), emergency surgery (OR 4·08, 2·73 to 6·10; P < 0·001), time to operation of at least 48 h (OR 1·99, 1·28 to 3·09; P = 0·002) and disease perforation (OR 4·00, 2·81 to 5·69; P < 0·001). Conclusion: Global differences existed in the proportion of patients receiving end stomas after left-sided colorectal resection based on income, which went beyond case mix alone.

    The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance

    INTRODUCTION Investment in Africa over the past year with regard to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing has led to a massive increase in the number of sequences, which, to date, exceeds 100,000 sequences generated to track the pandemic on the continent. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic. RATIONALE We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we also examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs). RESULTS Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing and that this is coupled with a shorter turnaround time from the time of sampling to sequence submission. Ongoing evolution necessitated the continual updating of primer sets, and, as a result, eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection that were each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. 
Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, when considering both Africa versus the rest of the world and viral dissemination within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, such as the limitations of low testing proportions. We also highlight the early warning capacity that genomic surveillance in Africa has had for the rest of the world, with the detection of new lineages and variants, most recently the characterization of various Omicron subvariants. CONCLUSION Sustained investment in diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent but also because it can serve as a platform to help address the many emerging and re-emerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized, because this is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century.

    Reducing the environmental impact of surgery on a global scale: systematic review and co-prioritization with healthcare workers in 132 countries

    Background: Healthcare cannot achieve net-zero carbon without addressing operating theatres. The aim of this study was to prioritize feasible interventions to reduce the environmental impact of operating theatres. Methods: This study adopted a four-phase Delphi consensus co-prioritization methodology. In phase 1, a systematic review of published interventions and a global consultation of perioperative healthcare professionals were used to longlist interventions. In phase 2, iterative thematic analysis consolidated comparable interventions into a shortlist. In phase 3, the shortlist was co-prioritized based on patient and clinician views on acceptability, feasibility, and safety. In phase 4, ranked lists of interventions were presented by their relevance to high-income countries and low–middle-income countries. Results: In phase 1, 43 interventions were identified, which had low uptake in practice according to 3042 professionals globally. In phase 2, a shortlist of 15 intervention domains was generated. In phase 3, interventions were deemed acceptable for more than 90 per cent of patients, except for reducing general anaesthesia (84 per cent) and re-sterilization of 'single-use' consumables (86 per cent). In phase 4, the top three shortlisted interventions for high-income countries were: introducing recycling; reducing use of anaesthetic gases; and appropriate clinical waste processing. The top three shortlisted interventions for low–middle-income countries were: introducing reusable surgical devices; reducing use of consumables; and reducing the use of general anaesthesia. Conclusion: This is a step toward environmentally sustainable operating environments, with actionable interventions applicable to both high-income and low–middle-income countries.

    Control of Gas Emissions (N2O and CO2) Associated with Applied Different Rates of Nitrogen and Their Influences on Growth, Productivity, and Physio-Biochemical Attributes of Green Bean Plants Grown under Different Irrigation Methods

    The use of nitrogenous fertilizers in agriculture can cause uncontrolled gas emissions, such as N2O and CO2, leading to global warming and serious climate change. In this study, we evaluated the greenhouse gas (GHG) emissions that accompany different rates of N fertilization (60%, 70%, 80%, 90%, 100%, 110%, and 120% of the recommended dose) in green beans grown under three irrigation systems (surface, subsurface, and drip irrigation). The results showed that GHG emissions were positively correlated with increasing rates of N fertilization. Meanwhile, the subsurface irrigation system, followed by drip irrigation, achieved the highest significant (p ≤ 0.05) values for growth and pod yield attributes. Furthermore, N supplementation at 90% and/or 100% of the recommended dose under the subsurface irrigation system led to the highest concentrations of chlorophyll, vitamin C, and total protein, and the highest activities of antioxidant enzymes, including catalase (CAT), superoxide dismutase (SOD), and peroxidase (POX). Proline and pod fibre contents decreased in parallel with increasing N rate, while water use efficiency (WUE) improved with increasing N supplementation up to 100% or 110% of the recommended dose.
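Water use efficiency in irrigation studies is commonly defined as crop yield per unit of water applied; a minimal sketch under that assumption (the figures below are hypothetical, not taken from the study):

```python
def water_use_efficiency(yield_kg: float, water_m3: float) -> float:
    """Water use efficiency (WUE): crop yield per unit of irrigation water, in kg/m^3."""
    if water_m3 <= 0:
        raise ValueError("water_m3 must be positive")
    return yield_kg / water_m3

# Hypothetical example: 5000 kg of pods produced with 2500 m^3 of water applied.
print(water_use_efficiency(5000.0, 2500.0))  # → 2.0 kg/m^3
```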