
    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) than in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than in emergency laparotomy in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.

    Cell-type–specific eQTL of primary melanocytes facilitates identification of melanoma susceptibility genes

    Most expression quantitative trait locus (eQTL) studies to date have been performed in heterogeneous tissues as opposed to specific cell types. To better understand the cell-type–specific regulatory landscape of human melanocytes, which give rise to melanoma but account for <5% of typical human skin biopsies, we performed an eQTL analysis in primary melanocyte cultures from 106 newborn males. We identified 597,335 cis-eQTL SNPs prior to linkage disequilibrium (LD) pruning and 4997 eGenes (FDR < 0.05). Melanocyte eQTLs differed considerably from those identified in the 44 GTEx tissue types, including skin. Over a third of melanocyte eGenes, including key genes in melanin synthesis pathways, were unique to melanocytes compared to those of GTEx skin tissues or TCGA melanomas. The melanocyte data set also identified trans-eQTLs, including those connecting a pigmentation-associated functional SNP with four genes, likely through cis-regulation of IRF4. Melanocyte eQTLs are enriched in cis-regulatory signatures found in melanocytes as well as in melanoma-associated variants identified through genome-wide association studies. Melanocyte eQTLs also colocalized with melanoma GWAS variants in five known loci. Finally, a transcriptome-wide association study using melanocyte eQTLs uncovered four novel susceptibility loci, where imputed expression levels of five genes (ZFP90, HEBP1, MSC, CBWD1, and RP11-383H13.1) were associated with melanoma at genome-wide significant P-values. Our data highlight the utility of lineage-specific eQTL resources for annotating GWAS findings, and present a robust database for genomic research of melanoma risk and melanocyte biology

    Global variation in anastomosis and end colostomy formation following left-sided colorectal resection

    Background End colostomy rates following colorectal resection vary across institutions in high-income settings, being influenced by patient, disease, surgeon and system factors. This study aimed to assess global variation in end colostomy rates after left-sided colorectal resection. Methods This study comprised an analysis of GlobalSurg-1 and -2 international, prospective, observational cohort studies (2014, 2016), including consecutive adult patients undergoing elective or emergency left-sided colorectal resection within discrete 2-week windows. Countries were grouped into high-, middle- and low-income tertiles according to the United Nations Human Development Index (HDI). Factors associated with colostomy formation versus primary anastomosis were explored using a multilevel, multivariable logistic regression model. Results In total, 1635 patients from 242 hospitals in 57 countries undergoing left-sided colorectal resection were included: 113 (6·9 per cent) from low-HDI, 254 (15·5 per cent) from middle-HDI and 1268 (77·6 per cent) from high-HDI countries. There was a higher proportion of patients with perforated disease (57·5, 40·9 and 35·4 per cent; P < 0·001) and subsequent use of end colostomy (52·2, 24·8 and 18·9 per cent; P < 0·001) in low- compared with middle- and high-HDI settings. The association with colostomy use in low-HDI settings persisted (odds ratio (OR) 3·20, 95 per cent c.i. 1·35 to 7·57; P = 0·008) after risk adjustment for malignant disease (OR 2·34, 1·65 to 3·32; P < 0·001), emergency surgery (OR 4·08, 2·73 to 6·10; P < 0·001), time to operation at least 48 h (OR 1·99, 1·28 to 3·09; P = 0·002) and disease perforation (OR 4·00, 2·81 to 5·69; P < 0·001). Conclusion Global differences existed in the proportion of patients receiving end stomas after left-sided colorectal resection based on income, which went beyond case mix alone

    Rhizome, root/sediment interactions, aerenchyma and internal pressure changes in seagrasses

    Life in seawater presents several challenges for seagrasses owing to low O2 and CO2 solubility and slow gas diffusion rates. Seagrasses have evolved numerous adaptations to these environmental conditions, including porous tissue providing low-resistance internal gas channels (aerenchyma) and carbon-concentrating mechanisms involving the enzyme carbonic anhydrase. Moreover, seagrasses grow in reduced, anoxic sediments, and aerobic metabolism in roots and rhizomes therefore has to be sustained via rapid O2 transport through the aerenchyma. Tissue aeration is driven by internal concentration gradients between leaves and belowground tissues, where the leaves are the source of O2 and the rhizomes and roots function as O2 sinks. Inadequate internal aeration, e.g. due to low O2 availability in the surrounding water during night-time, can lead to sulphide intrusion into roots and rhizomes, which has been linked to enhanced seagrass mortality. Under favourable conditions, however, seagrasses leak O2 and dissolved organic carbon into the rhizosphere, where they maintain oxic microzones that protect the plant against reduced phytotoxic compounds and generate dynamic chemical microgradients that modulate the rhizosphere microenvironment. Local radial O2 loss from belowground tissues of seagrasses leads to sulphide oxidation in the rhizosphere, which generates protons and results in local acidification. Such low-pH microniches can lead to dissolution of carbonates and protolytic phosphorus solubilisation in carbonate-rich sediments. The seagrass rhizosphere is also characterised by numerous high-pH microniches, indicative of local stimulation of proton-consuming microbial processes such as sulphate reduction via root/rhizome exudates and/or release of alkaline substances. High sediment pH shifts the sulphide speciation away from H2S towards non-tissue-penetrating HS- ions, which can alleviate belowground tissue exposure to phytotoxic H2S. High sulphide production can also lead to iron and phosphorus mobilisation through sulphide-induced reduction of insoluble Fe(III) oxyhydroxides to dissolved Fe(II), with concomitant phosphorus release to the porewater. Adequate internal tissue aeration is thus of vital importance for seagrasses, as it ensures aerobic metabolism in distal parts of the roots and provides protection against intrusion of phytotoxins from the surrounding sediment.

    Evaluation of Recipients of Positive and Negative Secondary Findings Evaluations in a Hybrid CLIA-Research Sequencing Pilot

    While consensus regarding the return of secondary genomic findings in the clinical setting has been reached, debate about such findings in the research setting remains. We developed a hybrid, research-clinical translational genomics process for research exome data coupled with a CLIA-validated secondary findings analysis. Eleven intramural investigators from ten institutes at the National Institutes of Health piloted this process. Nearly 1,200 individuals were sequenced and 14 secondary findings were identified in 18 participants. Positive secondary findings were returned by a genetic counselor following a standardized protocol, including referrals for specialty follow-up care for the secondary finding local to the participants. Interviews were undertaken with 13 participants 4 months after receipt of a positive report. These participants reported minimal psychological distress as they assimilated their results. Of the 13, 9 reported accessing the recommended health care services. A sample of 107 participants who received a negative findings report were surveyed 4 months after receiving it. They demonstrated good understanding of the negative secondary findings result, and most (64%) expressed reassurance from that report. However, a notable minority (up to 17%) expressed confusion regarding the distinction between primary and secondary findings. This pilot shows it is feasible to couple CLIA-compliant secondary findings analysis to research sequencing with minimal harms. Participants managed the surprise of a secondary finding, with most following the recommended follow-up, yet some with negative findings conflated secondary and primary findings. Additional work is needed to understand barriers to follow-up care and to help participants distinguish secondary from primary findings.

    Breeding Progress and Future Challenges: Abiotic Stresses

    Mungbean is a short-season tropical grain legume grown on some six million hectares each year. Though predominantly a crop of smallholder farmers and subsistence agriculture, mungbean is increasingly seen as a high-value crop for international markets, with broad-acre production under modern farming systems established in Australia, South America, West Asia and Africa. Key benefits of mungbean are its nutritional and monetary value, the short-duration, flexible disease break it provides when fitted into intensive wheat, rice and summer cereal rotations, and its self-sufficiency for nitrogen. The short growing season of 55–100 days places a ceiling on productivity, which is further reduced by the traditional low-input farming systems in which mungbean is most frequently produced; global yields average 0.5 tonnes per hectare, though 3 tonnes per hectare is considered achievable under favourable conditions. Increased reliability of mungbean in subsistence systems has been achieved by developing shorter-duration, more determinate ideotypes and by manipulating sowing time. This strategy of reducing exposure to risk, rather than identifying and breeding for inherent resilience, was very successful in transforming mungbean. The major abiotic stresses of mungbean presented here are drought, heat, waterlogging, low temperatures and salinity. Sources of tolerance to all of these stresses have been identified in the germplasm collections of cultivated mungbean as well as its wild relatives. Future research efforts must combine known sources of genetic variation with investigation of the underlying biochemical and physiological processes in order to understand and breed for tolerance to abiotic stress in mungbean.