    Proteome Regulation during Olea europaea Fruit Development

    Widespread in the Mediterranean basin, Olea europaea trees are gaining worldwide popularity for the nutritional and cancer-protective properties of the oil, mechanically extracted from ripe fruits. Fruit development is a physiological process with a remarkable impact on the modulation of the biosynthesis of compounds affecting the quality of the drupes as well as the final composition of the olive oil. Proteomics offers the possibility to dig deeper into the major changes during fruit development, including the important phase of ripening, and to classify temporal patterns of protein accumulation occurring during these complex physiological processes. In this work, we started monitoring the proteome variations associated with olive fruit development by using comparative proteomics coupled with mass spectrometry. Proteins extracted from drupes at three different developmental stages were separated by 2-DE and subjected to image analysis, which revealed 247 protein spots as differentially accumulated. Proteins were identified from a total of 121 spots and are discussed in relation to the metabolic changes occurring in the olive drupe during fruit development. In order to evaluate whether changes observed at the protein level were consistent with changes in mRNAs, the proteomic data produced in the present work were compared with transcriptomic data elaborated in previous studies. This study identifies a number of proteins responsible for quality traits of cv. Coratina, with particular regard to proteins associated with the metabolism of fatty acids and of phenolic and aroma compounds. Proteins involved in fruit photosynthesis have also been identified, and their pivotal contribution to oleogenesis is discussed. To date, this study represents the first characterization of the olive fruit proteome during development, providing new insights into fruit metabolism and the oil accumulation process.
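
    The core quantitative step in this workflow (testing each 2-DE spot for differential accumulation across developmental stages) can be illustrated with a short sketch. This is a minimal illustration under stated assumptions: the spot intensities are simulated, and the one-way ANOVA with a 0.05 cut-off is a generic choice, since the abstract does not specify the image-analysis software or statistical thresholds actually used.

```python
# Sketch: flag 2-DE spots whose normalized intensity differs across
# three developmental stages, using a one-way ANOVA per spot.
# Spot counts, replicate numbers and the 0.05 threshold are
# illustrative assumptions, not values from the study.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(0)
n_spots, n_reps = 500, 3  # hypothetical: 500 spots, 3 gels per stage

# Simulated normalized spot volumes for stages S1-S3 (one row per spot)
s1 = rng.lognormal(mean=0.0, sigma=0.3, size=(n_spots, n_reps))
s2 = rng.lognormal(mean=0.1, sigma=0.3, size=(n_spots, n_reps))
s3 = rng.lognormal(mean=0.2, sigma=0.3, size=(n_spots, n_reps))

differential = [
    spot for spot in range(n_spots)
    if f_oneway(s1[spot], s2[spot], s3[spot]).pvalue < 0.05
]
print(f"{len(differential)} spots flagged as differentially accumulated")
```

    In practice a fold-change filter and a multiple-testing correction would usually accompany the raw p-value cut-off.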

    Phenotypic Switching of Nonpeptidergic Cutaneous Sensory Neurons following Peripheral Nerve Injury

    In adult mammals, the phenotype of half of all pain-sensing (nociceptive) sensory neurons is tonically modulated by growth factors in the glial cell line-derived neurotrophic factor (GDNF) family, which includes GDNF, artemin (ARTN) and neurturin (NRTN). Each family member binds a distinct GFRα family co-receptor, such that GDNF, NRTN and ARTN bind GFRα1, -α2 and -α3, respectively. Previous studies revealed transcriptional regulation of all three receptors following axotomy, possibly in response to changes in growth factor availability. Here, we examined changes in the expression of GFRα1-3 in response to injury in vivo and in vitro. We found that after dissociation of adult sensory ganglia, up to 27% of neurons die within 4 days (d) in culture, and that this death can be prevented by nerve growth factor (NGF), GDNF and ARTN, but not NRTN. Moreover, up-regulation of ATF3 (a marker of neuronal injury) in vitro could be prevented by NGF and ARTN, but not by GDNF or NRTN. The lack of NRTN efficacy was correlated with rapid and near-complete loss of GFRα2 immunoreactivity. By retrogradely labeling cutaneous afferents in vivo prior to nerve cut, we demonstrated that GFRα2-positive neurons switch phenotype following injury and begin to express GFRα3 as well as the capsaicin receptor, transient receptor potential vanilloid 1 (TRPV1), an important transducer of noxious stimuli. This switch was correlated with down-regulation of Runt-related transcription factor 1 (Runx1), a transcription factor that controls expression of GFRα2 and TRPV1 during development. These studies show that NRTN-responsive neurons are unique with respect to their plasticity and response to injury, and suggest that Runx1 plays an ongoing modulatory role in the adult.

    Human subcortical brain asymmetries in 15,847 people worldwide reveal effects of age and sex

    The two hemispheres of the human brain differ functionally and structurally. Despite over a century of research, the extent to which brain asymmetry is influenced by sex, handedness, age, and genetic factors is still controversial. Here we present the largest-ever analysis of subcortical brain asymmetries, in a harmonized multi-site study using meta-analysis methods. Volumetric asymmetry of seven subcortical structures was assessed in 15,847 MRI scans from 52 datasets worldwide. There were sex differences in the asymmetry of the globus pallidus and putamen. Heritability estimates, derived from 1170 subjects belonging to 71 extended pedigrees, revealed that additive genetic factors influenced the asymmetry of these two structures and that of the hippocampus and thalamus. Handedness had no detectable effect on subcortical asymmetries, even at this unprecedented sample size, but the asymmetry of the putamen varied with age. Genetic drivers of asymmetry in the hippocampus, thalamus and basal ganglia may affect variability in human cognition, including susceptibility to psychiatric disorders.
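
    As a rough illustration of the kind of harmonized meta-analysis described above, the sketch below computes a per-subject volumetric asymmetry index and pools per-site means with a DerSimonian-Laird random-effects estimator. The AI convention (L - R)/(L + R), the simulated volumes and the site sizes are assumptions for illustration; the study's actual pipeline is not reproduced here.

```python
# Sketch: per-subject asymmetry index plus random-effects pooling of
# per-site means (DerSimonian-Laird). All data are simulated.
import numpy as np

def asymmetry_index(left, right):
    """(L - R)/(L + R): positive values indicate leftward asymmetry."""
    return (left - right) / (left + right)

rng = np.random.default_rng(1)
site_means, site_vars = [], []
for _ in range(52):  # 52 datasets, as in the study
    n = rng.integers(50, 500)
    left = rng.normal(1500.0, 120.0, n)   # hypothetical volumes (mm^3)
    right = rng.normal(1480.0, 120.0, n)
    ai = asymmetry_index(left, right)
    site_means.append(ai.mean())
    site_vars.append(ai.var(ddof=1) / n)  # variance of the site mean

m, v = np.array(site_means), np.array(site_vars)
w = 1.0 / v                                   # fixed-effect weights
q = np.sum(w * (m - np.sum(w * m) / w.sum()) ** 2)
c = w.sum() - np.sum(w**2) / w.sum()
tau2 = max(0.0, (q - (len(m) - 1)) / c)       # between-site heterogeneity
w_re = 1.0 / (v + tau2)                       # random-effects weights
pooled = np.sum(w_re * m) / w_re.sum()
se = np.sqrt(1.0 / w_re.sum())
print(f"pooled AI = {pooled:.4f} (95% CI +/- {1.96 * se:.4f})")
```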

    Salivary Markers for Oral Cancer Detection

    Oral cancer refers to all malignancies that arise in the oral cavity, lips and pharynx, with 90% of all oral cancers being oral squamous cell carcinoma. Despite recent treatment advances, oral cancer has one of the highest mortality ratios among malignancies, which can largely be attributed to late diagnosis of the disease. Saliva has long been tested as a valuable tool for drug monitoring and for the diagnosis of systemic diseases, among them oral cancer. New emerging technologies in molecular biology have enabled the discovery of new molecular markers (DNA, RNA and protein markers) for oral cancer diagnosis and surveillance, which are discussed in the current review.

    Health sector spending and spending on HIV/AIDS, tuberculosis, and malaria, and development assistance for health: progress towards Sustainable Development Goal 3

    Background: Sustainable Development Goal (SDG) 3 aims to “ensure healthy lives and promote well-being for all at all ages”. While a substantial effort has been made to quantify progress towards SDG3, less research has focused on tracking spending towards this goal. We used spending estimates to measure progress in financing the priority areas of SDG3, examine the association between outcomes and financing, and identify where resource gains are most needed to achieve the SDG3 indicators for which data are available. Methods: We estimated domestic health spending, disaggregated by source (government, out-of-pocket, and prepaid private) from 1995 to 2017 for 195 countries and territories. For disease-specific health spending, we estimated spending for HIV/AIDS and tuberculosis for 135 low-income and middle-income countries, and malaria in 106 malaria-endemic countries, from 2000 to 2017. We also estimated development assistance for health (DAH) from 1990 to 2019, by source, disbursing development agency, recipient, and health focus area, including DAH for pandemic preparedness. Finally, we estimated future health spending for 195 countries and territories from 2018 until 2030. We report all spending estimates in inflation-adjusted 2019 US$, unless otherwise stated. Findings: Since the development and implementation of the SDGs in 2015, global health spending has increased, reaching $7·9 trillion (95% uncertainty interval 7·8–8·0) in 2017, and is expected to increase to $11·0 trillion (10·7–11·2) by 2030. In 2017, in low-income and middle-income countries spending on HIV/AIDS was $20·2 billion (17·0–25·0) and on tuberculosis it was $10·9 billion (10·3–11·8), and in malaria-endemic countries spending on malaria was $5·1 billion (4·9–5·4). Development assistance for health was $40·6 billion in 2019, and HIV/AIDS has been the health focus area to receive the highest contribution since 2004. In 2019, $374 million of DAH was provided for pandemic preparedness, less than 1% of DAH. Although spending has increased across HIV/AIDS, tuberculosis, and malaria since 2015, spending has not increased in all countries, and outcomes in terms of prevalence, incidence, and per-capita spending have been mixed. The proportion of health spending from pooled sources is expected to increase from 81·6% (81·6–81·7) in 2015 to 83·1% (82·8–83·3) in 2030. Interpretation: Health spending on SDG3 priority areas has increased, but not in all countries, and progress towards meeting the SDG3 targets has been mixed and has varied by country and by target. The evidence on the scale-up of spending and improvements in health outcomes suggests a nuanced relationship, such that increases in spending do not always result in improvements in outcomes. Although countries will probably need more resources to achieve SDG3, other constraints in the broader health system, such as inefficient allocation of resources across interventions and populations, weak governance systems, human resource shortages, and drug shortages, will also need to be addressed. Funding: The Bill & Melinda Gates Foundation.
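
    The phrase "inflation-adjusted 2019 US$" implies rescaling each nominal amount by a price index. A minimal sketch of that conversion, assuming a hypothetical deflator series (the study's actual deflators and exchange-rate handling are not given in the abstract):

```python
# Sketch: convert nominal spending to constant 2019 US$ with a
# deflator index. The index values below are made-up placeholders.
deflator = {2015: 98.0, 2017: 102.0, 2019: 106.0}  # hypothetical index

def to_2019_dollars(nominal_usd: float, year: int) -> float:
    """Rescale a nominal amount to inflation-adjusted 2019 US$."""
    return nominal_usd * deflator[2019] / deflator[year]

spending_2017_nominal = 7.6e12  # hypothetical nominal figure, US$
print(f"{to_2019_dollars(spending_2017_nominal, 2017) / 1e12:.2f} trillion 2019 US$")
```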

    The impact of surgical delay on resectability of colorectal cancer: An international prospective cohort study

    AIM: The SARS-CoV-2 pandemic has provided a unique opportunity to explore the impact of surgical delays on cancer resectability. This study aimed to compare resectability for colorectal cancer patients undergoing delayed versus non-delayed surgery. METHODS: This was an international prospective cohort study of consecutive colorectal cancer patients with a decision for curative surgery (January-April 2020). Surgical delay was defined as an operation taking place more than 4 weeks after treatment decision, in a patient who did not receive neoadjuvant therapy. A subgroup analysis explored the effects of delay in elective patients only. The impact of longer delays was explored in a sensitivity analysis. The primary outcome was complete resection, defined as curative resection with an R0 margin. RESULTS: Overall, 5453 patients from 304 hospitals in 47 countries were included, of whom 6.6% (358/5453) did not receive their planned operation. Of the 4304 operated patients without neoadjuvant therapy, 40.5% (1744/4304) were delayed beyond 4 weeks. Delayed patients were more likely to be older, male and more comorbid, to have a higher body mass index, and to have rectal cancer and early-stage disease. Delayed patients had higher unadjusted rates of complete resection (93.7% vs. 91.9%, P = 0.032) and lower rates of emergency surgery (4.5% vs. 22.5%, P < 0.001). After adjustment, delay was not associated with a lower rate of complete resection (OR 1.18, 95% CI 0.90-1.55, P = 0.224), a finding that was consistent in elective patients only (OR 0.94, 95% CI 0.69-1.27, P = 0.672). Longer delays were not associated with poorer outcomes. CONCLUSION: One in 15 colorectal cancer patients did not receive their planned operation during the first wave of COVID-19. Surgical delay did not appear to compromise resectability, raising the hypothesis that any reduction in long-term survival attributable to delays is likely to be due to micro-metastatic disease.
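
    The unadjusted comparison of complete-resection rates can be approximately reconstructed from the counts quoted above. The sketch below rounds event counts back from the reported percentages, so it only approximates the study's own 2 x 2 table, and the adjusted OR of 1.18 came from a multivariable model that is not reproduced here.

```python
# Sketch: approximate unadjusted 2x2 comparison of complete resection
# in delayed vs non-delayed patients, using counts implied by the
# abstract (1744 delayed of 4304 operated without neoadjuvant therapy;
# 93.7% vs 91.9% complete resection). Event counts are rounded back
# from percentages, so results are approximate.
from scipy.stats import chi2_contingency

delayed_n = 1744
nondelayed_n = 4304 - 1744
delayed_r0 = round(0.937 * delayed_n)        # ~1634 complete resections
nondelayed_r0 = round(0.919 * nondelayed_n)  # ~2353 complete resections

table = [
    [delayed_r0, delayed_n - delayed_r0],
    [nondelayed_r0, nondelayed_n - nondelayed_r0],
]
chi2, p, _, _ = chi2_contingency(table)
odds_ratio = (table[0][0] * table[1][1]) / (table[0][1] * table[1][0])
print(f"unadjusted OR = {odds_ratio:.2f}, chi-square P = {p:.3f}")
```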

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) compared with that in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
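
    As a sketch of the Methods' "multivariable logistic regression and bootstrapped simulation", the example below fits a logistic model to simulated patients and bootstraps a percentile confidence interval for the checklist odds ratio. The variable names, effect sizes and data are all assumptions; this is not the study's actual model or covariate set.

```python
# Sketch: logistic regression for 30-day mortality with a bootstrapped
# CI for the checklist odds ratio. Data and effects are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 2000
checklist = rng.integers(0, 2, n)            # hypothetical exposure
age = rng.normal(60.0, 15.0, n)              # hypothetical covariate
logit = -3.0 - 0.5 * checklist + 0.03 * (age - 60.0)
death30 = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([checklist, age]))
fit = sm.Logit(death30, X).fit(disp=0)
print(f"OR for checklist use: {np.exp(fit.params[1]):.2f}")

# Bootstrap: resample patients with replacement and refit.
ors = []
for _ in range(200):
    idx = rng.integers(0, n, n)
    coef = sm.Logit(death30[idx], X[idx]).fit(disp=0).params[1]
    ors.append(np.exp(coef))
lo, hi = np.percentile(ors, [2.5, 97.5])
print(f"bootstrap 95% CI: {lo:.2f} to {hi:.2f}")
```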

    Global variation in anastomosis and end colostomy formation following left-sided colorectal resection

    Background End colostomy rates following colorectal resection vary across institutions in high-income settings, being influenced by patient, disease, surgeon and system factors. This study aimed to assess global variation in end colostomy rates after left-sided colorectal resection. Methods This study comprised an analysis of GlobalSurg-1 and -2 international, prospective, observational cohort studies (2014, 2016), including consecutive adult patients undergoing elective or emergency left-sided colorectal resection within discrete 2-week windows. Countries were grouped into high-, middle- and low-income tertiles according to the United Nations Human Development Index (HDI). Factors associated with colostomy formation versus primary anastomosis were explored using a multilevel, multivariable logistic regression model. Results In total, 1635 patients from 242 hospitals in 57 countries undergoing left-sided colorectal resection were included: 113 (6·9 per cent) from low-HDI, 254 (15·5 per cent) from middle-HDI and 1268 (77·6 per cent) from high-HDI countries. There was a higher proportion of patients with perforated disease (57·5, 40·9 and 35·4 per cent; P < 0·001) and subsequent use of end colostomy (52·2, 24·8 and 18·9 per cent; P < 0·001) in low- compared with middle- and high-HDI settings. The association with colostomy use in low-HDI settings persisted (odds ratio (OR) 3·20, 95 per cent c.i. 1·35 to 7·57; P = 0·008) after risk adjustment for malignant disease (OR 2·34, 1·65 to 3·32; P < 0·001), emergency surgery (OR 4·08, 2·73 to 6·10; P < 0·001), time to operation at least 48 h (OR 1·99, 1·28 to 3·09; P = 0·002) and disease perforation (OR 4·00, 2·81 to 5·69; P < 0·001). Conclusion Global differences existed in the proportion of patients receiving end stomas after left-sided colorectal resection based on income, which went beyond case mix alone.