
    Iodine fortification plant screening process and accumulation in tomato fruits and potato Tubers

    Iodine is an essential microelement for human health, with a Recommended Daily Allowance (RDA) ranging from 40 to 200 µg day-1. Because of the low iodine content of vegetables, cereals, and many other foods, Iodine Deficiency Disorder (IDD) is one of the most widespread nutrient deficiency diseases in the world. Investigating iodine uptake in plants with the aim of fortifying them can therefore help reach the important health and social objective of IDD elimination. This study was conducted to determine the effects of iodine absorption from two different chemical forms, potassium iodide (I-) vs. potassium iodate (IO3-), in a wide range of wild and cultivated plant species. Pot plants were irrigated with different concentrations of I- or IO3-, namely 0.05% and 0.1% (w/v) I-, and 0.05%, 0.1%, 0.2% and 0.5% (w/v) IO3-. Adding these amounts of iodine to the irrigation water inhibited plant growth, and plants tolerated high iodine levels in the root environment better as IO3- than as I-. Among cultivated species, barley (Hordeum vulgare L.) showed the lowest, and maize (Zea mays L.) together with tobacco (Nicotiana tabacum L.) the highest, biomass reductions due to iodine toxicity. After the screening, cultivated tomato and potato proved to be good targets for a fortification rate study. When fed with 0.05% iodine salts, potato (Solanum tuberosum L.) tubers and tomato (Solanum lycopersicum L.) fruits accumulated up to 272 and 527 µg/100 g FW of iodine from IO3-, and up to 1,875 and 3,900 µg/100 g FW from I-, respectively. These uptake levels are well above the RDA of 150 µg day-1 for adults. Moreover, the agronomic efficiency of iodine accumulation in potato tubers and tomato fruits was calculated. Both plant organs showed greater accumulation efficiency per unit of iodine supplied as iodide than as iodate. This efficiency decreased in both potato tubers and tomato fruits at iodine concentrations above 0.05% for iodide, and above 0.2% (potato) and 0.1% (tomato) for iodate. On the basis of the uptake curve it was finally possible to calculate, although still to be validated, the iodine doses to supply in the irrigation water as iodate (0.028% for potato and 0.014% for tomato) or as iodide (0.004% for potato and 0.002% for tomato) in order to reach the 150 µg day-1 adult RDA in 100 g of these vegetables and thereby efficiently control IDD.
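
    As a rough illustration of the dose calculation described above, the sketch below back-computes the supply concentrations from the reported 0.05% feeding data under the assumption that accumulation scales linearly with the supplied concentration; this simple scaling happens to reproduce the quoted 0.028%/0.014% (iodate) and 0.004%/0.002% (iodide) doses, but the authors' actual uptake curve may differ. The names and structure are illustrative, not from the study.

```python
# Minimal sketch (not the study's fitted model): back-calculate the iodine
# concentration to supply in irrigation water so that 100 g of produce
# provides the 150 ug adult RDA, assuming accumulation is linear in the
# supplied concentration and anchored at the reported 0.05% feeding data.

RDA_UG = 150.0  # target iodine per 100 g fresh weight (ug)

# Reported accumulation at 0.05% (w/v) supply, in ug/100 g FW
uptake_at_0_05 = {
    ("potato", "iodate"): 272.0,
    ("tomato", "iodate"): 527.0,
    ("potato", "iodide"): 1875.0,
    ("tomato", "iodide"): 3900.0,
}

def required_supply(crop: str, form: str) -> float:
    """Supply concentration (% w/v) needed to reach the RDA in 100 g."""
    slope = uptake_at_0_05[(crop, form)] / 0.05  # ug/100 g FW per % supplied
    return RDA_UG / slope

for crop in ("potato", "tomato"):
    for form in ("iodate", "iodide"):
        print(f"{crop:6s} {form:6s}: {required_supply(crop, form):.3f}% (w/v)")
# potato iodate: 0.028%, tomato iodate: 0.014%,
# potato iodide: 0.004%, tomato iodide: 0.002%
```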

    Determination of Pesticide Residues in IV Range Artichoke (Cynara cardunculus L.) and Its Industrial Wastes

    Fourth-range products are fresh fruits and vegetables that are ready for consumption, raw or after cooking, and come from organic or integrated cultivation. These products undergo mild post-harvest processing (selection, sorting, husking, cutting, and washing) and are then packaged in bags or sealed food trays, with an average shelf life of 5-10 days. Artichokes are stripped of the leaves, stems and outer bracts, and the remaining heads are washed with acidifying solutions. An LC-MS/MS analytical method was developed and validated following the SANTE guidelines for the detection of 220 pesticides. This work evaluated the distribution of pesticide residues among the artichoke fractions obtained during industrial processing, and the residues in the waste left on the field were also investigated. The results showed quantifiable residues of one herbicide (pendimethalin) and four fungicides (azoxystrobin, propyzamide, tebuconazole, and pyraclostrobin). Pendimethalin was found in all samples, with the highest values in leaves (0.046 ± 8.2 mg/kg) and in field waste (0.30 ± 6.7 mg/kg). Azoxystrobin was most concentrated in the outer bracts (0.18 ± 2.9 mg/kg). The outer bracts showed the highest number of residues. The industrial waste showed a significant decrease in both the number of residues and their concentration.

    Is the Clinical Risk Score for Patients with Colorectal Liver Metastases Still Useable in the Era of Effective Neoadjuvant Chemotherapy?

    Background: Several clinical risk scores (CRSs) for the outcome of patients with colorectal liver metastases have been validated, but not in patients undergoing neoadjuvant chemotherapy. This study therefore evaluates the predictive value of these CRSs in this specific group. Methods: Between January 2000 and December 2008, all patients undergoing metastasectomy were analyzed and divided into two groups: 193 patients did not receive neoadjuvant chemotherapy (group A), and 159 patients received neoadjuvant chemotherapy (group B). In group B, the CRSs were calculated both before and after administration of neoadjuvant chemotherapy. Results were evaluated using the CRSs proposed by Nordlinger et al., Fong et al., Nagashima et al., and Konopke et al. Results: In groups A and B, the overall median survival was 43 and 47 months, respectively (P = 0.648). In group A, all CRSs used were of statistically significant predictive value. Before administration of neoadjuvant chemotherapy, only the Nordlinger score was of predictive value. After administration of neoadjuvant chemotherapy, all CRSs were of predictive value again, except for the Konopke score. Conclusions: Traditional CRSs are not a reliable prognostic tool when applied before treatment with neoadjuvant chemotherapy. However, CRSs assessed after the administration of neoadjuvant chemotherapy are useful for predicting prognosis.
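
    To make the notion of a clinical risk score concrete, here is a minimal sketch of one of the CRSs mentioned above, the Fong score, using its commonly quoted criteria (one point each for a node-positive primary, a disease-free interval under 12 months, more than one hepatic metastasis, a largest metastasis over 5 cm, and preoperative CEA above 200 ng/mL). The thresholds and field names are assumptions to be checked against Fong et al., not definitions taken from this study.

```python
# Minimal sketch of one widely used CRS (the Fong score), with its commonly
# quoted criteria; thresholds are illustrative and should be checked against
# Fong et al. (1999).

from dataclasses import dataclass

@dataclass
class Patient:
    node_positive_primary: bool
    disease_free_interval_months: float
    number_of_liver_metastases: int
    largest_metastasis_cm: float
    preoperative_cea_ng_per_ml: float

def fong_score(p: Patient) -> int:
    """Clinical risk score from 0 to 5; higher scores imply worse prognosis."""
    return sum([
        p.node_positive_primary,
        p.disease_free_interval_months < 12,
        p.number_of_liver_metastases > 1,
        p.largest_metastasis_cm > 5,
        p.preoperative_cea_ng_per_ml > 200,
    ])

# Example: node-positive primary, relapse at 8 months, 3 metastases,
# largest 4.2 cm, CEA 350 ng/mL -> score 4
print(fong_score(Patient(True, 8, 3, 4.2, 350)))
```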

    Defining patient outcomes in stage IV colorectal cancer: a prospective study with baseline stratification according to disease resectability status

    BACKGROUND: Stage IV colorectal cancer encompasses a broad patient population in which both curative and palliative management strategies may be used. In a phase II study primarily designed to assess the efficacy of capecitabine and oxaliplatin, we were able to prospectively examine the outcomes of patients with stage IV colorectal cancer according to baseline resectability status. METHODS: At enrolment, patients were stratified into three subgroups according to the resectability of liver disease and treatment intent: palliative chemotherapy (subgroup A), conversion therapy (subgroup B) or neoadjuvant therapy (subgroup C). All patients received chemotherapy with capecitabine 2000 mg/m² on days 1-14 and oxaliplatin 130 mg/m² on day 1, repeated every 3 weeks. Imaging was repeated every four cycles; where feasible, liver resection was undertaken after four or eight cycles of chemotherapy. RESULTS: Of 128 enrolled patients, 74, 22 and 32 were stratified into subgroups A, B and C, respectively. An attempt at curative liver resection was undertaken in 10 patients (45%) in subgroup B and 19 (59%) in subgroup C. The median overall survival was 14.6, 24.5 and 52.9 months in subgroups A, B and C, respectively. For patients in subgroups B and C who underwent an attempt at curative resection, 3-year progression-free survival was 10% in subgroup B and 37% in subgroup C. CONCLUSIONS: This prospective study shows the wide variation in outcome according to baseline resectability status and highlights the potential clinical value of a modified staging system to distinguish between these patient subgroups. British Journal of Cancer (2010) 102, 255-261; doi:10.1038/sj.bjc.6605508.
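
    As a side note on the regimen quoted above, the sketch below converts the per-m² doses into absolute doses for a given patient using the Mosteller body-surface-area formula; the choice of BSA formula and the reading of 2000 mg/m² as a total daily capecitabine dose on days 1-14 are assumptions, not details stated in the abstract.

```python
# Illustrative dose arithmetic for the capecitabine/oxaliplatin regimen above.
# Assumptions (not stated in the abstract): Mosteller BSA formula, and
# capecitabine 2000 mg/m2 read as the total daily dose on days 1-14.

from math import sqrt

def bsa_mosteller(height_cm: float, weight_kg: float) -> float:
    """Body surface area in m^2 (Mosteller formula)."""
    return sqrt(height_cm * weight_kg / 3600.0)

def capox_cycle_doses(height_cm: float, weight_kg: float) -> dict:
    """Absolute doses for one 3-week cycle, given patient height and weight."""
    bsa = bsa_mosteller(height_cm, weight_kg)
    return {
        "bsa_m2": round(bsa, 2),
        "capecitabine_mg_per_day_d1_14": round(2000 * bsa),
        "oxaliplatin_mg_day1": round(130 * bsa),
    }

print(capox_cycle_doses(175, 80))
# e.g. {'bsa_m2': 1.97, 'capecitabine_mg_per_day_d1_14': 3944, 'oxaliplatin_mg_day1': 256}
```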

    The CC-NB-LRR-Type Rdg2a Resistance Gene Confers Immunity to the Seed-Borne Barley Leaf Stripe Pathogen in the Absence of Hypersensitive Cell Death

    BACKGROUND: Leaf stripe disease on barley (Hordeum vulgare) is caused by the seed-transmitted hemi-biotrophic fungus Pyrenophora graminea. Race-specific resistance to leaf stripe is controlled by two known Rdg (Resistance to Drechslera graminea) genes: the H. spontaneum-derived Rdg1a, and Rdg2a, identified in H. vulgare. The aim of the present work was to isolate the Rdg2a leaf stripe resistance gene, to characterize the organization and evolution of the Rdg2a locus, and to elucidate the histological basis of Rdg2a-based leaf stripe resistance. PRINCIPAL FINDINGS: We describe here the positional cloning and functional characterization of the leaf stripe resistance gene Rdg2a. At the Rdg2a locus, three sequence-related genes encoding coiled-coil, nucleotide-binding site, leucine-rich repeat (CC-NB-LRR) proteins were identified. Sequence comparisons suggested that paralogs at this resistance locus evolved through recent gene duplication and were subjected to frequent sequence exchange. Transformation of the leaf stripe-susceptible cv. Golden Promise with two Rdg2a candidates under the control of their native 5′ regulatory sequences identified a member of the CC-NB-LRR gene family that conferred resistance against the Dg2 leaf stripe isolate, against which the Rdg2a gene is effective. Histological analysis demonstrated that Rdg2a-mediated leaf stripe resistance involves autofluorescing cells and prevents pathogen colonization of the embryos without any detectable hypersensitive cell death response, supporting a cell wall reinforcement-based resistance mechanism. CONCLUSIONS: This work reports the cloning of a resistance gene effective against a seed-borne disease. We observed that Rdg2a was subjected to diversifying selection, which is consistent with a model in which the R gene co-evolves with a pathogen effector gene(s). We propose that inducible responses giving rise to physical and chemical barriers to infection in the cell walls and intercellular spaces of the barley embryo tissues represent the mechanisms by which the CC-NB-LRR-encoding Rdg2a gene mediates resistance to leaf stripe in the absence of hypersensitive cell death.

    A multi-center study of COVID-19 patient prognosis using deep learning-based CT image analysis and electronic health records

    Purpose: As of August 30th, 2020, there were in total 25.1 million confirmed cases and 845 thousand deaths caused by coronavirus disease 2019 (COVID-19) worldwide. With overwhelming demands on medical resources, patient stratification based on risk is essential. In this multi-center study, we built prognosis models to predict severity outcomes, combining patients' electronic health records (EHR), which included vital signs and laboratory data, with deep learning-based CT severity prediction. Method: We first developed a CT segmentation network using datasets from multiple institutions worldwide. Two biomarkers were extracted from the CT images: the total opacity ratio (TOR) and the consolidation ratio (CR). After obtaining TOR and CR, further prognosis analysis was conducted on datasets from INSTITUTE-1, INSTITUTE-2 and INSTITUTE-3. For each data cohort, a generalized linear model (GLM) was applied for prognosis prediction. Results: For the deep learning model, the correlation coefficient between the network prediction and manual segmentation was 0.755, 0.919, and 0.824 for the three cohorts, respectively. The AUC (95% CI) of the final prognosis models was 0.85 (0.77-0.92), 0.93 (0.87-0.98), and 0.86 (0.75-0.94) for the INSTITUTE-1, INSTITUTE-2 and INSTITUTE-3 cohorts, respectively. Either TOR or CR was present in all three final prognosis models. Age, white blood cell count (WBC), and platelet count (PLT) were selected as predictors in two cohorts. Oxygen saturation (SpO2) was selected as a predictor in one cohort. Conclusion: The developed deep learning method can segment lung infection regions. The prognosis results indicated that age, SpO2, the CT biomarkers, PLT, and WBC were the most important prognostic predictors of COVID-19 in our prognosis model.
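
    The following sketch illustrates the kind of pipeline described above: two CT biomarkers computed as volume ratios over a segmented lung, fed together with EHR variables into a logistic-regression GLM. The exact definitions of TOR and CR, the feature set, and the random placeholder data are assumptions for illustration, not the study's implementation.

```python
# Illustrative sketch of the prognosis pipeline: CT biomarkers computed from
# segmentation masks, then a logistic-regression GLM on biomarkers plus EHR
# variables. TOR/CR definitions, features, and the random data are placeholders.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

def ct_biomarkers(lung_mask: np.ndarray,
                  opacity_mask: np.ndarray,
                  consolidation_mask: np.ndarray) -> tuple:
    """Total opacity ratio (TOR) and consolidation ratio (CR) as fractions
    of the segmented lung volume (assumed definitions)."""
    lung_voxels = lung_mask.sum()
    return opacity_mask.sum() / lung_voxels, consolidation_mask.sum() / lung_voxels

# One row per patient: [TOR, CR, age, WBC, PLT, SpO2]; label = severe outcome
rng = np.random.default_rng(0)
X = rng.random((200, 6))           # placeholder feature matrix
y = rng.integers(0, 2, size=200)   # placeholder outcome labels

glm = LogisticRegression(max_iter=1000).fit(X, y)
print("in-sample AUC:", roc_auc_score(y, glm.predict_proba(X)[:, 1]))
```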

    The association between human endogenous retroviruses and multiple sclerosis: a systematic review and meta-analysis

    Background: The interaction between genetic and environmental factors is crucial to multiple sclerosis (MS) pathogenesis. Human endogenous retroviruses (HERVs) are endogenous viral elements of the human genome whose expression is associated with MS. Objective: To perform a systematic review and meta-analysis assessing the qualitative and quantitative evidence on the expression of HERV families in MS patients. Methods: Medline, Embase and the Cochrane Library were searched for published studies on the association of HERVs and MS. Meta-analysis was performed on the HERV-W family. Odds ratios (OR) and 95% confidence intervals (CI) were calculated for the associations. Results: 43 reports were extracted (25 related to the HERV-W family, 13 to HERV-H, 9 to HERV-K, 5 to HRES-1 and 1 to HER-15). The analysis showed an association between the expression of all HERV families and MS. For HERV-W, adequate data were available for meta-analysis. The meta-analyses of HERV-W yielded OR = 22.66 (95% CI 6.32 to 81.20) from 4 studies investigating MSRV/HERV-W (MS-associated retrovirus) envelope mRNA in peripheral blood mononuclear cells, OR = 44.11 (95% CI 12.95 to 150.30) from 6 studies of MSRV/HERV-W polymerase mRNA in serum/plasma, and OR = 6.00 (95% CI 3.35 to 10.74) from 4 studies of MSRV/HERV-W polymerase mRNA in CSF.
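
    For reference, the sketch below shows how an odds ratio with a 95% confidence interval is obtained from a single 2x2 table of HERV-W positivity in MS cases versus controls, using the standard log-OR normal approximation; the counts are illustrative, and the meta-analysis itself would pool such study-level estimates (e.g. by inverse-variance weighting).

```python
# Odds ratio with 95% CI from a single 2x2 table (cases vs. controls,
# HERV-W positive vs. negative), via the standard log-OR approximation.
# Counts are illustrative only.

from math import exp, log, sqrt

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """a/b = positive/negative cases, c/d = positive/negative controls."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # standard error of log(OR)
    return or_, exp(log(or_) - z * se), exp(log(or_) + z * se)

print(odds_ratio_ci(30, 10, 5, 35))  # OR = 21.0, 95% CI ~ (6.5, 68.3)
```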