
    Transcriptional Regulation of Quinoa Seed Quality: Identification of Novel Candidate Genetic Markers for Increased Protein Content

    Quinoa (Chenopodium quinoa Willd.) is a crop with great potential for increased cultivation in diverse climate regions. Its seed protein quality is high with respect to human nutritional requirements, but its seed protein content is relatively low compared to crops such as grain legumes. Higher seed protein content is desirable to improve the economic viability of quinoa as a protein crop. In this study, we characterized three quinoa genotypes with different levels of seed protein content. By performing RNA sequencing of developing seeds, we determined genotype differences in gene expression and identified genetic polymorphisms that could be associated with increased protein content. Storage nutrient analyses of seeds of the three genotypes (Titicaca, Pasankalla, and Regalona), which originate from different ecoregions and were grown under controlled climate conditions, showed that Pasankalla had the highest protein content (20%) and the lowest starch content (46%). Our seed transcriptome analyses revealed highly differentially expressed transcripts (DETs) in Pasankalla compared to the other genotypes. These DETs encoded functions in sugar transport, starch and protein synthesis, and the regulation of embryo size, as well as seed transcription factors. We selected 60 genes encoding functions in central carbon metabolism and transcription factors as potential targets for the development of high-precision markers. Genetic polymorphisms, such as single nucleotide polymorphisms (SNPs) and base insertions and deletions (InDels), were found in 19 of the 60 selected genes; these can be further evaluated for the development of genetic markers for high seed protein content in quinoa. Increased cultivation of quinoa can contribute to a more diversified agriculture and support the shift towards plant-protein diets. The identification of quinoa genotypes with contrasting seed quality can help establish a model system for identifying precise breeding targets to improve quinoa seed quality. The nutrient and transcriptome data presented here contribute to an improved understanding of the genetic regulation of seed quality traits in quinoa and suggest high-precision candidate markers for these traits.
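
    The DET selection described above usually comes down to thresholding a differential-expression results table. The sketch below is a minimal illustration of that step, not the authors' pipeline; the column names, cutoffs, and the call_dets helper are assumptions chosen only for the example.

        # Minimal sketch: thresholding an RNA-seq differential-expression table to
        # call differentially expressed transcripts (DETs). Column names and cutoffs
        # are illustrative assumptions, not taken from the study.
        import pandas as pd

        def call_dets(results: pd.DataFrame,
                      lfc_cutoff: float = 1.0,
                      padj_cutoff: float = 0.05) -> pd.DataFrame:
            """Keep transcripts with |log2 fold change| >= lfc_cutoff and adjusted p < padj_cutoff."""
            keep = (results["log2FoldChange"].abs() >= lfc_cutoff) & (results["padj"] < padj_cutoff)
            return results[keep]

        # Example with made-up values for three transcripts
        demo = pd.DataFrame({
            "transcript": ["sugar_transporter_X", "starch_synthase_Y", "seed_TF_Z"],
            "log2FoldChange": [2.3, -1.8, 0.4],
            "padj": [0.001, 0.02, 0.40],
        })
        print(call_dets(demo))  # retains the first two transcripts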

    B cell depletion in autoimmune diabetes: insights from murine models

    INTRODUCTION: The incidence of type 1 diabetes (T1D) is rising for reasons that largely elude us. New strategies aimed at halting the disease process are needed. One type of immune cell thought to contribute to T1D is the B lymphocyte. The first Phase II trial of B cell depletion in new-onset T1D patients indicated that this slowed the destruction of insulin-producing pancreatic beta cells. The mechanistic basis of the beneficial effects remains unclear. AREAS COVERED: Studies of B cell depletion and deficiency in animal models of T1D. How B cells can influence T cell-dependent autoimmune diabetes in animal models. The heterogeneity of B cell populations and current evidence for the potential contribution of specific B cell subsets to diabetes, with emphasis on marginal zone B cells and B1 B cells. EXPERT OPINION: B cells can influence the T cell response to islet antigens, and B cell depletion or genetic deficiency is associated with decreased insulitis in animal models. New evidence suggests that B1 cells may contribute to diabetes pathogenesis. A better understanding of the roles of individual B cell subsets in disease will permit fine-tuning of therapeutic strategies to modify these populations.

    Elective Cancer Surgery in COVID-19-Free Surgical Pathways During the SARS-CoV-2 Pandemic: An International, Multicenter, Comparative Cohort Study.

    PURPOSE: As cancer surgery restarts after the first COVID-19 wave, health care providers urgently require data to determine where elective surgery is best performed. This study aimed to determine whether COVID-19-free surgical pathways were associated with lower postoperative pulmonary complication rates compared with hospitals with no defined pathway. PATIENTS AND METHODS: This international, multicenter cohort study included patients who underwent elective surgery for 10 solid cancer types without preoperative suspicion of SARS-CoV-2. Participating hospitals included patients from local emergence of SARS-CoV-2 until April 19, 2020. At the time of surgery, hospitals were defined as having a COVID-19-free surgical pathway (complete segregation of the operating theater, critical care, and inpatient ward areas) or no defined pathway (incomplete or no segregation, areas shared with patients with COVID-19). The primary outcome was 30-day postoperative pulmonary complications (pneumonia, acute respiratory distress syndrome, unexpected ventilation). RESULTS: Of 9,171 patients from 447 hospitals in 55 countries, 2,481 were operated on in COVID-19-free surgical pathways. Patients who underwent surgery within COVID-19-free surgical pathways were younger with fewer comorbidities than those in hospitals with no defined pathway but with similar proportions of major surgery. After adjustment, pulmonary complication rates were lower with COVID-19-free surgical pathways (2.2% v 4.9%; adjusted odds ratio [aOR], 0.62; 95% CI, 0.44 to 0.86). This was consistent in sensitivity analyses for low-risk patients (American Society of Anesthesiologists grade 1/2), propensity score-matched models, and patients with negative SARS-CoV-2 preoperative tests. The postoperative SARS-CoV-2 infection rate was also lower in COVID-19-free surgical pathways (2.1% v 3.6%; aOR, 0.53; 95% CI, 0.36 to 0.76). CONCLUSION: Within available resources, dedicated COVID-19-free surgical pathways should be established to provide safe elective cancer surgery during current and before future SARS-CoV-2 outbreaks.
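
    The adjusted odds ratios reported above come from multivariable logistic regression. As a hedged illustration only (simulated data and assumed covariates, not the study's actual model or code), the sketch below shows how an aOR with a 95% CI for a binary exposure is typically extracted from such a model.

        # Illustrative sketch with simulated data: estimating an adjusted odds ratio
        # (aOR) for a binary exposure from a multivariable logistic regression.
        # Variable names and confounders are assumptions for the example.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 2000
        df = pd.DataFrame({
            "covid_free_pathway": rng.integers(0, 2, n),   # exposure of interest
            "age": rng.normal(65, 10, n),                  # example confounder
            "major_surgery": rng.integers(0, 2, n),        # example confounder
        })
        # Simulate a pulmonary-complication outcome with lower odds in the exposed group
        logit = (-3.5 + 0.03 * (df["age"] - 65) + 0.4 * df["major_surgery"]
                 - 0.5 * df["covid_free_pathway"])
        df["pulmonary_complication"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

        model = smf.logit(
            "pulmonary_complication ~ covid_free_pathway + age + major_surgery", data=df
        ).fit(disp=0)
        aor = np.exp(model.params["covid_free_pathway"])
        ci_low, ci_high = np.exp(model.conf_int().loc["covid_free_pathway"])
        print(f"aOR {aor:.2f} (95% CI {ci_low:.2f} to {ci_high:.2f})")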

    Evolution of the use of corticosteroids for the treatment of hospitalised COVID-19 patients in Spain between March and November 2020: SEMI-COVID national registry

    Objectives: Since the results of the RECOVERY trial, WHO recommendations on the use of corticosteroids (CTs) in COVID-19 have changed. The aim of this study was to analyse the evolving use of CTs in Spain during the pandemic and to assess the potential influence of the new recommendations. Material and methods: A retrospective, descriptive, observational study was conducted on adults hospitalised due to COVID-19 in Spain who were included in the SEMI-COVID-19 Registry from March to November 2020. Results: CTs were used in 6053 (36.21%) of the included patients. These patients were older (mean (SD): 69.6 (14.6) vs. 66.0 (16.8) years; p < 0.001) and had a higher prevalence of hypertension (57.0% vs. 47.7%; p < 0.001), obesity (26.4% vs. 19.3%; p < 0.0001), and multimorbidity (20.6% vs. 16.1%; p < 0.001). They also had higher values (mean (95% CI)) of C-reactive protein (CRP) (86 (32.7-160) vs. 49.3 (16-109) mg/dL; p < 0.001), ferritin (791 (393-1534) vs. 470 (236-996) µg/dL; p < 0.001), and D-dimer (750 (430-1400) vs. 617 (345-1180) µg/dL; p < 0.001), and a lower SpO2/FiO2 ratio (266 (91.1) vs. 301 (101); p < 0.001). From June 2020 onwards, there was an increase in the use of CTs (March vs. September; p < 0.001). Overall, 20% of patients did not receive steroids, and 40% received less than 200 mg accumulated prednisone equivalent dose (APED). More severe patients were treated with higher doses. A mortality benefit was observed in patients with oxygen saturation ≤90%. Conclusions: Patients with greater comorbidity, greater severity, and higher inflammatory markers were those treated with CTs. In severe patients, there was a trend towards the use of higher doses. A mortality benefit was observed in patients with oxygen saturation ≤90%.
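
    For readers unfamiliar with the APED metric used above, the short sketch below shows one way to accumulate a prednisone-equivalent dose across a corticosteroid course, using the standard anti-inflammatory equivalences. It is an illustration only; the helper function and the example course are assumptions, not code or data from the registry.

        # Illustrative sketch: accumulating a prednisone-equivalent dose (APED) across
        # a corticosteroid course. Conversion factors follow the usual anti-inflammatory
        # equivalences (prednisone 5 mg ≈ methylprednisolone 4 mg ≈ dexamethasone 0.75 mg
        # ≈ hydrocortisone 20 mg); the function and example course are assumptions.
        PREDNISONE_EQUIVALENT = {
            "prednisone": 1.0,
            "methylprednisolone": 5 / 4,   # 1.25
            "dexamethasone": 5 / 0.75,     # ~6.67
            "hydrocortisone": 5 / 20,      # 0.25
        }

        def accumulated_prednisone_equivalent(doses):
            """doses: iterable of (drug, mg_per_day, days) tuples -> total APED in mg."""
            return sum(PREDNISONE_EQUIVALENT[drug] * mg_per_day * days
                       for drug, mg_per_day, days in doses)

        # Example: 6 mg dexamethasone daily for 10 days (a RECOVERY-style course)
        print(accumulated_prednisone_equivalent([("dexamethasone", 6, 10)]))  # ≈ 400 mg APED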

    Meat and Nicotinamide: A Causal Role in Human Evolution, History, and Demographics

    Hunting for meat was a critical step in all animal and human evolution. A key brain-trophic element in meat is vitamin B3/nicotinamide. The supply of meat and nicotinamide steadily increased from the Cambrian origin of animal predators, ratcheting up ever larger brains; this culminated in the 3-million-year evolution of Homo sapiens and our overall demographic success. We view human evolution, recent history, and agricultural and demographic transitions in the light of meat and nicotinamide intake. A biochemical and immunological switch is highlighted that affects fertility in the ‘de novo’ tryptophan-to-kynurenine-nicotinamide ‘immune tolerance’ pathway. Longevity relates to nicotinamide adenine dinucleotide consumer pathways. High meat intake correlates with moderate fertility, high intelligence, good health, and longevity, with consequent population stability, whereas low meat/high cereal intake (short of starvation) correlates with high fertility, disease, and population booms and busts. If meat intake is too high, fertility falls below replacement levels. Reducing variance in meat consumption might help stabilise population growth and improve human capital.

    Advances in the Household Archaeology of Highland Mesoamerica

    The tempo of the Iberian megalithic rituals in the European context: The cemetery of Panoría

    Our ability to build precise narratives regarding megalithic societies largely depends on the chronology of the multi-ritual events that usually shaped these complex sites. The cemetery of Panoría offers an excellent opportunity for exploring ritual complexity in Iberia through radiocarbon chronology, as four of the nine recently excavated dolmens are remarkably well preserved. For this purpose, seventy-three radiocarbon dates were obtained and analysed within a Bayesian framework. The resulting refined chronology has led us to three main conclusions: (i) in all tombs, the second half of the 4th millennium cal BC was an intensive but brief period of funerary depositions, probably spanning three to six generations; (ii) after a long hiatus, most of the dolmens were reused between the 25th and 21st centuries cal BC during even shorter periods, spanning just a few decades and approximately one to four generations; and (iii) long after the funerary rituals had ended in the 21st century cal BC, the memory of the cemetery was revived in Late Antiquity. These short, punctuated periods of use are highly consistent with those seen in a growing number of European megalithic monuments. From Britain to Iberia, a pattern of short spans of use is dramatically changing our perception of the social and political roles of these complex monuments.
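
    The generation estimates quoted above follow from dividing a modelled span of use by an assumed generation length. As a purely illustrative worked example (the 25-year generation length is an assumption for the sketch, not a figure taken from the paper), the arithmetic is:

        # Illustrative arithmetic: converting a modelled span of funerary use into an
        # approximate number of generations, assuming a 25-year generation length.
        def generations(span_years: float, generation_length: float = 25.0) -> float:
            return span_years / generation_length

        # A use span of roughly 75-150 years corresponds to about 3-6 generations
        print(generations(75), generations(150))  # 3.0 6.0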