5 research outputs found

    The USDA Barley Core Collection: Genetic Diversity, Population Structure, and Potential for Genome-Wide Association Studies

    New sources of genetic diversity must be incorporated into plant breeding programs if they are to continue increasing grain yield and quality, and tolerance to abiotic and biotic stresses. Germplasm collections provide a source of genetic and phenotypic diversity, but characterization of these resources is required to increase their utility for breeding programs. We used a barley SNP iSelect platform with 7,842 SNPs to genotype 2,417 barley accessions sampled from the USDA National Small Grains Collection of 33,176 accessions. Most of the accessions in this core collection are categorized as landraces or cultivars/breeding lines and were obtained from more than 100 countries. Both STRUCTURE and principal component analysis identified five major subpopulations within the core collection, mainly differentiated by geographical origin and spike row number (an inflorescence architecture trait). Different patterns of linkage disequilibrium (LD) were found across the barley genome, and many regions of high LD contained traits involved in domestication and breeding selection. The genotype data were used to define 'mini-core' sets of accessions capturing the majority of the allelic diversity present in the core collection. These 'mini-core' sets can be used for evaluating traits that are difficult or expensive to score. Genome-wide association studies (GWAS) of 'hull cover', 'spike row number', and 'heading date' demonstrate the utility of the core collection for locating genetic factors determining important phenotypes. The GWAS results were referenced to a new barley consensus map containing 5,665 SNPs. Our results demonstrate that GWAS and high-density SNP genotyping are effective tools for plant breeders interested in accessing genetic diversity in large germplasm collections.
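The single-marker association scan underlying a GWAS like this can be sketched as follows. This is a minimal illustration with simulated toy data (accession counts, SNP counts, and effect sizes are invented, not from the study), regressing a phenotype on each SNP's allele dosage and applying a Bonferroni threshold; it is not the authors' analysis pipeline.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Toy data: 200 accessions genotyped at 50 SNPs coded as 0/1/2 allele dosage.
# All sizes here are illustrative, not taken from the study.
n_acc, n_snp = 200, 50
genotypes = rng.integers(0, 3, size=(n_acc, n_snp)).astype(float)

# Simulate a quantitative phenotype (e.g. heading date) with a true
# effect at SNP index 10 plus random noise.
phenotype = 5.0 * genotypes[:, 10] + rng.normal(0.0, 2.0, n_acc)

# Single-marker GWAS: regress the phenotype on each SNP in turn and
# collect the per-marker p-values.
p_values = np.array([
    stats.linregress(genotypes[:, j], phenotype).pvalue
    for j in range(n_snp)
])

# Bonferroni-corrected significance threshold for 50 tests.
significant = np.where(p_values < 0.05 / n_snp)[0]
print("significant SNPs:", significant)
```

Real analyses additionally correct for population structure (e.g. the five subpopulations identified by STRUCTURE/PCA) with mixed models, since structure alone can create spurious marker-trait associations.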

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) compared with that in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
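The odds ratios above come from multivariable logistic regression, but the unadjusted version of the middle- vs high-HDI comparison can be reproduced directly from the counts reported in the abstract. The sketch below computes a crude odds ratio with a Wald 95% confidence interval on the log scale; it lands close to, but not exactly at, the reported adjusted OR of 0.17, since the paper adjusts for patient and disease factors.

```python
import math

# Counts from the abstract: checklist use before emergency laparotomy.
used_high, total_high = 2455, 2741   # high-HDI countries
used_mid, total_mid = 753, 1242      # middle-HDI countries

# Crude (unadjusted) odds ratio for middle- vs high-HDI countries.
odds_high = used_high / (total_high - used_high)
odds_mid = used_mid / (total_mid - used_mid)
odds_ratio = odds_mid / odds_high

# Wald 95% CI on the log-odds-ratio scale.
se = math.sqrt(1/used_mid + 1/(total_mid - used_mid)
               + 1/used_high + 1/(total_high - used_high))
lo = math.exp(math.log(odds_ratio) - 1.96 * se)
hi = math.exp(math.log(odds_ratio) + 1.96 * se)
print(f"OR {odds_ratio:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

The crude estimate is about OR 0.18, in line with the adjusted 0.17 (0.14 to 0.21) reported after multivariable modelling.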

    Assessing for genetic and environmental effects on ruminant feed quality in barley (Hordeum vulgare)

    Grain samples from a combined intermediate and advanced stage barley breeding trial series, grown at two sites in two consecutive years, were assessed for detailed grain quality and ruminant feed quality. The results indicated that there were significant genetic and environmental effects for "feed" traits as measured using grain hardness, acid detergent fibre (ADF), starch and in sacco dry matter digestibility (ISDMD) assays. In addition, there was strong genotypic discrimination for the regressed feed performance traits, namely Net Energy (NE) and Average Daily Gain (ADG). There was considerable variation in genetic correlations for all traits based on variance from the cultivars used, sites or laboratory processing effects. There was a high level of heritability, ranging from 88% to 89% for retention, 60% to 80% for protein and 56% to 68% for ADF. However, there were only low to moderate levels of heritability for the feed traits: starch 30-39%, ISDMD 55-63%, ADF 56-68%, particle size 47-73%, NE 31-48% and ADG 44-51%. These results suggest that there were real differences in the feed performance of barleys and that selection for cattle feed quality is potentially a viable option for breeding programs.
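The heritability estimates above are derived from variance components. A common way to obtain them from a replicated trial is a one-way ANOVA across genotypes, where the expected genotype mean square is r·Vg + Ve. The sketch below illustrates this with simulated data; the line counts, replicate numbers, and variances are invented for illustration and are not the trial's values or method of analysis.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical trial: 40 barley lines, 3 replicates each. The true
# genetic and error variances are made up for this illustration.
n_lines, n_reps = 40, 3
true_vg, true_ve = 4.0, 2.25
data = (rng.normal(0.0, np.sqrt(true_vg), n_lines)[:, None]
        + rng.normal(0.0, np.sqrt(true_ve), (n_lines, n_reps)))

# One-way ANOVA mean squares (between genotypes vs residual).
line_means = data.mean(axis=1)
ms_geno = n_reps * np.sum((line_means - data.mean()) ** 2) / (n_lines - 1)
ms_error = np.sum((data - line_means[:, None]) ** 2) / (n_lines * (n_reps - 1))

# Variance components: E[MS_geno] = r*Vg + Ve, E[MS_error] = Ve.
v_g = max((ms_geno - ms_error) / n_reps, 0.0)
v_e = ms_error

# Broad-sense heritability on a single-plot basis.
h2 = v_g / (v_g + v_e)
print(f"H2 = {h2:.2f}")
```

With the simulated variances above the expected value is Vg/(Vg+Ve) = 4.0/6.25 = 0.64; any single trial's estimate scatters around that, which is why the abstract reports heritability as a range across sites and years.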

    Crops that feed the world 4. Barley: a resilient crop? Strengths and weaknesses in the context of food security
