
    The role of genotype and production environment in determining the cooking time of dry beans (Phaseolus vulgaris L.)

    Dry bean (Phaseolus vulgaris L.) is a nutrient-dense food rich in proteins and minerals. Although a dietary staple in numerous regions, including Eastern and Southern Africa, greater utilization is limited by its long cooking time as compared with other staple foods. A fivefold genetic variability for cooking time has been identified for P. vulgaris, and to effectively incorporate the cooking time trait into bean breeding programs, knowledge of how genotypes behave across diverse environments is essential. Fourteen bean genotypes selected from market classes important to global consumers (yellow, cranberry, light red kidney, red mottled, and brown) were grown in 10 to 15 environments (combinations of locations, years, and treatments), and their cooking times were measured when either presoaked or unsoaked prior to boiling. The 15 environments included locations in North America, the Caribbean, and Eastern and Southern Africa that are used extensively for dry bean breeding. The cooking times of the 14 presoaked dry bean genotypes ranged from 16 to 156 min, with a mean of 86 min across the 15 production environments. The cooking times of the 14 dry bean genotypes left unsoaked ranged from 77 to 381 min, with a mean cooking time of 113 min. The heritability of cooking time was very high for presoaked beans (98%) and moderately high for unsoaked beans (~60%). The genotypic cooking time patterns were stable across environments. There was a positive correlation between the presoaked and unsoaked cooking times (r = .64, p < 0.0001), and two of the fastest-cooking genotypes when presoaked were also the fastest-cooking genotypes when unsoaked (G1, Cebo, yellow bean; and G4, G23086, cranberry bean). Given the sufficient genetic diversity found, limited crossover Genotype × Environment interactions, and high heritability for cooking time, it is feasible to develop fast-cooking dry bean varieties without the need for extensive testing across environments.
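
    The heritability and correlation figures above lend themselves to a small worked example. The sketch below uses simulated data only (none of the study's measurements) and shows one standard way to estimate entry-mean broad-sense heritability from a balanced genotype-by-environment table with one observation per cell, plus the correlation of genotype means between soaking treatments; the shortcut H^2 = (MS_G - MS_GxE)/MS_G is a common approximation, not necessarily the authors' exact model.

```python
# Minimal sketch with simulated data (no study measurements are used): entry-mean
# broad-sense heritability from a 14-genotype x 15-environment table of cooking
# times, plus the correlation of genotype means between soaking treatments.
# H^2 = (MS_G - MS_GxE) / MS_G is a standard shortcut for balanced trials with
# one observation per cell, not necessarily the authors' exact model.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_geno, n_env = 14, 15

# Simulated presoaked cooking times (min): large genotype effect, small G x E noise
geno_effect = rng.uniform(20, 120, size=n_geno)[:, None]
env_effect = rng.uniform(-10, 10, size=n_env)[None, :]
presoaked = 30 + geno_effect + env_effect + rng.normal(0, 5, size=(n_geno, n_env))

# Two-way ANOVA sums of squares with one observation per genotype/environment cell
grand = presoaked.mean()
geno_means = presoaked.mean(axis=1)
env_means = presoaked.mean(axis=0)
ss_g = n_env * np.sum((geno_means - grand) ** 2)
ss_e = n_geno * np.sum((env_means - grand) ** 2)
ss_ge = np.sum((presoaked - grand) ** 2) - ss_g - ss_e   # residual: G x E + error
ms_g = ss_g / (n_geno - 1)
ms_ge = ss_ge / ((n_geno - 1) * (n_env - 1))

h2 = (ms_g - ms_ge) / ms_g                               # entry-mean broad-sense heritability
print(f"H^2 (entry-mean basis): {h2:.2f}")

# Correlation of genotype means between presoaked and (simulated) unsoaked treatments
unsoaked = 1.5 * presoaked + rng.normal(0, 25, size=presoaked.shape)
r, p = stats.pearsonr(presoaked.mean(axis=1), unsoaked.mean(axis=1))
print(f"presoaked vs. unsoaked genotype means: r = {r:.2f}, p = {p:.1e}")
```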

    Cytomegalovirus (CMV) Disease Despite Weekly Preemptive CMV Strategy for Recipients of Solid Organ and Hematopoietic Stem Cell Transplantation

    BACKGROUND: Transplant recipients presenting with cytomegalovirus (CMV) disease at the time of diagnosis of CMV DNAemia pose a challenge to a preemptive CMV management strategy. However, the rate and risk factors of such failure remain uncertain. METHODS: Solid organ transplantation (SOT) and hematopoietic stem cell transplantation (HSCT) recipients with a first episode of CMV polymerase chain reaction (PCR) DNAemia within the first year posttransplantation were evaluated (n = 335). Patient records were reviewed for the presence of CMV disease at the time of CMV DNAemia diagnosis. The distribution and prevalence of CMV disease were estimated, and the odds ratio (OR) of CMV disease was modeled using logistic regression. RESULTS: The prevalence of CMV disease increased for both SOT and HSCT with increasing diagnostic CMV PCR load and with screening intervals >14 days. The only independent risk factor in multivariate analysis was increasing CMV DNAemia load of the diagnostic CMV PCR (OR = 6.16; 95% confidence interval, 2.09–18.11). Among recipients receiving weekly screening (n = 147), 16 (10.8%) had CMV disease at the time of diagnosis of CMV DNAemia (median DNAemia load 628 IU/mL; interquartile range, 432–1274); 93.8% of these cases were HSCT and lung transplant recipients. CONCLUSIONS: Despite application of weekly screening intervals, HSCT and lung transplant recipients in particular presented with CMV disease at the time of diagnosis of CMV DNAemia. Additional research to improve the management of patients at risk of presenting with CMV disease at low levels of CMV DNAemia and despite weekly screening is warranted.
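
    As a rough illustration of the logistic-regression setup described above, the sketch below models the odds of CMV disease at first DNAemia against log10 diagnostic PCR load and an indicator for screening intervals >14 days. The data, variable names, and coefficients are simulated assumptions for illustration, not the study's dataset or covariate coding.

```python
# Minimal sketch with simulated data: logistic regression for the odds of CMV
# disease at the time of first DNAemia, using log10 diagnostic PCR load and a
# ">14-day screening interval" indicator as covariates. Variable names, values,
# and coefficients are assumptions for illustration, not the study's dataset.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 335                                        # first-episode DNAemia cases, as in the abstract
log10_load = rng.normal(2.8, 0.6, n)           # log10 IU/mL of the diagnostic PCR
interval_gt14 = rng.integers(0, 2, n)          # screening interval >14 days (0/1)

# Simulated outcome: higher load and longer intervals raise the odds of disease
logit_p = -6.0 + 1.5 * log10_load + 0.7 * interval_gt14
disease = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

df = pd.DataFrame({"disease": disease,
                   "log10_load": log10_load,
                   "interval_gt14": interval_gt14})
fit = smf.logit("disease ~ log10_load + interval_gt14", data=df).fit(disp=False)

# Exponentiated coefficients are odds ratios with 95% confidence intervals
ors = pd.DataFrame({"OR": np.exp(fit.params),
                    "CI_low": np.exp(fit.conf_int()[0]),
                    "CI_high": np.exp(fit.conf_int()[1])})
print(ors.round(2))
```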

    Risk Factors for Failure of Primary (Val)ganciclovir Prophylaxis Against Cytomegalovirus Infection and Disease in Solid Organ Transplant Recipients

    Background: Rates and risk factors for cytomegalovirus (CMV) prophylaxis breakthrough and discontinuation were investigated, given uncertainty regarding optimal dosing for CMV primary (val)ganciclovir prophylaxis after solid organ transplantation (SOT). Methods: Recipients transplanted from 2012 to 2016 and initiated on primary prophylaxis were followed until 90 days post-transplantation. A (val)ganciclovir prophylaxis score for each patient per day was calculated during the follow-up time (FUT; a score of 100 corresponding to the manufacturers' recommended dose for a given estimated glomerular filtration rate [eGFR]). Cox models were used to estimate hazard ratios (HRs), adjusted for relevant risk factors. Results: Of 585 SOTs (311 kidney, 117 liver, 106 lung, 51 heart) included, 38/585 (6.5%) experienced prophylaxis breakthrough and 35/585 (6.0%) discontinued prophylaxis for other reasons. CMV IgG donor+/recipient- mismatch (adjusted HR [aHR], 5.37; 95% confidence interval [CI], 2.63 to 10.98; P < 0.001) and an increasing % of FUT with a prophylaxis score <90 (aHR, 1.16; 95% CI, 1.04 to 1.29; P = .01 per 10% longer FUT with score <90) were associated with an increased risk of breakthrough. Lung recipients were at a significantly increased risk of premature prophylaxis discontinuation (aHR, 20.2 vs kidney; 95% CI, 3.34 to 121.9; P = .001), mainly due to liver toxicity or myelotoxicity. Conclusions: Recipients of eGFR-adjusted prophylaxis doses below those recommended by manufacturers were at an increased risk of prophylaxis breakthrough, emphasizing the importance of accurate dose adjustment according to the latest eGFR and the need for novel, less toxic agents.
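
    The prophylaxis-score idea (100 = manufacturer-recommended dose for the current eGFR) and the Cox analysis can be sketched as below. The eGFR bands, dose values, cohort size, and effect sizes are placeholders rather than the study's scoring table or the drug label, and the lifelines package is assumed to be available; this is an illustrative setup, not the authors' model specification.

```python
# Minimal sketch with simulated patients: a daily (val)ganciclovir prophylaxis
# score (100 = assumed manufacturer-recommended dose for the current eGFR band),
# the % of follow-up time spent with score <90, and a Cox model for time to
# prophylaxis breakthrough. The eGFR bands, doses, and effect sizes are
# illustrative placeholders, not the study's scoring table or the drug label.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

def recommended_dose(egfr):
    """Placeholder eGFR-banded daily dose in mg (assumption, not the label)."""
    if egfr >= 60:
        return 900.0
    if egfr >= 40:
        return 450.0
    if egfr >= 25:
        return 225.0
    return 112.5

rng = np.random.default_rng(2)
rows = []
for pid in range(200):
    days = 90                                            # follow-up capped at 90 days post-transplant
    egfr = (rng.normal(55, 20) + rng.normal(0, 5, days)).clip(10, 120)
    given = rng.choice([112.5, 225.0, 450.0, 900.0], size=days)   # administered daily dose
    score = 100 * given / np.array([recommended_dose(e) for e in egfr])
    pct_low = float(np.mean(score < 90) * 100)           # % of follow-up with score <90
    mismatch = int(rng.integers(0, 2))                   # CMV IgG D+/R- (0/1)

    # Simulated time to breakthrough: underdosing and D+/R- raise the hazard
    hazard = 0.0005 * np.exp(0.02 * pct_low + 1.5 * mismatch)
    t_event = rng.exponential(1 / hazard)
    rows.append({"time": min(t_event, days),
                 "breakthrough": int(t_event <= days),
                 "pct_fut_score_lt90": pct_low,
                 "dplus_rminus": mismatch})

df = pd.DataFrame(rows)
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="breakthrough")
cph.print_summary()                                      # exp(coef) column = hazard ratio
```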

    Earliest Triassic microbialites in the South China Block and other areas; controls on their growth and distribution

    Earliest Triassic microbialites (ETMs) and inorganic carbonate crystal fans formed after the end-Permian mass extinction (ca. 251.4 Ma) within the basal Triassic Hindeodus parvus conodont zone. ETMs are distinguished from rarer, and more regional, subsequent Triassic microbialites. Large differences in ETMs between northern and southern areas of the South China block suggest geographic provinces, and ETMs are most abundant throughout the equatorial Tethys Ocean with further geographic variation. ETMs occur on shallow-marine shelves in a superanoxic stratified ocean and form the only widespread Phanerozoic microbialites with structures similar to those of the Cambro-Ordovician, and briefly after the latest Ordovician, Late Silurian, and Late Devonian extinctions. ETMs disappeared long before the mid-Triassic biotic recovery, but it is not clear why, if they are interpreted as disaster taxa. In general, ETM occurrence suggests that microbially mediated calcification occurred where upwelled carbonate-rich anoxic waters mixed with warm aerated surface waters, forming regional dysoxia, so that extreme carbonate supersaturation and dysoxic conditions were both required for their growth. Long-term oceanic and atmospheric changes may have contributed to a trigger for ETM formation. In equatorial western Pangea, the earliest microbialites are late Early Triassic, but it is possible that ETMs could exist in western Pangea, if well-preserved earliest Triassic facies are discovered in future work.

    Long-Baseline Neutrino Facility (LBNF) and Deep Underground Neutrino Experiment (DUNE) Conceptual Design Report Volume 2: The Physics Program for DUNE at LBNF

    The Physics Program for the Deep Underground Neutrino Experiment (DUNE) at the Fermilab Long-Baseline Neutrino Facility (LBNF) is described.

    Should Controls With Respiratory Symptoms Be Excluded From Case-Control Studies of Pneumonia Etiology? Reflections From the PERCH Study.

    Many pneumonia etiology case-control studies exclude controls with respiratory illness from enrollment or analyses. Herein we argue that selecting controls regardless of respiratory symptoms provides the least biased estimates of pneumonia etiology. We review 3 reasons investigators may choose to exclude controls with respiratory symptoms in light of epidemiologic principles of control selection and present data from the Pneumonia Etiology Research for Child Health (PERCH) study where relevant to assess their validity. We conclude that exclusion of controls with respiratory symptoms will result in biased estimates of etiology. Randomly selected community controls, with or without respiratory symptoms, are most representative of the general population from which cases arose and the least subject to selection bias, as long as they do not meet the criteria for case-defining pneumonia.
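
    A toy simulation can make the selection-bias argument concrete: if pathogen carriers among community controls are more likely to have respiratory symptoms, excluding symptomatic controls deflates control carriage prevalence and inflates the case-control odds ratio used in etiology estimation. The prevalences below are arbitrary assumptions, not PERCH data or its analytic model.

```python
# Toy simulation (not PERCH data or its analytic model): if pathogen carriers
# among community controls are more likely to have respiratory symptoms,
# excluding symptomatic controls lowers control carriage prevalence and inflates
# the case-control odds ratio. All prevalences below are arbitrary assumptions.
import numpy as np

rng = np.random.default_rng(3)
n_cases, n_controls = 2000, 2000

case_carrier = rng.binomial(1, 0.40, n_cases)            # carriage among pneumonia cases
ctrl_carrier = rng.binomial(1, 0.20, n_controls)         # carriage among community controls
# Controls who carry the pathogen are more likely to have (non-pneumonia) symptoms
ctrl_symptoms = rng.binomial(1, np.where(ctrl_carrier == 1, 0.50, 0.20))

def odds_ratio(case_pos, ctrl_pos, n_case, n_ctrl):
    return (case_pos * (n_ctrl - ctrl_pos)) / ((n_case - case_pos) * ctrl_pos)

# All community controls regardless of symptoms (the approach argued for above)
or_all = odds_ratio(case_carrier.sum(), ctrl_carrier.sum(), n_cases, n_controls)

# Controls restricted to those without respiratory symptoms
keep = ctrl_symptoms == 0
or_restricted = odds_ratio(case_carrier.sum(), ctrl_carrier[keep].sum(),
                           n_cases, keep.sum())

print(f"OR, all community controls:     {or_all:.2f}")
print(f"OR, asymptomatic controls only: {or_restricted:.2f}  (biased upward)")
```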

    Incidence Rates and Risk Factors of Clostridioides difficile Infection in Solid Organ and Hematopoietic Stem Cell Transplant Recipients

    BACKGROUND: Transplant recipients are an immunologically vulnerable patient group and are at elevated risk of Clostridioides difficile infection (CDI) compared with other hospitalized populations. However, risk factors for CDI post-transplant are not fully understood. // METHODS: Adults undergoing solid organ (SOT) and hematopoietic stem cell transplant (HSCT) from January 2010 to February 2017 at Rigshospitalet, University of Copenhagen, Denmark, were retrospectively included. Using nationwide data capture of all CDI cases, the incidence and risk factors of CDI were assessed. // RESULTS: A total of 1687 patients underwent SOT or HSCT (1114 and 573, respectively), with a median follow-up time (interquartile range) of 1.95 (0.52–4.11) years. CDI was diagnosed in 15% (164) and 20% (114) of the SOT and HSCT recipients, respectively. CDI rates were highest in the first 30 days post-transplant for both SOT and HSCT (adjusted incidence rate ratio [aIRR], 6.64; 95% confidence interval [CI], 4.37–10.10; and aIRR, 2.85; 95% CI, 1.83–4.43, respectively, compared with 31–180 days). For SOT recipients, pretransplant CDI and liver and lung transplant were associated with a higher risk of CDI in the first 30 days post-transplant, whereas age and liver transplant were risk factors in the later period. Among HSCT recipients, myeloablative conditioning and a higher Charlson Comorbidity Index were associated with a higher risk of CDI in the early period but not in the late period. // CONCLUSIONS: Using nationwide data, we show a high incidence of CDI among transplant recipients. Importantly, we also find that risk factors can vary relative to time post-transplant.
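
    The period-specific incidence rate ratios can be illustrated with a small Poisson-regression sketch on aggregated counts and person-time. The event counts, person-days, and the SOT/HSCT split below are invented numbers chosen only to show the mechanics of an IRR-style analysis; they are not the cohort's data, and the adjustment here is limited to transplant type.

```python
# Minimal sketch on made-up aggregated counts (not the Rigshospitalet cohort):
# CDI incidence rates per 1000 person-days by post-transplant period and a
# Poisson regression giving incidence rate ratios versus the 31-180 day
# reference period, adjusted here only for transplant type. Event counts and
# person-time are invented for illustration.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

agg = pd.DataFrame({
    "group":       ["SOT", "SOT", "SOT", "HSCT", "HSCT", "HSCT"],
    "period":      ["0-30", "31-180", ">180", "0-30", "31-180", ">180"],
    "events":      [45, 60, 70, 25, 45, 30],
    "person_days": [33_000, 150_000, 500_000, 16_000, 80_000, 180_000],
})
agg["rate_per_1000pd"] = 1000 * agg["events"] / agg["person_days"]
print(agg)

# "31-180" listed first so it becomes the reference level in the regression
agg["period"] = pd.Categorical(agg["period"], categories=["31-180", "0-30", ">180"])
fit = smf.glm("events ~ period + group", data=agg,
              family=sm.families.Poisson(),
              exposure=agg["person_days"]).fit()

irr = pd.DataFrame({"IRR": np.exp(fit.params),
                    "CI_low": np.exp(fit.conf_int()[0]),
                    "CI_high": np.exp(fit.conf_int()[1])})
print(irr.round(2))                                # period IRRs are vs. 31-180 days
```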