
    Maintaining Seed Quality of Maize and Wheat through Dry Chain Technology in Pakistan

    Seed inevitably deteriorates during storage, and high seed moisture content is the primary cause of this decline in seed quality. The dry chain, which uses moisture-proof hermetic containers to preserve seed quality throughout the supply chain, is a valuable tool for preventing it. This study evaluated and compared the performance of wheat and maize seed stored for six months under ambient conditions in different hermetic packaging (Super bag, Anaaji bag and drum) against conventionally used woven polypropylene bags. Seed moisture content increased to 11.53% in wheat and 13.55% in maize when packed in polypropylene bags, while it remained low (approximately 10% and 11.4% in wheat and maize, respectively) when packed in the hermetically sealed bags and drum. Germination was maintained in both cereal seeds stored in the hermetically sealed Super bag, Anaaji bag and drum, whereas it declined in polypropylene bags relative to initial seed quality. Seed stored in polypropylene bags deteriorated quickly, resulting in a loss of seed vigour as indicated by higher malondialdehyde content and higher electrical conductivity of seed leachates. It can be concluded that maintaining seed dryness with hermetic storage is useful for preserving seed quality and related attributes under high relative humidity environments.
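
    The comparison described above amounts to contrasting seed-quality metrics (moisture, germination, malondialdehyde, electrical conductivity) between hermetic and conventional packaging. A minimal sketch of such a comparison, using hypothetical replicate germination values and a Welch t-test (the abstract does not state which statistical tests the study used):

    ```python
    # Hedged sketch: compare germination of hermetically stored vs. polypropylene-stored
    # seed after six months. Values are hypothetical placeholders, not study data.
    import numpy as np
    from scipy import stats

    germination_hermetic = np.array([92, 90, 93, 91])   # % germination, replicate lots (hypothetical)
    germination_polyprop = np.array([78, 75, 80, 77])   # % germination, replicate lots (hypothetical)

    # Welch's t-test: does not assume equal variances between packaging types
    t_stat, p_value = stats.ttest_ind(germination_hermetic, germination_polyprop, equal_var=False)
    print(f"mean difference = {germination_hermetic.mean() - germination_polyprop.mean():.1f} pp, "
          f"t = {t_stat:.2f}, p = {p_value:.4f}")
    ```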

    Frequency and Practice-Level Variation in Inappropriate Aspirin Use for the Primary Prevention of Cardiovascular Disease Insights From the National Cardiovascular Disease Registry’s Practice Innovation and Clinical Excellence Registry

    Background Among patients without cardiovascular disease (CVD) and with low 10-year CVD risk, the risks of gastrointestinal bleeding and hemorrhagic stroke associated with aspirin use outweigh any potential atheroprotective benefit. According to the guidelines on primary prevention of CVD, aspirin use is considered appropriate only in patients with a 10-year CVD risk ≥6% and inappropriate in patients with a 10-year CVD risk <6%. Objectives The goal of this study was to examine the frequency and practice-level variation in inappropriate aspirin use for primary prevention in a large U.S. nationwide registry. Methods Within the National Cardiovascular Disease Registry’s Practice Innovation and Clinical Excellence registry, we assessed 68,808 unique patients receiving aspirin for primary prevention from 119 U.S. practices. The frequency of inappropriate aspirin use for primary prevention (aspirin use in those with a 10-year CVD risk <6%) was determined, and the extent of practice-level variation was assessed with the median rate ratio (MRR) from hierarchical regression models. Results The frequency of inappropriate aspirin use was 11.6% (7,972 of 68,808) in the overall cohort. There was significant practice-level variation in inappropriate use (range 0% to 71.8%; median 10.1%; interquartile range 6.4%); the adjusted MRR was 1.63 (95% confidence interval [CI]: 1.47 to 1.77). Results remained consistent after excluding 21,052 women age ≥65 years (inappropriate aspirin use 15.2%; median practice-level inappropriate aspirin use 13.8%; interquartile range 8.2%; adjusted MRR 1.61 [95% CI: 1.46 to 1.75]) and after excluding patients with diabetes (inappropriate aspirin use 13.9%; median practice-level inappropriate aspirin use 12.4%; interquartile range 7.6%; adjusted MRR 1.55 [95% CI: 1.41 to 1.67]). Conclusions More than 1 in 10 patients in this national registry were receiving inappropriate aspirin therapy for primary prevention, with significant practice-level variation. Our findings suggest that there are important opportunities to improve evidence-based aspirin use for the primary prevention of CVD.
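
    The median rate ratio quoted above summarizes between-practice heterogeneity estimated by a hierarchical (random-intercept) model. A minimal sketch of how an MRR is derived from a practice-level variance component, assuming a hypothetical variance value rather than the registry model's actual output:

    ```python
    # Hedged sketch: median rate ratio (MRR) from a practice-level random-intercept variance.
    # sigma2_practice is a hypothetical placeholder, chosen only so the result lands near the
    # reported 1.63; it is not the study's estimated variance component.
    import math
    from scipy.stats import norm

    sigma2_practice = 0.26   # variance of practice-level random intercepts (hypothetical)

    # MRR = exp( sqrt(2 * sigma2) * Phi^{-1}(0.75) ): the median ratio of rates between two
    # randomly chosen practices for otherwise identical patients.
    mrr = math.exp(math.sqrt(2 * sigma2_practice) * norm.ppf(0.75))
    print(f"median rate ratio ≈ {mrr:.2f}")
    ```

    The same formula is commonly used for the median odds ratio in multilevel logistic models; values near 1 indicate little between-practice variation, while larger values indicate substantial variation.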

    Utilisation of an operative difficulty grading scale for laparoscopic cholecystectomy

    Background A reliable system for grading the operative difficulty of laparoscopic cholecystectomy would standardise the description of findings and the reporting of outcomes. The aim of this study was to validate a difficulty grading system (Nassar scale), testing its applicability and consistency in two large prospective datasets. Methods Patient and disease-related variables and 30-day outcomes were identified in two prospective cholecystectomy databases: the multi-centre prospective cohort of 8820 patients from the recent CholeS Study and a single-surgeon series containing 4089 patients. Operative data and patient outcomes were correlated with the Nassar operative difficulty scale using Kendall’s tau for dichotomous variables or Jonckheere–Terpstra tests for continuous variables. A ROC curve analysis was performed to quantify the predictive accuracy of the scale for each outcome, with continuous outcomes dichotomised prior to analysis. Results A higher operative difficulty grade was consistently associated with worse outcomes for the patients in both the reference and CholeS cohorts. The median length of stay increased from 0 to 4 days, and the 30-day complication rate from 7.6% to 24.4%, as the difficulty grade increased from 1 to 4/5 (both p < 0.001). In the CholeS cohort, a higher difficulty grade was most strongly associated with conversion to open surgery and 30-day mortality (AUROC = 0.903 and 0.822, respectively). On multivariable analysis, the Nassar operative difficulty scale was a significant independent predictor of operative duration, conversion to open surgery, 30-day complications and 30-day reintervention (all p < 0.001). Conclusion We have shown that an operative difficulty scale can standardise the description of operative findings by multiple grades of surgeons to facilitate audit, training assessment and research. It provides a tool for reporting operative findings, disease severity and technical difficulty, and it can be utilised in future research to reliably compare outcomes according to case mix and intra-operative difficulty.
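
    The AUROC values above come from using the ordinal difficulty grade as a score for each dichotomised outcome. A minimal sketch of that calculation, assuming small hypothetical arrays rather than the CholeS or reference-series data:

    ```python
    # Hedged sketch: AUROC of an ordinal difficulty grade (1-5) as a predictor of a binary
    # outcome such as conversion to open surgery. Data are hypothetical placeholders.
    import numpy as np
    from sklearn.metrics import roc_auc_score

    difficulty_grade = np.array([1, 2, 2, 3, 3, 4, 4, 5, 1, 2, 3, 5])   # Nassar grade per patient (hypothetical)
    converted_to_open = np.array([0, 0, 0, 0, 1, 0, 1, 1, 0, 0, 0, 1])  # 1 = converted to open (hypothetical)

    auroc = roc_auc_score(converted_to_open, difficulty_grade)
    print(f"AUROC for conversion to open ≈ {auroc:.3f}")
    ```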

    Genetic analysis indicates superiority of performance of cape gooseberry (Physalis peruviana L.) hybrids

    The use of hybrids as a new type of cape gooseberry (Physalis peruviana L.) cultivar could improve yield in this crop, but little or no information is available on hybrid performance. We studied several vegetative characters, yield, fruit weight and fruit shape, soluble solids content (SSC), titratable acidity (TA) and ascorbic acid content (AAC) in three hybrids of cape gooseberry and their parents grown outdoors and in a glasshouse. The highest yields were obtained with the hybrids, especially in the glasshouse. The dominance × environment interaction for yield was very important; a higher dominance effect was detected in the glasshouse than outdoors. Quality characters were highly affected by the environment and showed variable results for the different families. For fruit composition traits, the additive and additive × environment interaction effects were most important. Broad-sense heritability for all characters was medium to high (0.48-0.91), indicating that a high response to selection would be expected. Hybrids can improve cape gooseberry yield without impairing fruit quality.
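
    Broad-sense heritability as reported above is the share of phenotypic variance attributable to total genetic variance. A minimal sketch of that calculation from variance components, with hypothetical values (the abstract does not give the study's actual components):

    ```python
    # Hedged sketch: broad-sense heritability H^2 = V_G / (V_G + V_E).
    # Variance components are hypothetical placeholders, not the study's estimates.
    v_genetic = 4.2        # total genetic variance among families (hypothetical)
    v_environment = 1.8    # environmental (residual) variance (hypothetical)

    h2_broad = v_genetic / (v_genetic + v_environment)
    print(f"broad-sense heritability H^2 = {h2_broad:.2f}")   # 0.70, within the reported 0.48-0.91 range
    ```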

    Fitness Tradeoffs of Antibiotic Resistance in Extraintestinal Pathogenic Escherichia coli

    Evolutionary trade-offs occur when selection on one trait has detrimental effects on other traits. In pathogenic microbes, it has been hypothesized that antibiotic resistance trades off with fitness in the absence of antibiotic. Although studies of single resistance mutations support this hypothesis, it is unclear whether trade-offs are maintained over time, due to compensatory evolution and broader effects of genetic background. Here, we leverage natural variation in 39 extraintestinal clinical isolates of Escherichia coli to assess trade-offs between growth rates and resistance to fluoroquinolone and cephalosporin antibiotics. Whole-genome sequencing identifies a broad range of clinically relevant resistance determinants in these strains. We find evidence for a negative correlation between growth rate and antibiotic resistance, consistent with a persistent trade-off between resistance and growth.
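
    The negative correlation reported above is essentially a rank or linear correlation between per-isolate growth rates and resistance measurements. A minimal sketch using hypothetical values for a handful of isolates (not the 39 clinical isolates analysed in the study):

    ```python
    # Hedged sketch: correlation between growth rate and antibiotic resistance across isolates.
    # Values are hypothetical placeholders, not measurements from the clinical isolates.
    import numpy as np
    from scipy import stats

    growth_rate = np.array([0.95, 0.88, 0.80, 0.75, 0.70, 0.92, 0.65])   # relative growth rate (hypothetical)
    log2_mic = np.array([0.0, 1.0, 3.0, 4.0, 5.0, 0.0, 6.0])             # log2 MIC of an antibiotic (hypothetical)

    rho, p_value = stats.spearmanr(growth_rate, log2_mic)
    print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")   # negative rho is consistent with a fitness trade-off
    ```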

    BNP and obesity in acute decompensated heart failure with preserved vs. reduced ejection fraction: The Atherosclerosis Risk in Communities Surveillance Study

    Background Levels of B-type natriuretic peptide (BNP), a prognostic marker in patients with heart failure (HF), are lower among HF patients with obesity or preserved left ventricular ejection fraction (LVEF). We examined the distribution and prognostic value of BNP across BMI categories in acute decompensated heart failure (ADHF) patients with preserved vs. reduced LVEF. Methods We analyzed data from the Atherosclerosis Risk in Communities (ARIC) HF surveillance study, which sampled and adjudicated ADHF hospitalizations in patients aged ≥ 55 years from 4 US communities (2005–2009). We examined 5 BMI categories: underweight (< 18.5 kg/m2), normal weight (18.5–<25), overweight (25–<30), obese (30–<40) and morbidly obese (≥ 40), in HF with preserved LVEF (HFpEF) and reduced LVEF (HFrEF). The outcome was 1-year mortality from admission. We used ANCOVA to model log BNP and logistic regression for 1-year mortality, both adjusted for demographics and clinical characteristics. Results The cohort included 9820 weighted ADHF hospitalizations (58% HFrEF; 42% HFpEF). BNP levels were lower in HFpEF compared to HFrEF (p < 0.001) and decreased as BMI increased within the LVEF groups (p < 0.001). After adjustment for covariates, log10 BNP independently predicted 1-year mortality (adjusted OR 1.62; 95% CI 1.17–2.24), with no significant interaction by BMI or LVEF group. Conclusions BNP levels correlated inversely with BMI and were higher in HFrEF than in HFpEF. A significant proportion of obese patients with HFpEF and ADHF had BNP levels below clinically accepted thresholds. Nevertheless, BNP was a predictor of mortality in ADHF across BMI groups in both HFpEF and HFrEF.
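
    The modelling described above pairs a linear model on log-transformed BNP with a covariate-adjusted logistic model for 1-year mortality. A minimal sketch of the logistic part, using a synthetic data frame whose column names and values are assumptions standing in for the ARIC surveillance variables:

    ```python
    # Hedged sketch: logistic regression of 1-year mortality on log10(BNP), adjusted for
    # covariates. The synthetic data only mimic the structure of the ARIC surveillance data;
    # column names and values are assumptions, not the study's variables.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 500
    df = pd.DataFrame({
        "bnp": rng.lognormal(mean=6.5, sigma=1.0, size=n),                 # pg/mL (synthetic)
        "age": rng.integers(55, 95, size=n),
        "bmi_cat": rng.choice(["normal", "overweight", "obese"], size=n),
        "lvef_group": rng.choice(["HFpEF", "HFrEF"], size=n),
    })
    df["log10_bnp"] = np.log10(df["bnp"])
    # Synthetic outcome loosely tied to BNP so the model has some signal to recover
    df["died_1yr"] = (rng.random(n) < 1 / (1 + np.exp(-0.8 * (df["log10_bnp"] - 2.8)))).astype(int)

    model = smf.logit("died_1yr ~ log10_bnp + age + C(bmi_cat) + C(lvef_group)", data=df).fit(disp=0)
    print(np.exp(model.params))   # odds ratios; log10_bnp gives the OR per 10-fold increase in BNP
    ```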