Managing Diseases of Alfalfa
Alfalfa can be a vigorous and productive forage crop for Kentucky farmers. Like all farm crops, however, alfalfa is subject to infectious diseases that can limit forage production. Managing these diseases is an important part of economical alfalfa production.
Alfalfa diseases can cause reduced forage yield, reduced forage quality, and decreased stand persistence. Sometimes, the effects of infectious diseases can be dramatic, such as sudden stand loss in a fall-seeded crop caused by Sclerotinia crown and stem rot. Often, the effects of diseases on alfalfa are more subtle but no less important. For example, alfalfa plants with Phytophthora root rot sometimes regrow slowly following cutting, resulting in a stunted stand showing no other obvious symptoms of disease.
Other diseases, such as bacterial wilt, can kill a few scattered plants between each cutting. Over a period of a few seasons, this can result in a gradual yet substantial loss of plant stand. Several diseases, including crown rot diseases and Phytophthora root rot, also predispose alfalfa to winter injury. In some cases, stand loss during the winter may be blamed on winterkill when an infectious disease ultimately may be involved.
Advances in Alfalfa Variety Development and Testing
Alfalfa (Medicago sativa) is historically the highest yielding, highest quality forage legume grown in Kentucky. It forms the basis of Kentucky's cash hay enterprise and is an important component in dairy, horse, beef, and sheep diets. Over 300,000 acres of alfalfa are grown annually in Kentucky, with state yields averaging between 3 and 4 tons per acre.
The development and testing of alfalfa varieties is a dynamic process that impacts all Kentucky farmers. The Kentucky Alfalfa Variety Testing program was restarted in 1990 and is carried out through the efforts of several people, including Leonard Lauriault, Linda Brown (Western Kentucky University), Garry Lacefield, Paul Vincelli, and John Parr. Alfalfa varieties are being studied for yield in six plot studies across three locations (Lexington, Bowling Green, and Princeton). Other research being conducted includes the effect of Aphanomyces root rot on variety yield and persistence and the effect of variety on forage quality.
1994 Kentucky Bluegrass Variety Test Report
Kentucky bluegrass (Poa pratensis) is the third most prominent cool-season grass used in Kentucky for forage, behind tall fescue and orchardgrass. As with all cool-season grasses, Kentucky bluegrass does best in cooler weather, becoming relatively non-productive in hot, dry conditions. It is a high-quality, long-lived, rhizomatous grass that is used for both turf and forage. Compared to other cool-season grasses, Kentucky bluegrass is slower to germinate (2-3 weeks) and generally is lower in seedling vigor and herbage yield. Most recent varieties have been developed for turf use. Several have been used in horse pastures even though they were not developed for forage use, because Kentucky bluegrass is a low-growing species that is tolerant of close grazing by horses. It is highly palatable to horses and has no known toxicities. In horse pastures, Kentucky bluegrass grows well with white clover, a low-growing, grazing-tolerant legume that is also a favorite of horse pasture managers. While it is better suited for use by grazing animals, Kentucky bluegrass may be harvested as hay. Management is similar to that for other cool-season grasses.
Aphanomyces-Resistant Alfalfa: A Solution to a Common Problem in Spring Seedings
For several decades, farmers have experienced a common stand-establishment disease syndrome when spring-seeded alfalfa was followed by extended periods of wet weather. Seedlings affected by this syndrome exhibit severe stunting as well as yellowing and reddening of seed leaves (cotyledons), but they do not wilt or collapse, as they might from a damping-off disease. Commonly, the problem affects most or all of the field.
Based on research that began in the 1980s, we suspected that a fungus called Aphanomyces euteiches (hereafter simply called Aphanomyces) was responsible. This root-rot fungus can be found in the majority of alfalfa fields we have sampled in central and western Kentucky. However, for many years we lacked conclusive proof that Aphanomyces was, in fact, the cause of this common problem in spring-seeded alfalfa. We also did not have rigorous proof that the syndrome could be avoided by sowing Aphanomyces-resistant alfalfa varieties, which started becoming commercially available in the early 1990s. In this report, we provide a brief summary of research to support our new recommendation: that spring-seeded alfalfa should be sown only with varieties having an R or HR rating to Aphanomyces root rot (ARR).
Meta-Analysis of Yield Response of Hybrid Field Corn to Foliar Fungicides in the U.S. Corn Belt
The use of foliar fungicides on field corn has increased greatly over the past 5 years in the United States in an attempt to increase yields, despite limited evidence that use of the fungicides is consistently profitable. To assess the value of using fungicides in grain corn production, random-effects meta-analyses were performed on results from foliar fungicide experiments conducted during 2002 to 2009 in 14 states across the United States to determine the mean yield response to the fungicides azoxystrobin, pyraclostrobin, propiconazole + trifloxystrobin, and propiconazole + azoxystrobin. For all fungicides, the yield difference between treated and nontreated plots was highly variable among studies. All four fungicides resulted in a significant mean yield increase relative to the nontreated plots (P < 0.05). Mean yield difference was highest for propiconazole + trifloxystrobin (390 kg/ha), followed by propiconazole + azoxystrobin (331 kg/ha) and pyraclostrobin (256 kg/ha), and lowest for azoxystrobin (230 kg/ha). Baseline yield (mean yield in the nontreated plots) had a significant effect on yield response for propiconazole + azoxystrobin (P < 0.05), whereas baseline foliar disease severity (mean severity in the nontreated plots) significantly affected the yield response to pyraclostrobin, propiconazole + trifloxystrobin, and propiconazole + azoxystrobin but not to azoxystrobin. Mean yield difference was generally higher in the lowest yield and higher disease severity categories than in the highest yield and lower disease severity categories. The probability of failing to recover the fungicide application cost (ploss) also was estimated for a range of grain corn prices and application costs. At the 10-year average corn grain price of $2.97/bushel and application costs of $40 to $95/ha, ploss for disease severity <5% was 0.55 to 0.98 for pyraclostrobin, 0.62 to 0.93 for propiconazole + trifloxystrobin, 0.58 to 0.89 for propiconazole + azoxystrobin, and 0.91 to 0.99 for azoxystrobin. When disease severity was >5%, the corresponding probabilities were 0.36 to 0.95, 0.25 to 0.69, 0.25 to 0.64, and 0.37 to 0.98 for the four fungicides. In conclusion, the high ploss values found in most scenarios suggest that the use of these foliar fungicides is unlikely to be profitable when foliar disease severity is low and yield expectation is high.
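To make the economics above concrete, the short Python sketch below is a minimal break-even illustration only, not the authors' random-effects meta-analysis. It assumes a standard shelled-corn test weight of 25.4 kg per bushel (an assumption added here) and compares the mean yield gains reported in the abstract against the yield needed to recover the stated application costs at the cited grain price:

# Illustrative break-even arithmetic (not the authors' meta-analysis code):
# how much extra yield must a fungicide deliver just to pay for itself?

CORN_KG_PER_BUSHEL = 25.4          # assumption: 56 lb/bu standard test weight for shelled corn
price_per_bushel = 2.97            # USD, 10-year average price cited in the abstract
price_per_kg = price_per_bushel / CORN_KG_PER_BUSHEL

mean_yield_gain = {                # kg/ha, mean treated-minus-nontreated differences from the abstract
    "propiconazole + trifloxystrobin": 390,
    "propiconazole + azoxystrobin": 331,
    "pyraclostrobin": 256,
    "azoxystrobin": 230,
}

for cost_per_ha in (40, 95):       # USD/ha, range of application costs cited in the abstract
    breakeven_gain = cost_per_ha / price_per_kg
    print(f"Application cost ${cost_per_ha}/ha -> break-even gain {breakeven_gain:.0f} kg/ha")
    for product, gain in mean_yield_gain.items():
        net = gain * price_per_kg - cost_per_ha
        print(f"  {product}: mean gain {gain} kg/ha, net {net:+.0f} USD/ha")

Under these assumptions the break-even yield gain ranges from roughly 340 to 810 kg/ha, which exceeds most of the reported mean gains and is consistent with the high ploss values reported above.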
Elotuzumab plus pomalidomide and dexamethasone in relapsed/refractory multiple myeloma: a multicenter, retrospective, real-world experience with 200 cases outside of controlled clinical trials
In the ELOQUENT-3 trial, the combination of elotuzumab, pomalidomide, and dexamethasone (EloPd) proved to have a superior clinical benefit over pomalidomide and dexamethasone with a manageable toxicity profile, leading to its approval for the treatment of patients with relapsed/refractory multiple myeloma (RRMM) who have received at least two prior therapies, including lenalidomide and a proteasome inhibitor. We report here a real-world experience of 200 cases of RRMM treated with EloPd in 35 Italian centers outside of clinical trials. In our dataset, the median number of prior lines of therapy was two, with 51% of cases having undergone autologous stem cell transplant and 73% having been exposed to daratumumab. After a median follow-up of 9 months, 126 patients had stopped EloPd, most of them (88.9%) because of disease progression. The overall response rate was 55.4%, a finding in line with the pivotal trial results. Regarding adverse events, the toxicity profile in our cohort was similar to that in the ELOQUENT-3 trial, with no significant differences between younger (<70 years) and older patients. The median progression-free survival was 7 months, which was shorter than that observed in ELOQUENT-3, probably because of the different clinical characteristics of the two cohorts. Interestingly, International Staging System stage III disease was associated with worse progression-free survival (hazard ratio = 2.55). Finally, the median overall survival of our series was shorter than that observed in the ELOQUENT-3 trial (17.5 vs. 29.8 months). In conclusion, our real-world study confirms that EloPd is a safe and feasible therapeutic choice for patients with RRMM who have received at least two prior therapies, including lenalidomide and a proteasome inhibitor.
Elevated lactate dehydrogenase has prognostic relevance in treatment-naïve patients affected by chronic lymphocytic leukemia with trisomy 12
Chronic lymphocytic leukemia (CLL) patients with +12 have been reported to have specific clinical and biologic features. We performed an analysis of the association between demographic, clinical, laboratory, and biologic features and outcome in CLL patients with +12 to identify parameters predictive of disease progression, time to treatment, and survival. The study included 487 treatment-naïve CLL patients with +12 from 15 academic centers, diagnosed between January 2000 and July 2016, and 816 treatment-naïve patients with no fluorescence in situ hybridization (FISH) abnormalities. A cohort of 250 patients with +12 CLL followed at a single US institution was used for external validation. In patients with +12, parameters associated with worse prognosis in the multivariate model were high lactate dehydrogenase (LDH) and β2-microglobulin levels and unmutated immunoglobulin heavy-chain variable region genes (IGHV). CLL patients with +12 and high LDH levels showed shorter progression-free survival (PFS) (30 months vs. 65 months; p < 0.001), treatment-free survival (TFS) (33 months vs. 69 months; p < 0.001), and overall survival (OS) (131 months vs. 181 months; p < 0.001), and greater CLL-related mortality (29% vs. 11% at 10 years; p < 0.001) when compared with +12 CLL patients with normal LDH levels. The same differences were observed in the validation cohort. These data suggest that serum LDH levels can predict PFS, TFS, OS, and CLL-specific survival in CLL patients with +12.
Choice of Frontline Tyrosine-Kinase Inhibitor and Early Events in Very Elderly Patients With Chronic Myeloid Leukemia in Chronic Phase: A "Campus CML" Study
Objectives: The study aimed to evaluate the utilization of frontline TKI therapy in a large cohort of elderly CP-CML patients. Methods: A retrospective analysis was conducted on 332 CP-CML patients aged 75 years or older, among 1929 diagnosed from January 2012 to December 2019 and followed at 36 participating hematology centers involved in the "Campus CML" project. Results: Among the patients analyzed, 85.8% received imatinib (IM) and 14.2% received a second-generation TKI (2G-TKI): 59.5% dasatinib and 40.5% nilotinib. Most patients initiated IM at the standard dose (67.3%), while 32.7% started at a reduced dose. A similar trend was observed with 2G-TKIs. The cumulative incidence of permanent TKI discontinuation at 12 months was 28.4%, primarily due to primary resistance (10.1%) and extra-hematologic toxicity (9.5%), with no significant difference between the IM and 2G-TKI groups. Following the introduction of generic IM in Italy in 2018, IM usage increased significantly compared with 2G-TKIs. Conclusions: IM was the preferred frontline therapy for older CP-CML patients in our centers, with increasing utilization after the introduction of generic formulations. However, 2G-TKIs are still used in a substantial proportion of patients, suggesting individualized physician assessments regarding patient suitability and expectations. Further investigation is needed to assess the efficacy and safety of reduced TKI doses in this patient population.