6 research outputs found

    Integrated Management Practices for Establishing Upland Switchgrass Varieties

    Establishment of switchgrass (Panicum virgatum L.) is challenging, and establishment failure may expose growers to considerable economic risk. The objectives of this research were to (i) evaluate whether management practices for switchgrass establishment are variety-specific and (ii) assess the effectiveness of cover crops as preceding crops on ‘Shawnee’ switchgrass establishment. Two studies were conducted at the University of Massachusetts Agricultural Experiment Station in Deerfield, MA, USA, in the 2011–2012 and 2012–2013 growing seasons. In Experiment 1, cover crop treatments (fallow, oat (Avena sativa L.), and rye (Secale cereale L.)) were the main plots, the two seeding methods (no-till drill and a cultipacker seeder (Brillion)) were the sub-plots, and the two varieties (‘Cave-in-Rock’ (CIR) and Shawnee) were the sub-sub-plots. Experiment 2 used Shawnee switchgrass with the three cover crop treatments from Experiment 1 and a cultipacker seeder, with seed firming before planting but not afterwards (consistent across both experiments). The results indicated that the combination of oat and no-till seeding resulted in higher tiller density (493%), lower weed biomass (77%), greater switchgrass biomass (SGB) (283%), and a higher SGB to weed biomass (WB) ratio. Compared with Shawnee, CIR planted into winter-killed oat residue had higher tiller density (93%), lower weed biomass (18%), higher switchgrass yield (128%), and thus a greater SGB:WB ratio (507%). Trends of switchgrass response to management practices, however, were similar between the two varieties, suggesting that seed quality, rather than variety-specific requirements, may explain differences in response to management practices. In Experiment 2, Shawnee tiller density was suppressed by rye as the preceding crop, possibly due to late termination of the rye. Shawnee switchgrass yields were below 1000 kg ha−1 under all management practices; thus, harvesting should begin in the year following establishment. Future research should compare no-till drilling with a cultipacker seeder that rolls both before and after seeding to increase seed–soil contact.

    Intercropping annual medic (Medicago scutellata L.) with barley (Hordeum vulgare L.) may improve total forage and crude protein yield in semi-arid environment

    In arid and semi-arid conditions, production of high-yielding, quality forage is still a challenge. Intercropping of cereals with annual forage legumes may improve forage yield and increase on-farm protein production. A two-year field experiment was conducted during the 2009 and 2010 growing seasons at the experimental farm of the University of Tehran, Iran, to determine whether intercropping of barley (Hordeum vulgare L.) and annual medic (Medicago scutellata L.) could produce a sufficient amount of forage with higher protein content. A randomized complete block design with four replicates and eight cropping patterns [1B:1M (one row of barley : one row of annual medic), 2B:2M, 4B:4M, 6B:6M, 6B:2M, 4B:2M, 2B:4M, and 2B:6M], along with pure stands of barley and annual medic, was implemented. Land equivalent ratio (LER) was highest (1.19) when barley was intercropped with annual medic in the 1B:1M arrangement, indicating that 19% more land area would be required by a sole cropping system to produce the yield achieved by the intercropping system. Calculated partial LER, aggressivity (A), and competitive ratio (CR) indicated that barley was the dominant species in most of the barley–annual medic cropping patterns. Based on results from LER, the system productivity index (SPI), and the monetary advantage index (MAI), it was concluded that the 1B:1M cropping pattern was superior to either barley or annual medic monocropping. The results of this study revealed that the total protein yield of barley and annual medic forage in the selected intercropping patterns, specifically 1B:1M, could be enhanced while the total harvested dry matter remained unchanged.
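The competition indices named in this abstract follow standard definitions: partial LER is intercrop yield over sole-crop yield, while aggressivity and competitive ratio weight the partial yields by sown proportions. A minimal sketch, with purely illustrative yields (not the paper's data) chosen so the total LER reproduces the reported 1.19:

```python
def partial_ler(y_intercrop, y_sole):
    """Partial land equivalent ratio of one component crop."""
    return y_intercrop / y_sole

def aggressivity_barley(yb_ic, yb_sole, zb, ym_ic, ym_sole, zm):
    """A_barley > 0 means barley is the dominant species.
    zb, zm are the sown proportions of barley and medic in the mixture."""
    return yb_ic / (yb_sole * zb) - ym_ic / (ym_sole * zm)

def competitive_ratio_barley(yb_ic, yb_sole, zb, ym_ic, ym_sole, zm):
    """CR_barley > 1 likewise indicates barley dominance."""
    return (partial_ler(yb_ic, yb_sole) / partial_ler(ym_ic, ym_sole)) * (zm / zb)

# Illustrative yields (t/ha) for a 1B:1M mixture (sown proportion 0.5 each):
yb_sole, ym_sole = 8.0, 4.0   # sole-crop barley and annual medic
yb_ic, ym_ic = 5.6, 1.96      # the same crops grown intercropped

ler = partial_ler(yb_ic, yb_sole) + partial_ler(ym_ic, ym_sole)       # 0.70 + 0.49
a = aggressivity_barley(yb_ic, yb_sole, 0.5, ym_ic, ym_sole, 0.5)     # positive
cr = competitive_ratio_barley(yb_ic, yb_sole, 0.5, ym_ic, ym_sole, 0.5)
print(round(ler, 2), a > 0, cr > 1)   # LER = 1.19; barley dominant by both indices
```

With these numbers both indices agree with the abstract's conclusion: barley out-competes the medic, yet the mixture still out-yields sole cropping per unit land.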

    Effect of Wheat Cover Crop and Split Nitrogen Application on Corn Yield and Nitrogen Use Efficiency

    Corn (Zea mays L.) grain is a major commodity crop in Illinois, and its production relies largely on timely application of nitrogen (N) fertilizers. Currently, growers in Illinois and neighboring states in the U.S. Midwest use the maximum return to N (MRTN) decision support system to predict corn N requirements. However, the current tool does not factor in the implications of integrating cover crops into the rotation, a practice that has recently gained attention among growers due to its associated ecosystem services. A two-year field trial was conducted at the Agronomy Research Center in Carbondale, IL, in 2018 and 2019 to evaluate whether split N application affects the nitrogen use efficiency (NUE) of corn with and without a wheat (Triticum aestivum L.) cover crop. A randomized complete block design with a split-plot arrangement and four replicates was used. Main plots were cover crop treatments (no cover crop (control) vs. a wheat cover crop) and subplots were N application timings for the corn: (1) 168 kg N ha−1 at planting; (2) 56 kg N ha−1 at planting + 112 kg N ha−1 at sidedress; (3) 112 kg N ha−1 at planting + 56 kg N ha−1 at sidedress; and (4) 168 kg N ha−1 at sidedress, along with a zero-N control as a check plot. Corn yield was higher in 2018 than in 2019, reflecting more timely precipitation in that year. In 2018, grain yield declined by 12.6% following the wheat cover crop compared with the no-cover-crop control, indicating a yield penalty when corn was preceded by a wheat cover crop. In 2018, a year with timely and sufficient rainfall, there were no yield differences among N treatments and N balances were near zero. In 2019, delaying the N application improved NUE and corn grain yield, because excessive rainfall early in the season caused N losses, as confirmed by lower N balances in the sidedressed treatments.
Overall, our findings suggest that including an N credit for cereal cover crops in the MRTN prediction model could improve N management in the Midwestern United States.
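The N balance and NUE notions used above can be illustrated with simple arithmetic. A minimal sketch, assuming the N balance is fertilizer N applied minus grain N removal and NUE is taken as partial factor productivity (kg grain per kg N applied) — the paper's exact definitions may differ, and the yield and grain-N figures below are hypothetical:

```python
def n_balance(n_applied_kg_ha, grain_yield_kg_ha, grain_n_frac):
    """Partial N balance: fertilizer N in minus grain N removed (kg N/ha).
    Values near zero suggest applied N roughly matched crop removal."""
    return n_applied_kg_ha - grain_yield_kg_ha * grain_n_frac

def pfp_n(grain_yield_kg_ha, n_applied_kg_ha):
    """Partial factor productivity of N: kg grain per kg N applied."""
    return grain_yield_kg_ha / n_applied_kg_ha

# Hypothetical corn crop at the study's total rate of 168 kg N/ha,
# yielding 12,000 kg/ha of grain at 1.2% N:
bal = n_balance(168, 12_000, 0.012)   # 168 - 144 = 24 kg N/ha surplus
nue = pfp_n(12_000, 168)              # ~71.4 kg grain per kg N applied
```

A lower (more negative) balance in a sidedressed treatment, as reported for 2019, would indicate the crop removed more N than was lost before uptake.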

    Orchard recycling improves climate change adaptation and mitigation potential of almond production systems.

    There is an urgent need to develop climate-smart agroecosystems capable of mitigating climate change and adapting to its effects. In California, high commodity prices and the increased frequency of drought have encouraged orchard turnover, providing an opportunity to recycle tree biomass in situ prior to replanting an orchard. Whole orchard recycling (WOR) has potential as a carbon (C) negative cultural practice to build soil C storage, soil health, and orchard productivity. We tested the potential of this practice for long-term C sequestration and hypothesized that the associated co-benefits to soil health would enhance the sustainability and resiliency of almond orchards under water-deficit conditions. We measured soil health metrics and productivity of an almond orchard 9 years after either grinding and incorporating the old orchard's woody biomass or burning it. We also conducted a deficit irrigation trial with control and deficit irrigation (−20%) treatments to quantify shifts in tree water status and resilience. Biomass recycling led to higher yields and substantial improvement in soil functioning, including nutrient content, aggregation, porosity, and water retention. This practice also sequestered significantly more C in the topsoil (+5 t ha−1) than burning. We measured a 20% increase in irrigation water use efficiency and improved soil and tree water status under stress, suggesting that in situ biomass recycling can be considered a climate-smart practice in California irrigated almond systems.
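Irrigation water use efficiency (IWUE) is simply yield per unit of irrigation water applied. A minimal sketch with hypothetical numbers (not the paper's data) showing how a nearly maintained yield under −20% irrigation translates into the kind of 20% IWUE gain reported:

```python
def iwue(yield_kg_ha, irrigation_mm):
    """Irrigation water use efficiency: kg yield per mm of water applied."""
    return yield_kg_ha / irrigation_mm

control = iwue(2500, 1000)     # hypothetical burned-orchard control
recycled = iwue(2400, 800)     # hypothetical WOR orchard under -20% irrigation
gain = recycled / control - 1  # ~0.20, i.e. a 20% IWUE increase
```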