Cover Crop and Nitrogen Fertilizer Management for Potato Production in the Northeast
Potato (Solanum tuberosum L.) ranks fourth among the world's agricultural products in production volume and human consumption, and worldwide demand for potatoes was projected to exceed that of rice, wheat, or corn by 2020. Potato has been a major part of the North American diet since the early 17th century and remains a dominant arable crop in the Northeastern United States. There are over 2700 potato fields in the Northeast United States, and potato growers often over-apply nitrogen (N) fertilizer to insure against yield loss. The high mobility of nitrate from N fertilization in the soil profile makes it susceptible to leaching into lower soil layers, leading to groundwater nitrate contamination, other environmental concerns, and increased production costs.
Rye (Secale cereale L.) is the most widely grown cover crop in the Northeast U.S., and its N-scavenging capacity and adaptability to the region's soils and climates are well documented. However, it may not be an adequate source of N for cash crops planted early in the spring, because it has little opportunity to grow in the spring and accumulate a substantial amount of biomass. We therefore implemented field experiments to evaluate whether forage radish (Raphanus sativus L.) or winter peas (Pisum sativum L.) could be a more appropriate cover crop than rye in rotation with Dark Red Norland and Superior potatoes in Massachusetts. We also applied four levels of N fertilizer (0, 75, 150, and 225 kg ha-1) in combination with the cover crops to tailor N rates as an external N source supplementing the N released from cover crop residues.
Our study centered on three major topics: (i) cover crop decomposition rate and the trend of nutrient release in a conventional-till or no-till system, to evaluate whether release is synchronized with potato nutrient demands; (ii) tuber yield and nutrient density of potatoes as influenced by cover crops and N fertilization; and (iii) nitrogen use efficiency (NUE) indices, tuber quality, and pest control in potatoes.
Our results indicated that conventional tillage accelerated the decomposition process and increased the rate of nutrient loss in the soil compared with a no-till system. Among the cover crops used in this study, forage radish and peas accumulated more N than rye, and with their narrower C:N ratios they released their N content more rapidly. Potato tuber yield in both varieties was improved, with peas and forage radish outperforming rye and no-cover-crop plots in this regard. Forage radish was also advantageous over winter peas or rye in supplying nutrients other than N, as suggested by more nutrient-dense potatoes.
Cover crops, especially peas and forage radish, were efficient in reducing N fertilization requirements in both potato varieties, as indicated by higher NUE parameters, although potatoes planted after cover crops utilized the supplied fertilizer N less efficiently than potatoes following fallow. Application of high rates of N fertilizer decreased NUE parameters through enhanced vegetative growth and, probably, environmental losses. Forage radish and peas released N from their residues in closer synchrony with potato N demand at critical growth stages. Cover crops did not produce potato tubers of higher quality than no-cover-crop plots. Colorado potato beetle infestation was lower early in the spring in potato plants following rye than following the other cover crops; later in the season, however, all treatments showed the same infestation. Weed infestation tended to be lower in cover crop plots than in no-cover-crop plots, and rye and forage radish were more effective than winter peas at suppressing weeds.
Overall, it is proposed that planting forage radish as early as possible in late August or early September can produce higher potato yield and nutrient density than winter peas or winter rye. Also, to get the most out of the released nutrients, especially nitrogen, it is important to prepare the land and plant potatoes as early as possible in the spring.
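The abstract does not define which NUE parameters were computed. Two indices commonly used in studies of this kind, agronomic efficiency and apparent N recovery, can illustrate the calculation; the function names and plot values below are hypothetical, not taken from the study.

```python
# Hedged sketch of two common NUE indices; the exact indices used in the
# study are not specified, and all input values below are hypothetical.

def agronomic_efficiency(yield_fertilized, yield_unfertilized, n_applied):
    """Extra tuber yield gained per kg of fertilizer N applied (kg kg-1)."""
    return (yield_fertilized - yield_unfertilized) / n_applied

def apparent_n_recovery(uptake_fertilized, uptake_unfertilized, n_applied):
    """Fraction of the applied fertilizer N recovered in the crop."""
    return (uptake_fertilized - uptake_unfertilized) / n_applied

# hypothetical plot data at the 150 kg N ha-1 rate (all values in kg ha-1)
ae = agronomic_efficiency(32000, 20000, 150)   # 80 kg tubers per kg N
anr = apparent_n_recovery(120, 60, 150)        # 0.40
```

Higher values of either index indicate that the crop converted more of the applied fertilizer into yield or uptake, which is how cover crop treatments reduce the external N requirement.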
Integrated Management Practices for Establishing Upland Switchgrass Varieties
Establishment of switchgrass (Panicum virgatum L.) is challenging, and failure in establishment may expose growers to considerable economic risk. The objectives of this research were to (i) evaluate whether management practices are variety-specific for the establishment of switchgrass and (ii) assess the effectiveness of cover crops as preceding crops on 'Shawnee' switchgrass establishment. Two studies were therefore conducted at the University of Massachusetts Agricultural Experiment Station in Deerfield, MA, USA, in the 2011–2012 and 2012–2013 growing seasons. In Experiment 1, cover crop treatments (fallow, oat (Avena sativa L.), and rye (Secale cereale L.)) were the main plots, the two seeding methods (no-till drill and a cultipacker seeder (Brillion)) were the sub-plots, and the two varieties ('Cave-in-Rock' (CIR) and Shawnee) were the sub-sub-plots. The second study was conducted using Shawnee switchgrass and involved the three cover crop treatments used in Experiment 1, using a cultipacker seeder with seed firming prior to planting but not afterwards (consistent in both experiments). The results indicated that the combination of oat and no-till resulted in higher tiller density (493%), lower weed biomass (77%), increased switchgrass biomass (SGB) (283%), and a higher SGB to weed biomass (WB) ratio. Compared with Shawnee, CIR planted into a winter-killed oat residue had higher tiller density (93%), lower weed biomass (18%), higher switchgrass yield (128%), and thus a greater SGB:WB ratio (507%). Trends of switchgrass response to management practices, however, were similar between the two varieties, indicating that seed quality, rather than variety-specific management, likely explained the differences between varieties. In Experiment 2, Shawnee tiller density was suppressed by rye as the preceding crop, possibly due to late termination of rye.
Shawnee switchgrass yields were below 1000 kg ha−1 under all management practices; thus, harvesting should happen in the year following establishment. Future research should focus on comparing no-till drilling with a cultipacker seeder rolled not only before but also after seeding to increase seed–soil contact.
Intercropping annual medic (Medicago scutellata L.) with barley (Hordeum vulgare L.) may improve total forage and crude protein yield in a semi-arid environment
In arid and semi-arid conditions, production of high-yielding quality forage is still a challenge. Intercropping of cereals with annual forage legumes may improve forage yield and increase on-farm protein production. A two-year field experiment was conducted during the 2009 and 2010 growing seasons at the experimental farm of the University of Tehran, Iran, to determine whether intercropping of barley (Hordeum vulgare L.) and annual medic (Medicago scutellata L.) could produce a sufficient amount of forage with higher protein content. A randomized complete block design with four replicates and eight cropping patterns [1B:1M (one row of barley : one row of annual medic), 2B:2M, 4B:4M, 6B:6M, 6B:2M, 4B:2M, 2B:4M, and 2B:6M], along with pure stands of barley and annual medic, was implemented. Land equivalent ratio (LER) was highest (1.19) when barley was intercropped with annual medic in the 1B:1M arrangement, indicating that 19% more land area would be required by a sole cropping system to produce the yield achieved by the intercropping system. Calculated partial LER, aggressivity (A), and competitive ratio (CR) indicated that barley was the dominant species in most of the barley–annual medic cropping patterns. Based on results from LER, the system productivity index (SPI), and the monetary advantage index (MAI), it was concluded that the 1B:1M cropping pattern was superior to either barley or annual medic monocropping. The results of this study revealed that the total protein yield of barley and annual medic forage in the selected intercropping patterns, specifically 1B:1M, could be enhanced while the total harvested dry matter remained unchanged.
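The intercropping indices named above follow standard definitions (partial LER, total LER, aggressivity, competitive ratio). A minimal sketch, with hypothetical yields chosen only to reproduce an LER of 1.19 and assuming equal 0.5 sown proportions in a 1B:1M pattern:

```python
# Hedged sketch of standard intercropping indices; all yields (kg ha-1)
# and sown proportions are hypothetical, not taken from the study.

def partial_ler(y_intercrop, y_sole):
    """Partial land equivalent ratio of one component crop."""
    return y_intercrop / y_sole

def aggressivity(yb_i, yb_s, ym_i, ym_s, zb=0.5, zm=0.5):
    """Aggressivity of barley over medic; positive means barley dominates."""
    return yb_i / (yb_s * zb) - ym_i / (ym_s * zm)

def competitive_ratio(l_barley, l_medic, zb=0.5, zm=0.5):
    """Competitive ratio of barley relative to medic."""
    return (l_barley / l_medic) * (zm / zb)

# hypothetical 1B:1M yields: barley 3000 (sole: 4000), medic 1100 (sole: 2500)
lb = partial_ler(3000, 4000)                     # 0.75
lm = partial_ler(1100, 2500)                     # 0.44
ler = lb + lm                                    # 1.19
a_barley = aggressivity(3000, 4000, 1100, 2500)  # positive: barley dominant
```

An LER above 1 means intercropping used land more efficiently than the monocultures; here 1.19 corresponds to the 19% extra land a sole cropping system would need.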
Effect of Wheat Cover Crop and Split Nitrogen Application on Corn Yield and Nitrogen Use Efficiency
Corn (Zea mays L.) grain is a major commodity crop in Illinois, and its production largely relies on timely application of nitrogen (N) fertilizers. Currently, growers in Illinois and neighboring states in the U.S. Midwest use the maximum return to N (MRTN) decision support system to predict corn N requirements. However, the current tool does not factor in the implications of integrating cover crops into the rotation, a practice that has recently gained attention among growers due to the ecosystem services associated with cover cropping. A two-year field trial was conducted at the Agronomy Research Center in Carbondale, IL, in 2018 and 2019 to evaluate whether split N application affects nitrogen use efficiency (NUE) of corn with and without a wheat (Triticum aestivum L.) cover crop. A randomized complete block design with split-plot arrangements and four replicates was used. Main plots were cover crop treatments (no cover crop (control) compared to a wheat cover crop) and subplots were N application timings for the corn: (1) 168 kg N ha−1 at planting; (2) 56 kg N ha−1 at planting + 112 kg N ha−1 at sidedress; (3) 112 kg N ha−1 at planting + 56 kg N ha−1 at sidedress; and (4) 168 kg N ha−1 at sidedress, along with a zero-N control as a check plot. Corn yield was higher in 2018 than in 2019, reflecting more timely precipitation in that year. In 2018, grain yield declined by 12.6% following the wheat cover crop compared to the no-cover-crop control, indicating a yield penalty when corn was preceded by a wheat cover crop. In 2018, a year with timely and sufficient rainfall, there were no yield differences among N treatments and N balances were near zero. In 2019, delaying the N application improved NUE and corn grain yield, because excessive rainfall early in the season drove N losses, as confirmed by lower N balances in the sidedressed treatments.
Overall, our findings suggest that including an N credit for cereal cover crops in the MRTN prediction model could improve N management in the Midwestern United States.
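The abstract does not state how its N balances were computed. A partial N balance, fertilizer N applied minus N removed in harvested grain, is a common formulation and is sketched here with hypothetical values.

```python
# Hedged sketch: partial N balance = fertilizer N applied minus N removed
# in harvested grain. The study's exact formulation is not given in the
# abstract; all values below are hypothetical.

def partial_n_balance(n_applied, grain_yield, grain_n_fraction):
    """Partial N balance (kg N ha-1); near zero means supply matched removal."""
    return n_applied - grain_yield * grain_n_fraction

# hypothetical: 168 kg N ha-1 applied, 12000 kg ha-1 grain at 1.2% N
balance = partial_n_balance(168, 12000, 0.012)  # 24 kg N ha-1 surplus
```

A strongly positive balance suggests N left in the system and at risk of loss; a negative balance in a wet year is consistent with losses exceeding what the grain could account for.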
Orchard recycling improves climate change adaptation and mitigation potential of almond production systems
There is an urgent need to develop climate-smart agroecosystems capable of mitigating climate change and adapting to its effects. In California, high commodity prices and the increased frequency of drought have encouraged orchard turnover, providing an opportunity to recycle tree biomass in situ prior to replanting an orchard. Whole orchard recycling (WOR) has potential as a carbon (C) negative cultural practice to build soil C storage, soil health, and orchard productivity. We tested the potential of this practice for long-term C sequestration and hypothesized that the associated co-benefits to soil health would enhance the sustainability and resiliency of almond orchards under water-deficit conditions. We measured soil health metrics and productivity of an almond orchard 9 years after grinding and incorporation of woody biomass vs. burning of the old orchard biomass. We also conducted a deficit irrigation trial with control and deficit irrigation (−20%) treatments to quantify shifts in tree water status and resilience. Biomass recycling led to higher yields and substantial improvement in soil functioning, including nutrient content, aggregation, porosity, and water retention. This practice also sequestered significantly more C in the topsoil (+5 t ha-1) compared to burning. We measured a 20% increase in irrigation water use efficiency and improved soil and tree water status under stress, suggesting that in situ biomass recycling can be considered a climate-smart practice in California irrigated almond systems.
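Irrigation water use efficiency (iWUE) is typically expressed as yield per unit of irrigation water applied. A minimal sketch, with hypothetical numbers chosen only to illustrate a 20% gain like the one reported:

```python
# Hedged sketch: irrigation water use efficiency as yield per unit of
# irrigation water applied. All values are hypothetical, not from the study.

def irrigation_wue(yield_kg_ha, irrigation_mm):
    """Yield per mm of irrigation water applied (kg ha-1 mm-1)."""
    return yield_kg_ha / irrigation_mm

# hypothetical orchards receiving the same 1000 mm of irrigation
burned = irrigation_wue(2500, 1000)    # 2.5 kg ha-1 mm-1
recycled = irrigation_wue(3000, 1000)  # 3.0 kg ha-1 mm-1, i.e. 20% higher
```

At equal water applied, the gain comes entirely from the yield term, which is consistent with the improved water retention and tree water status reported for the recycled treatment.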