24 research outputs found
Improving phosphorus loss assessment with the apex model and phosphorus index
Doctor of Philosophy, Agronomy. Nathan O. Nelson. Agricultural fields contribute phosphorus (P) to water bodies, which can degrade water quality. The P index (PI) is a tool to assess the risk of P loss from agricultural fields. However, due to limited measured data, P indices have not been rigorously evaluated. The Agricultural Policy/Environmental eXtender (APEX) model could be used to generate P-loss datasets for P index evaluation and revision. The objectives of the study were to i) determine the effects of APEX calibration practices on P-loss estimates from diverse management systems, ii) determine fertilizer and poultry litter management effects on P loss, iii) evaluate and update the Kansas PI using P loss simulated by APEX, and iv) determine appropriate adsorption isotherms for the advection-dispersion equation using a column leaching experiment. Runoff data from field studies in Franklin and Crawford counties were used to calibrate and validate APEX. Poultry litter and inorganic fertilizer application timing, rate, and method, and soil test P concentration effects on P loss were analyzed using location-specific models. A column leaching laboratory study was also conducted to test the adsorption isotherms. Location-specific models satisfactorily simulated runoff, total P (TP), and dissolved P (DP) loss, meeting minimum model performance criteria for two-thirds of the tests, whereas management-specific models met the criteria in only one-third of the tests. Applying manure or fertilizer during late fall resulted in lower TP loss compared with spring applications before planting. The Kansas PI rating and the APEX-simulated P loss were correlated with r² of 0.40 (p < 0.001). Adjusting the weighting factors for P rate, soil test P, and erosion improved the correlation (r² = 0.46; p < 0.001). Using a component PI structure and determining the weighting factors by multiple linear regression substantially improved the correlation between the PI and TP loss (r² = 0.69; p < 0.001).
In the P-leaching experiment, neither the linear nor the nonlinear adsorption isotherm fit the experimental data. A multi-reaction advection-dispersion model that better describes the P processes and complexities in soils should be considered in future work. These procedures can provide a roadmap for others interested in P transport in soils and in using computer models to evaluate and modify their PIs.
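The component-PI revision described above (determining weighting factors by regressing APEX-simulated TP loss on the PI components) can be sketched as an ordinary least-squares fit. The component values, TP losses, and variable names below are hypothetical illustrations, not data from the dissertation.

```python
import numpy as np

# Hypothetical PI component values for a handful of simulated fields:
# columns = [P application rate factor, soil test P factor, erosion factor]
components = np.array([
    [1.2, 0.8, 2.0],
    [0.5, 1.5, 0.7],
    [2.0, 0.4, 1.1],
    [1.0, 1.0, 1.0],
    [0.3, 2.2, 0.5],
])
tp_loss = np.array([2.1, 1.3, 2.4, 1.6, 1.2])  # APEX-simulated TP loss (kg/ha)

# Add an intercept column and solve the least-squares problem; the fitted
# coefficients play the role of the PI weighting factors.
X = np.column_stack([np.ones(len(tp_loss)), components])
weights, *_ = np.linalg.lstsq(X, tp_loss, rcond=None)

# r-squared between the weighted PI and simulated TP loss
predicted = X @ weights
ss_res = np.sum((tp_loss - predicted) ** 2)
ss_tot = np.sum((tp_loss - tp_loss.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print("weights:", weights.round(3), "r2:", round(r2, 3))
```

In the study itself this fit was done against many APEX-simulated management scenarios; the sketch only shows the mechanics of obtaining weights by multiple linear regression.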
Assessing wind damage and potential yield loss in mid-season corn using a geospatial approach
Yield loss due to natural disasters, such as storms with high-speed winds and rainfall, can significantly damage standing corn (Zea mays L.) plants and reduce yield. Using a geospatial approach, this study aimed to estimate green snap wind damage to corn and assess potential yield and economic loss in the Mississippi Delta. Midseason corn (V12–V14) snapping occurred on 8 June 2022. We recorded green snap damage in 13 fields (1.0 to 2.0 ha) with low (224 kg ha−1) and high (336 kg ha−1) N rates and two row orientations (north–south and east–west) after the damage. The results indicated no effect of nitrogen rate or row orientation on green snap damage. The average yield loss was ~29.25 kg ha−1 for every 1% increase in green snap wind damage, a significant economic loss to producers. These research methods can help scientists estimate potential green snap yield loss due to severe winds in larger fields. The results can also help estimate potential yield and economic loss to assist producers and other stakeholders in decision-making as they prepare for changing weather patterns and unprecedented severe windstorms in the future.
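As a rough illustration of how the reported per-percent figure translates to economic loss, a minimal sketch follows; the field size, damage level, and grain price are hypothetical, not values from the study.

```python
# Back-of-envelope use of the reported figure: ~29.25 kg/ha of yield lost per
# 1% of green snap damage. All other inputs below are hypothetical.
LOSS_PER_PCT = 29.25  # kg/ha per 1% green snap damage (from the study)

def economic_loss(damage_pct, area_ha, price_per_kg):
    """Estimated dollar loss for a field with the given green snap damage."""
    yield_loss_kg = LOSS_PER_PCT * damage_pct * area_ha
    return yield_loss_kg * price_per_kg

# e.g. 15% damage on a 2.0 ha field at a hypothetical $0.20/kg grain price
print(round(economic_loss(15, 2.0, 0.20), 2))  # → 175.5
```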
Calibration of the APEX Model to Simulate Management Practice Effects on Runoff, Sediment, and Phosphorus Loss
Process-based computer models have been proposed as a tool to generate data for phosphorus (P) Index assessment and development. Although models are commonly used to simulate P loss from agriculture under managements that differ from the calibration data, this use of models has not been fully tested. The objective of this study was to determine if the Agricultural Policy Environmental eXtender (APEX) model can accurately simulate runoff, sediment, total P, and dissolved P loss from 0.4 to 1.5 ha agricultural fields under managements that differ from the calibration data. The APEX model was calibrated with field-scale data from eight different managements at two locations (management-specific models). The calibrated models were then validated, either with the same management used for calibration or with different managements. Location models were also developed by calibrating APEX with data from all managements. The management-specific models performed satisfactorily when used to simulate runoff, total P, and dissolved P within their respective systems, with r² > 0.50, Nash–Sutcliffe efficiency > 0.30, and percent bias within ±35% for runoff and ±70% for total and dissolved P. When applied outside the calibration management, the management-specific models met the minimum performance criteria in only one-third of the tests. The location models performed better when applied across all managements than the management-specific models did. Our results suggest that models only be applied within the managements used for calibration, and that data from multiple management systems be included in calibration when using models to assess management effects on P loss or to evaluate P Indices.
Supplemental Material for: Multi-site evaluation of APEX for water quality: II. Regional parameterization
Model performance was assessed using Nash–Sutcliffe model efficiency (NSE), coefficient of determination (r²), and percent bias (PBIAS) as defined by Moriasi et al. (2007, 2015). Threshold values indicating acceptable model performance based on these statistics depend on the spatial and temporal scales of the data, the water quality constituents of interest, and the modeling objectives (Moriasi et al., 2015). Although some standard values have been suggested (Moriasi et al., 2007, 2015), considerable variability exists in the published literature. For instance, Ramanarayan et al. (1997) considered r² > 0.5 and NSE > 0.40 satisfactory for simulation of monthly surface water quality with the APEX model. Chung et al. (2002) defined r² > 0.5 and NSE > 0.3 as satisfactory for monthly tile flow and NO3-N loss simulated with the Erosion Productivity Impact Calculator (EPIC) model. Wang et al. (2008) indicated r² > 0.5 and NSE > 0.4 as acceptable for monthly runoff and nutrient concentrations using the APEX model. Moriasi et al. (2007) suggested NSE > 0.5 with PBIAS within ±25% for streamflow, ±55% for sediment, and ±70% for nitrogen and phosphorus at monthly time steps. They also indicated that NSE values can be relaxed for shorter time steps (e.g., daily events). Yin et al. (2009) reported NSE for event-based runoff and sediment between 0.41 and 0.84 and r² between 0.55 and 0.85. Mudgal et al. (2010) regarded r² > 0.5 and NSE > 0.45 as the threshold for satisfactory calibration and validation with event data.
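The three statistics discussed above can be sketched as follows. The function names and the example monthly runoff values are illustrative only, not data from any of the cited studies.

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect; <0 is worse than the observed mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pbias(obs, sim):
    """Percent bias: positive values indicate the model underestimates on average."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

def r_squared(obs, sim):
    """Coefficient of determination (squared Pearson correlation)."""
    return np.corrcoef(obs, sim)[0, 1] ** 2

# Hypothetical observed vs. simulated monthly runoff (mm)
observed  = [12.0, 30.0, 8.0, 45.0, 20.0, 5.0]
simulated = [10.0, 33.0, 9.0, 40.0, 24.0, 6.0]
print(nse(observed, simulated), pbias(observed, simulated), r_squared(observed, simulated))
```

Against the thresholds cited above (e.g., NSE > 0.5, PBIAS within ±25% for streamflow), this hypothetical simulation would be rated satisfactory.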
Impact of primary kidney disease on the effects of empagliflozin in patients with chronic kidney disease: secondary analyses of the EMPA-KIDNEY trial
Background: The EMPA-KIDNEY trial showed that empagliflozin reduced the risk of the primary composite outcome of kidney disease progression or cardiovascular death in patients with chronic kidney disease, mainly through slowing progression. We aimed to assess how effects of empagliflozin might differ by primary kidney disease across its broad population. Methods: EMPA-KIDNEY, a randomised, controlled, phase 3 trial, was conducted at 241 centres in eight countries (Canada, China, Germany, Italy, Japan, Malaysia, the UK, and the USA). Patients were eligible if their estimated glomerular filtration rate (eGFR) was 20 to less than 45 mL/min per 1·73 m2, or 45 to less than 90 mL/min per 1·73 m2 with a urinary albumin-to-creatinine ratio (uACR) of 200 mg/g or higher at screening. They were randomly assigned (1:1) to 10 mg oral empagliflozin once daily or matching placebo. Effects on kidney disease progression (defined as a sustained ≥40% eGFR decline from randomisation, end-stage kidney disease, a sustained eGFR below 10 mL/min per 1·73 m2, or death from kidney failure) were assessed using prespecified Cox models, and eGFR slope analyses used shared parameter models. Subgroup comparisons were performed by including relevant interaction terms in models. EMPA-KIDNEY is registered with ClinicalTrials.gov, NCT03594110. Findings: Between May 15, 2019, and April 16, 2021, 6609 participants were randomly assigned and followed up for a median of 2·0 years (IQR 1·5–2·4). Prespecified subgroupings by primary kidney disease included 2057 (31·1%) participants with diabetic kidney disease, 1669 (25·3%) with glomerular disease, 1445 (21·9%) with hypertensive or renovascular disease, and 1438 (21·8%) with other or unknown causes.
Kidney disease progression occurred in 384 (11·6%) of 3304 patients in the empagliflozin group and 504 (15·2%) of 3305 patients in the placebo group (hazard ratio 0·71 [95% CI 0·62–0·81]), with no evidence that the relative effect size varied significantly by primary kidney disease (p for heterogeneity=0·62). The between-group difference in chronic eGFR slopes (ie, from 2 months to final follow-up) was 1·37 mL/min per 1·73 m2 per year (95% CI 1·16–1·59), representing a 50% (42–58) reduction in the rate of chronic eGFR decline. This relative effect of empagliflozin on chronic eGFR slope was similar in analyses by different primary kidney diseases, including in explorations by type of glomerular disease and diabetes (p values for heterogeneity all >0·1). Interpretation: In a broad range of patients with chronic kidney disease at risk of progression, including a wide range of non-diabetic causes of chronic kidney disease, empagliflozin reduced the risk of kidney disease progression. Relative effect sizes were broadly similar irrespective of the cause of primary kidney disease, suggesting that SGLT2 inhibitors should be part of a standard of care to minimise risk of kidney failure in chronic kidney disease. Funding: Boehringer Ingelheim, Eli Lilly, and UK Medical Research Council.
Correction to: Cluster identification, selection, and description in Cluster randomized crossover trials: the PREP-IT trials
An amendment to this paper has been published and can be accessed via the original article
Global burden of 288 causes of death and life expectancy decomposition in 204 countries and territories and 811 subnational locations, 1990–2021: a systematic analysis for the Global Burden of Disease Study 2021
BACKGROUND Regular, detailed reporting on population health by underlying cause of death is fundamental for public health decision making. Cause-specific estimates of mortality and the subsequent effects on life expectancy worldwide are valuable metrics to gauge progress in reducing mortality rates. These estimates are particularly important following large-scale mortality spikes, such as the COVID-19 pandemic. When systematically analysed, mortality rates and life expectancy allow comparisons of the consequences of causes of death globally and over time, providing a nuanced understanding of the effect of these causes on global populations. METHODS The Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) 2021 cause-of-death analysis estimated mortality and years of life lost (YLLs) from 288 causes of death by age-sex-location-year in 204 countries and territories and 811 subnational locations for each year from 1990 until 2021. The analysis used 56 604 data sources, including data from vital registration and verbal autopsy as well as surveys, censuses, surveillance systems, and cancer registries, among others. As with previous GBD rounds, cause-specific death rates for most causes were estimated using the Cause of Death Ensemble model (a modelling tool developed for GBD to assess the out-of-sample predictive validity of different statistical models and covariate permutations and combine those results to produce cause-specific mortality estimates), with alternative strategies adapted to model causes with insufficient data, substantial changes in reporting over the study period, or unusual epidemiology. YLLs were computed as the product of the number of deaths for each cause-age-sex-location-year and the standard life expectancy at each age. As part of the modelling process, uncertainty intervals (UIs) were generated using the 2·5th and 97·5th percentiles from a 1000-draw distribution for each metric.
We decomposed life expectancy by cause of death, location, and year to show cause-specific effects on life expectancy from 1990 to 2021. We also used the coefficient of variation and the fraction of population affected by 90% of deaths to highlight concentrations of mortality. Findings are reported in counts and age-standardised rates. Methodological improvements for cause-of-death estimates in GBD 2021 include the expansion of the under-5 age group to include four new age groups, enhanced methods to account for stochastic variation of sparse data, and the inclusion of COVID-19 and other pandemic-related mortality, which includes excess mortality associated with the pandemic (excluding COVID-19), lower respiratory infections, measles, malaria, and pertussis. For this analysis, 199 new country-years of vital registration cause-of-death data, 5 country-years of surveillance data, 21 country-years of verbal autopsy data, and 94 country-years of other data types were added to those used in previous GBD rounds. FINDINGS The leading causes of age-standardised deaths globally were the same in 2019 as they were in 1990; in descending order, these were ischaemic heart disease, stroke, chronic obstructive pulmonary disease, and lower respiratory infections. In 2021, however, COVID-19 replaced stroke as the second-leading age-standardised cause of death, with 94·0 deaths (95% UI 89·2-100·0) per 100 000 population. The COVID-19 pandemic shifted the rankings of the leading five causes, lowering stroke to the third-leading and chronic obstructive pulmonary disease to the fourth-leading position. In 2021, the highest age-standardised death rates from COVID-19 occurred in sub-Saharan Africa (271·0 deaths [250·1-290·7] per 100 000 population) and Latin America and the Caribbean (195·4 deaths [182·1-211·4] per 100 000 population).
The lowest age-standardised death rates from COVID-19 were in the high-income super-region (48·1 deaths [47·4-48·8] per 100 000 population) and southeast Asia, east Asia, and Oceania (23·2 deaths [16·3-37·2] per 100 000 population). Globally, life expectancy steadily improved between 1990 and 2019 for 18 of the 22 investigated causes. Decomposition of global and regional life expectancy showed the positive effect that reductions in deaths from enteric infections, lower respiratory infections, stroke, and neonatal deaths, among others, have had on survival over the study period. However, a net reduction of 1·6 years occurred in global life expectancy between 2019 and 2021, primarily due to increased death rates from COVID-19 and other pandemic-related mortality. Life expectancy was highly variable between super-regions over the study period, with southeast Asia, east Asia, and Oceania gaining 8·3 years (6·7-9·9) overall, while having the smallest reduction in life expectancy due to COVID-19 (0·4 years). The largest reduction in life expectancy due to COVID-19 occurred in Latin America and the Caribbean (3·6 years). Additionally, 53 of the 288 causes of death were highly concentrated in locations with less than 50% of the global population as of 2021, and these causes of death have become progressively more concentrated since 1990, when only 44 causes showed this pattern. The concentration phenomenon is discussed heuristically with respect to enteric and lower respiratory infections, malaria, HIV/AIDS, neonatal disorders, tuberculosis, and measles. INTERPRETATION Long-standing gains in life expectancy and reductions in many of the leading causes of death have been disrupted by the COVID-19 pandemic, the adverse effects of which were spread unevenly among populations. Despite the pandemic, there has been continued progress in combatting several notable causes of death, leading to improved global life expectancy over the study period.
Each of the seven GBD super-regions showed an overall improvement from 1990 to 2021, obscuring the negative effect in the years of the pandemic. Additionally, our findings regarding regional variation in the causes of death driving increases in life expectancy hold clear policy utility. Analyses of shifting mortality trends reveal that several causes, once widespread globally, are now increasingly concentrated geographically. These changes in mortality concentration, alongside further investigation of changing risks, interventions, and relevant policy, present an important opportunity to deepen our understanding of mortality-reduction strategies. Examining patterns in mortality concentration might reveal areas where successful public health interventions have been implemented. Translating these successes to locations where certain causes of death remain entrenched can inform policies that work to improve life expectancy for people everywhere. FUNDING Bill & Melinda Gates Foundation
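The YLL computation described in the Methods above (deaths multiplied by the standard life expectancy at the age of death, summed over ages) can be sketched as follows. The age groups, death counts, and reference life expectancies are hypothetical, not GBD estimates.

```python
# YLLs = sum over ages of deaths(age) * standard life expectancy at that age.
# All numbers below are hypothetical, for illustration only.
deaths = {40: 120, 60: 340, 80: 510}                   # deaths by age at death
std_life_expectancy = {40: 45.6, 60: 27.8, 80: 11.6}   # reference values (years)

ylls = sum(n * std_life_expectancy[age] for age, n in deaths.items())
print(round(ylls, 1))  # → 20840.0
```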
Using simulated rainfall to evaluate cover crops and winter manure application to limit nutrient loss in runoff
Abstract Cover crops can be effective in minimizing nutrient losses from agricultural fields. The objective of this study was to determine the impact of a cover crop (rye, Secale cereale L.) and winter manure application on nutrient loss in simulated rainfall runoff. A split-block design study with manure (as the vertical block) and cover crops (as the horizontal block) was established in 2009. Two rain simulations (the first defined as "dry" and the second "wet") using sixteen 4 m² steel frames were conducted in May 2010. The runoff volume collected from each plot was analyzed for nitrate–nitrogen (NO3–N), total suspended solids, total Kjeldahl nitrogen, total phosphorus, and total dissolved phosphorus. In the dry run, the concentration and load of NO3–N were significantly lower (p = 0.05) in runoff with the cover crop than in the no-cover crop treatment. Overall, cover crops reduced nutrient concentrations by 6%–48% in the dry run and 8%–40% in the wet run compared with no cover crops. The concentration and load of NO3–N were significantly higher under manure treatments in both the "dry" and "wet" runs compared with no manure application. Manure application increased nutrient concentrations by 6%–58% in the dry run and 10%–69% in the wet run compared with no manure application. This study helps us understand the complexity of winter manure application with cover crops and the potential risks of nutrient loss to surface runoff during spring in the Northern Great Plains of the Dakotas.
Mixed-Species Cover Crop Biomass Estimation Using Planet Imagery
Cover crop biomass is helpful for weed and pest control, soil erosion control, nutrient recycling, and overall soil health and crop productivity improvement. These benefits may vary based on cover crop species and their biomass. There is growing interest in the agricultural sector in using remotely sensed imagery to estimate cover crop biomass. Four small-plot study sites located at the United States Department of Agriculture Agricultural Research Service, Crop Production Systems Research Unit farm, Stoneville, MS, with different cereals, legumes, and their mixtures as fall-seeded cover crops were selected for this analysis. A randomized complete block design with four replications was used at all four study sites. Cover crop biomass and canopy-level hyperspectral data were collected at the end of April, just before cover crop termination. High-resolution (3 m) PlanetScope imagery (Dove satellite constellation with PS2.SD and PSB.SD sensors) was collected throughout the cover crop season from November to April in the 2021 and 2022 study cycles. Results showed that mixed cover crops increased biomass production by up to 24% compared with single-species rye. Reflectance bands (blue, green, red, and near-infrared) and vegetation indices derived from imagery collected during March were more strongly correlated with biomass (r = 0–0.74) than imagery from November (r = 0.01–0.41) and April (r = 0.03–0.57), suggesting that the timing of imagery acquisition is important for biomass estimation. The highest correlation was observed with the near-infrared band (r = 0.74) during March. The R² for biomass prediction with the random forest model improved from 0.25 to 0.61 when cover crop species/mix information was added along with Planet imagery bands and vegetation indices as biomass predictors. More study with multiple-timepoint biomass, hyperspectral, and imagery collection is needed to choose appropriate bands and estimate the biomass of mixed cover crop species.
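The band-biomass correlation analysis described above can be sketched with a Pearson correlation between a single reflectance band (or a vegetation index) and plot-level biomass. The reflectance and biomass values below are hypothetical, not the study's data.

```python
import numpy as np

# Hypothetical plot-level data: March near-infrared (NIR) reflectance and
# measured cover crop biomass (kg/ha). Illustrative values only.
nir = np.array([0.31, 0.35, 0.42, 0.28, 0.45, 0.38, 0.33, 0.40])
biomass = np.array([2100, 2600, 3400, 1800, 3900, 2900, 2300, 3200])

# Pearson correlation between a single band and biomass, the statistic the
# study reports per band (e.g., r = 0.74 for the March NIR band).
r = np.corrcoef(nir, biomass)[0, 1]

# A common vegetation index (NDVI) computed from red and NIR reflectance,
# then correlated with biomass the same way.
red = np.array([0.12, 0.10, 0.08, 0.14, 0.07, 0.09, 0.11, 0.08])
ndvi = (nir - red) / (nir + red)
r_ndvi = np.corrcoef(ndvi, biomass)[0, 1]
print(round(r, 2), round(r_ndvi, 2))
```

In the study, such band and index values (plus species/mix information) fed a random forest model for biomass prediction; the sketch shows only the per-predictor correlation step.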