64 research outputs found

    Interactive Effect of Residue Quality and Agroecologies Modulate Soil C- and N-Cycling Enzyme Activities, Microbial Gene Abundance, and Metabolic Quotient

    Understanding the interactive effects of agroecology (characterized by rainfall, temperature, and elevation) and the biochemical composition of residues on soil microbial abundance and function is crucial for unraveling soil ecological processes. This study investigated how agroecology and residue quality influence enzymatic activities, gene abundance, and the metabolic quotient (qCO2). A field experiment was conducted using Leucaena leucocephala (LL, a high-quality residue) and Acacia decurrens (AD, a low-quality residue) in soils of highland and midland agroecologies. The residues differed in decomposability, with a (lignin + polyphenol)/N ratio of 5.0 for the high-quality residue versus 21.0 for the low-quality residue. Two experimental setups were employed: soil-litter mixtures in polyvinyl chloride (PVC) tubes and residues buried in the surface soil in litterbags. Soil samples were collected after 30, 120, and 270 days of incubation and analyzed for biochemical properties, enzyme activities, and the abundance of nitrifying and total archaea and bacteria. Soil respiration was also measured at intervals in the field, and qCO2 was calculated from microbial biomass carbon (MBC) and daily respiration (DCO2). A linear mixed model (P < 0.05) revealed that agroecology and residue quality jointly affected enzymatic activities, microbial abundance, soil properties, and qCO2, with agroecological differences exerting a greater influence than residue quality. Significant positive and negative correlations (P < 0.05, r = 0.27 to 0.67) were found among C and N pools and enzymatic activities, and positive correlations (P < 0.05) were observed between the abundance of total bacteria, total archaea, and ammonia-oxidizing bacteria and leucine-aminopeptidase activity. qCO2 was influenced more by β-xylosidase, leucine-aminopeptidases, and thermolysin-like neutral metalloproteases (TLP) than by β-D-glucosidase and β-D-cellobiohydrolase. Leucine-aminopeptidases and TLP were identified as rate-limiting for protein and peptide decomposition, while β-xylosidase controlled hemicellulose degradation. In summary, this study provides insights into the intricate relationships between agroecology, residue quality, enzymatic activities, and microbial communities, shedding light on key processes governing soil ecological functions.
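The metabolic quotient described above is simply the ratio of daily basal respiration to microbial biomass carbon. A minimal sketch of that calculation, with illustrative variable names and units (not taken from the study):

```python
def metabolic_quotient(dco2_mg_c_per_kg_per_day, mbc_mg_c_per_kg):
    """qCO2: daily CO2-C respired per unit of microbial biomass C.

    Higher values indicate lower microbial carbon-use efficiency
    (more C respired per unit of biomass maintained).
    """
    if mbc_mg_c_per_kg <= 0:
        raise ValueError("microbial biomass C must be positive")
    return dco2_mg_c_per_kg_per_day / mbc_mg_c_per_kg

# Illustrative values only (mg C per kg soil):
print(metabolic_quotient(12.0, 400.0))  # 0.03 per day
```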

    Feed balances for ruminant livestock: gridded estimates for data constrained regions

    Demand for animal-source foods and livestock feed is forecast to increase across sub-Saharan Africa. In this context, there is a need to estimate the availability of livestock feed to support decision-making at local, sub-national, and national levels. In this study, we assess feed balances for ruminant livestock in Ethiopia and Burkina Faso. Feed availability was estimated using remotely sensed products and detailed feed composition data. Feed requirements were estimated for maintenance, growth, lactation, gestation, and locomotion using a data-intensive model. Biomass available as animal feed was estimated at 8.6 tonnes of dry matter (DM) per hectare in the Ethiopian highlands and midlands, 3.2 tonnes DM per hectare in the Ethiopian lowlands, 2.9 tonnes DM per hectare in Burkina Faso's Sudanian agro-ecological zone, and 1.0 tonne DM per hectare in the Sahel. The energy requirements of lactating cows were estimated at 62.1 megajoules (MJ) per animal per day in the Ethiopian highlands and midlands, 62.7 MJ in the Ethiopian lowlands, 88.5 MJ in Burkina Faso's Sudanian agro-ecological zone, and 53.1 MJ in the Sahel. Feed scarcity hotspots are most prominent in the Ethiopian highlands and the Sahelian agro-ecological zone of Burkina Faso. Demand-side policy and investment initiatives can address hotspots by influencing herd sizes, nutritional requirements, and herd mobility. Supply-side policy and investment initiatives can secure existing feed resources, develop new sources of feed, and incentivise trade in feed resources. Improving feed balances will be of value to decision-makers aiming to optimise livestock productivity, minimise exposure to climatic shocks, and minimise greenhouse gas emission intensity.
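A feed balance of the kind described above compares the energy supplied by available biomass with herd requirements. A minimal sketch under stated assumptions: the function name, the metabolisable-energy (ME) content, and the herd figures are illustrative, not values or methods from the study.

```python
def feed_balance_mj(biomass_t_dm_per_ha, area_ha, me_mj_per_kg_dm,
                    herd_size, req_mj_per_animal_per_day, days=365):
    """Annual feed energy balance in MJ: supply minus demand.

    Positive -> surplus; negative -> deficit (a scarcity-hotspot candidate).
    """
    supply = biomass_t_dm_per_ha * 1000 * area_ha * me_mj_per_kg_dm
    demand = herd_size * req_mj_per_animal_per_day * days
    return supply - demand

# Illustrative: 1 ha at the Sahel estimate of 1.0 t DM/ha, an assumed
# ME of 7 MJ/kg DM, one lactating cow at 53.1 MJ/day for a full year.
print(round(feed_balance_mj(1.0, 1.0, 7.0, 1, 53.1)))  # -12382 (deficit)
```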

    Characterizing and mapping cropping patterns in a complex agro-ecosystem: An iterative participatory mapping procedure using machine learning algorithms and MODIS vegetation indices

    Accurate and up-to-date spatial agricultural information is essential for applications including agro-environmental assessment, crop management, and appropriate targeting of agricultural technologies. There is growing research interest in spatial analysis of agricultural ecosystems using satellite remote sensing technologies. However, the usability of information generated from remotely sensed data is often constrained by accuracy problems. This is of particular concern when mapping complex agro-ecosystems in countries where smallholdings growing diverse crop types dominate. This study contributes to ongoing efforts to overcome the accuracy challenges faced in remote sensing of agricultural ecosystems. We applied time-series analysis of vegetation indices (the Normalized Difference Vegetation Index (NDVI) and the Enhanced Vegetation Index (EVI)) derived from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor to detect seasonal patterns of irrigated and rainfed cropping in five townships in the Central Dry Zone of Myanmar, an important agricultural region of the country that has been poorly mapped with respect to cropping practices. To improve mapping accuracy and map legend completeness, we implemented a combination of (i) an iterative participatory approach to field data collection and classification, (ii) identification of the appropriate number and types of predictor variables (vegetation indices), and (iii) evaluation of the suitability of three machine learning algorithms, Support Vector Machine (SVM), Random Forest (RF), and C5.0, under varying training sample sizes. Through these procedures, we progressively improved accuracy and achieved a maximum overall accuracy of 95%. When a small training dataset was used, the accuracy achieved by RF was significantly higher than that of SVM and C5.0 (P < 0.01), but as sample size increased, accuracy differences among the three machine learning algorithms diminished. Accuracy achieved using NDVI was consistently better than that achieved using EVI (P < 0.01). The maximum overall accuracy was achieved using RF and 8-day NDVI composites spanning three years of remote sensing data. In conclusion, our findings highlight the important role of participatory classification, especially in areas where cropping systems are highly diverse and differ over space and time. We also show that the choice of classifier and the number of predictor variables are essential and complementary to the participatory mapping approach in achieving the desired accuracy of cropping pattern mapping in areas where other sources of spatial information are scarce.
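The 8-day NDVI composites used as predictors above are typically built by maximum-value compositing, which keeps the highest (least cloud-affected) observation in each window. A minimal sketch of the idea; the MODIS products themselves ship such composites, so this is purely illustrative and the function name is an assumption:

```python
def max_value_composites(ndvi_series, window=8):
    """Collapse a daily NDVI series into max-value composites.

    None marks missing (e.g. cloud-masked) observations; a window with
    no valid observation yields None.
    """
    composites = []
    for start in range(0, len(ndvi_series), window):
        chunk = [v for v in ndvi_series[start:start + window] if v is not None]
        composites.append(max(chunk) if chunk else None)
    return composites

daily = [0.31, None, 0.42, 0.40, None, 0.45, 0.44, 0.43,   # window 1
         0.50, 0.52, None, 0.55, 0.53, 0.51, None, 0.49]   # window 2
print(max_value_composites(daily))  # [0.45, 0.55]
```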

    Landscape-based nutrient application in wheat and teff mixed farming systems of Ethiopia: farmer and extension agent demand driven approach

    Introduction: Adapting fertilizer use is crucial if smallholder agroecosystems are to attain the sustainable development goals of zero hunger and agroecosystem resilience. Poor soil health and nutrient variability characterize smallholder farming systems. However, current research at the field scale does not account for nutrient variability across landscape positions, posing significant challenges for targeted nutrient management interventions. The purpose of this research was to create a demand-driven, co-development approach for diagnosing farmer nutrient management practices and determining landscape-specific (hillslope, mid-slope, and foot-slope) fertilizer applications for teff and wheat. Method: A landscape segmentation approach was designed to address gaps in farm-scale nutrient management research as well as the limitations of blanket recommendations in meeting local nutrient requirements. This approach incorporates the concept of interconnected socio-technical systems as well as the concepts and procedures of co-development. Extension agents used a smart mobile app to generate crop-specific decision rules at the landscape scale and forward the specific fertilizer applications to target farmers through SMS messages or in print. Results and discussion: The findings reveal that farmers apply more fertilizer to hillslopes and less to mid- and foot slopes. However, landscape-specific fertilizer application guided by crop-specific decision rules via mobile applications resulted in much higher yield improvements: 23% and 56% at foot slopes and 21% and 6.5% at mid slopes for wheat and teff, respectively. The optimized net benefit per hectare over the current extension recommendation increased by $176 and $333 at foot slopes and $159 and $64 at mid slopes for wheat and teff (an average of $90 and $107 for wheat and teff), respectively. The net benefit-to-cost ratio (BCR) demonstrated that applying landscape-targeted fertilizer yielded an optimum return on investment ($10.0 of net profit per $1.0 invested) while also enhancing nutrient use efficiency across the three landscape positions. Farmers are now cognizant of the need to reduce fertilizer rates on hillslopes while increasing them on parcels at mid- and foot-slope positions, which show higher responses and profits. As a result, applying digital advisories to optimize landscape-targeted fertilizer management delivers agronomic, economic, and environmental benefits. The outcomes of the innovation also contribute to overcoming site-specific yield gaps and low nutrient use efficiency, and they have the potential to be scaled if complementary innovations and scaling factors are integrated.
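The net benefit-to-cost ratio above follows standard partial-budget logic: extra revenue from the yield gain minus the extra input cost, divided by that cost. A minimal sketch; the input numbers below are invented for illustration and are not the study's data.

```python
def net_benefit_cost_ratio(yield_gain_kg, grain_price, extra_fert_cost):
    """Net benefit-to-cost ratio of a fertilizer change (partial budget).

    BCR = (extra revenue - extra cost) / extra cost; a value of 10
    means $10 of net profit per $1 invested.
    """
    net_benefit = yield_gain_kg * grain_price - extra_fert_cost
    return net_benefit / extra_fert_cost

# Illustrative numbers only: 500 kg/ha yield gain, $0.44/kg grain,
# $20/ha of additional fertilizer cost.
print(round(net_benefit_cost_ratio(500, 0.44, 20), 1))  # 10.0
```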

    Influence of landscape position on sorghum yield response to different nutrient sources and soil properties in the semi-arid tropical environment

    Understanding the response of crops to nutrient applications in undulating landscapes is imperative to improve nutrient use efficiency and crop yield. This study aimed to identify sorghum yield-limiting nutrients and characterize soil properties across landscape positions. Field experiments were conducted across 52 sites in four districts, covering three distinct landscape positions during the 2020 and 2022 cropping seasons. The treatments were All-blended, All-compound, All-individual, 150% of All-blended, All-blended−K, All-blended−S, All-blended−Zn, All-blended−B (nutrient omission treatments), recommended NP, 50% of All-blended, and a control (no fertilizer). Treatments were arranged in a randomized complete block design under foot-slope (FS), mid-slope (MS), and hillslope (HS) positions. Results revealed that landscape position significantly affected the growth and yield of sorghum: significantly higher yields were obtained from foot slopes than from mid-slope and hillslope positions, and yield response to applied nutrients decreased significantly with increasing slope. Overall, yields across landscape positions followed the decreasing order FS > MS > HS. The application of nutrients at different rates significantly improved sorghum total biomass and grain yield. Raising the All-blended treatment rate by 50% increased sorghum yield by 44% and 147% over the application of 50% of All-blended and the unfertilized control, respectively. Statistically significant yield differences were not observed among blended, compound, and separate applications of nutrients, and the omission of K, S, Zn, and B did not produce significant yield variation relative to the recommended NP fertilizer. Soil analysis revealed that N and P are the most commonly deficient nutrients in these sorghum-growing areas. Mean volumetric soil moisture content ranged from 5.9% to 28.7% across landscape positions, highest at the foot slope and lowest at the hillslope position. Further research is suggested to determine economically optimum N and P rates across the three landscape positions.
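The reported yield gains (44% and 147%) are relative increases over a reference treatment. A minimal sketch of that arithmetic; the absolute yields used are invented for illustration and are not from the study.

```python
def pct_yield_increase(treatment_kg_ha, reference_kg_ha):
    """Percent yield increase of a treatment over a reference treatment."""
    return 100 * (treatment_kg_ha - reference_kg_ha) / reference_kg_ha

# Illustrative: the reported 147% gain of 150% All-blended over the
# unfertilized control corresponds to, e.g., 3705 vs 1500 kg/ha.
print(round(pct_yield_increase(3705, 1500)))  # 147
```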

    Burden of disease scenarios for 204 countries and territories, 2022–2050: a forecasting analysis for the Global Burden of Disease Study 2021

    Background: Future trends in disease burden and drivers of health are of great interest to policy makers and the public at large. This information can be used for policy and long-term health investment, planning, and prioritisation. We have expanded and improved upon previous forecasts produced as part of the Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) and provide a reference forecast (the most likely future), and alternative scenarios assessing disease burden trajectories if selected sets of risk factors were eliminated from current levels by 2050. Methods: Using forecasts of major drivers of health such as the Socio-demographic Index (SDI; a composite measure of lag-distributed income per capita, mean years of education, and total fertility under 25 years of age) and the full set of risk factor exposures captured by GBD, we provide cause-specific forecasts of mortality, years of life lost (YLLs), years lived with disability (YLDs), and disability-adjusted life-years (DALYs) by age and sex from 2022 to 2050 for 204 countries and territories, 21 GBD regions, seven super-regions, and the world. All analyses were done at the cause-specific level so that only risk factors deemed causal by the GBD comparative risk assessment influenced future trajectories of mortality for each disease. Cause-specific mortality was modelled using mixed-effects models with SDI and time as the main covariates, and the combined impact of causal risk factors as an offset in the model. At the all-cause mortality level, we captured unexplained variation by modelling residuals with an autoregressive integrated moving average model with drift attenuation. These all-cause forecasts constrained the cause-specific forecasts at successively deeper levels of the GBD cause hierarchy using cascading mortality models, thus ensuring a robust estimate of cause-specific mortality. 
For non-fatal measures (eg, low back pain), incidence and prevalence were forecasted from mixed-effects models with SDI as the main covariate, and YLDs were computed from the resulting prevalence forecasts and average disability weights from GBD. Alternative future scenarios were constructed by replacing appropriate reference trajectories for risk factors with hypothetical trajectories of gradual elimination of risk factor exposure from current levels to 2050. The scenarios were constructed from various sets of risk factors: environmental risks (Safer Environment scenario), risks associated with communicable, maternal, neonatal, and nutritional diseases (CMNNs; Improved Childhood Nutrition and Vaccination scenario), risks associated with major non-communicable diseases (NCDs; Improved Behavioural and Metabolic Risks scenario), and the combined effects of these three scenarios. Using the Shared Socioeconomic Pathways climate scenarios SSP2-4.5 as reference and SSP1-1.9 as an optimistic alternative in the Safer Environment scenario, we accounted for climate change impact on health by using the most recent Intergovernmental Panel on Climate Change temperature forecasts and published trajectories of ambient air pollution for the same two scenarios. Life expectancy and healthy life expectancy were computed using standard methods. The forecasting framework includes computing the age-sex-specific future population for each location and separately for each scenario. 95% uncertainty intervals (UIs) for each individual future estimate were derived from the 2·5th and 97·5th percentiles of distributions generated from propagating 500 draws through the multistage computational pipeline. Findings: In the reference scenario forecast, global and super-regional life expectancy increased from 2022 to 2050, but improvement was at a slower pace than in the three decades preceding the COVID-19 pandemic (beginning in 2020). 
Gains in future life expectancy were forecasted to be greatest in super-regions with comparatively low life expectancies (such as sub-Saharan Africa) compared with super-regions with higher life expectancies (such as the high-income super-region), leading to a trend towards convergence in life expectancy across locations between now and 2050. At the super-region level, forecasted healthy life expectancy patterns were similar to those of life expectancies. Forecasts for the reference scenario found that health will improve in the coming decades, with all-cause age-standardised DALY rates decreasing in every GBD super-region. The total DALY burden measured in counts, however, will increase in every super-region, largely a function of population ageing and growth. We also forecasted that both DALY counts and age-standardised DALY rates will continue to shift from CMNNs to NCDs, with the most pronounced shifts occurring in sub-Saharan Africa (60·1% [95% UI 56·8–63·1] of DALYs were from CMNNs in 2022 compared with 35·8% [31·0–45·0] in 2050) and south Asia (31·7% [29·2–34·1] to 15·5% [13·7–17·5]). This shift is reflected in the leading global causes of DALYs, with the top four causes in 2050 being ischaemic heart disease, stroke, diabetes, and chronic obstructive pulmonary disease, compared with 2022, with ischaemic heart disease, neonatal disorders, stroke, and lower respiratory infections at the top. The global proportion of DALYs due to YLDs likewise increased from 33·8% (27·4–40·3) to 41·1% (33·9–48·1) from 2022 to 2050, demonstrating an important shift in overall disease burden towards morbidity and away from premature death. The largest shift of this kind was forecasted for sub-Saharan Africa, from 20·1% (15·6–25·3) of DALYs due to YLDs in 2022 to 35·6% (26·5–43·0) in 2050. 
In the assessment of alternative future scenarios, the combined effects of the scenarios (Safer Environment, Improved Childhood Nutrition and Vaccination, and Improved Behavioural and Metabolic Risks scenarios) demonstrated an important decrease in the global burden of DALYs in 2050 of 15·4% (13·5–17·5) compared with the reference scenario, with decreases across super-regions ranging from 10·4% (9·7–11·3) in the high-income super-region to 23·9% (20·7–27·3) in north Africa and the Middle East. The Safer Environment scenario had its largest decrease in sub-Saharan Africa (5·2% [3·5–6·8]), the Improved Behavioural and Metabolic Risks scenario in north Africa and the Middle East (23·2% [20·2–26·5]), and the Improved Childhood Nutrition and Vaccination scenario in sub-Saharan Africa (2·0% [–0·6 to 3·6]). Interpretation: Globally, life expectancy and age-standardised disease burden were forecasted to improve between 2022 and 2050, with the majority of the burden continuing to shift from CMNNs to NCDs. That said, continued progress on reducing the CMNN disease burden will be dependent on maintaining investment in and policy emphasis on CMNN disease prevention and treatment. Mostly due to growth and ageing of populations, the number of deaths and DALYs due to all causes combined will generally increase. By constructing alternative future scenarios wherein certain risk exposures are eliminated by 2050, we have shown that opportunities exist to substantially improve health outcomes in the future through concerted efforts to prevent exposure to well established risk factors and to expand access to key health interventions.
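The 95% uncertainty intervals in the Methods above are the 2·5th and 97·5th percentiles across 500 draws from each estimate's distribution. A minimal sketch of that percentile computation; it uses linear interpolation between order statistics, which is one common convention and an assumption here, as GBD's exact convention is not restated in the abstract.

```python
def uncertainty_interval(draws, lower=2.5, upper=97.5):
    """95% UI as empirical percentiles of a set of draws.

    Interpolates linearly between order statistics (an assumed
    convention for this sketch).
    """
    s = sorted(draws)

    def pct(p):
        k = (len(s) - 1) * p / 100
        lo, hi = int(k), min(int(k) + 1, len(s) - 1)
        return s[lo] + (s[hi] - s[lo]) * (k - lo)

    return pct(lower), pct(upper)

# Illustrative: 500 evenly spaced "draws" from 0 to 499.
lo, hi = uncertainty_interval(list(range(500)))
print(lo, hi)  # 12.475 486.525
```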

    Global burden and strength of evidence for 88 risk factors in 204 countries and 811 subnational locations, 1990–2021: a systematic analysis for the Global Burden of Disease Study 2021

    Background: Understanding the health consequences associated with exposure to risk factors is necessary to inform public health policy and practice. To systematically quantify the contributions of risk factor exposures to specific health outcomes, the Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) 2021 aims to provide comprehensive estimates of exposure levels, relative health risks, and attributable burden of disease for 88 risk factors in 204 countries and territories and 811 subnational locations, from 1990 to 2021. Methods: The GBD 2021 risk factor analysis used data from 54 561 total distinct sources to produce epidemiological estimates for 88 risk factors and their associated health outcomes for a total of 631 risk–outcome pairs. Pairs were included on the basis of data-driven determination of a risk–outcome association. Age-sex-location-year-specific estimates were generated at global, regional, and national levels. Our approach followed the comparative risk assessment framework predicated on a causal web of hierarchically organised, potentially combinative, modifiable risks. Relative risks (RRs) of a given outcome occurring as a function of risk factor exposure were estimated separately for each risk–outcome pair, and summary exposure values (SEVs), representing risk-weighted exposure prevalence, and theoretical minimum risk exposure levels (TMRELs) were estimated for each risk factor. These estimates were used to calculate the population attributable fraction (PAF; ie, the proportional change in health risk that would occur if exposure to a risk factor were reduced to the TMREL). The product of PAFs and disease burden associated with a given outcome, measured in disability-adjusted life-years (DALYs), yielded measures of attributable burden (ie, the proportion of total disease burden attributable to a particular risk factor or combination of risk factors). 
Adjustments for mediation were applied to account for relationships involving risk factors that act indirectly on outcomes via intermediate risks. Attributable burden estimates were stratified by Socio-demographic Index (SDI) quintile and presented as counts, age-standardised rates, and rankings. To complement estimates of RR and attributable burden, newly developed burden of proof risk function (BPRF) methods were applied to yield supplementary, conservative interpretations of risk–outcome associations based on the consistency of underlying evidence, accounting for unexplained heterogeneity between input data from different studies. Estimates reported represent the mean value across 500 draws from the estimate's distribution, with 95% uncertainty intervals (UIs) calculated as the 2·5th and 97·5th percentile values across the draws. Findings: Among the specific risk factors analysed for this study, particulate matter air pollution was the leading contributor to the global disease burden in 2021, contributing 8·0% (95% UI 6·7–9·4) of total DALYs, followed by high systolic blood pressure (SBP; 7·8% [6·4–9·2]), smoking (5·7% [4·7–6·8]), low birthweight and short gestation (5·6% [4·8–6·3]), and high fasting plasma glucose (FPG; 5·4% [4·8–6·0]). For younger demographics (ie, those aged 0–4 years and 5–14 years), risks such as low birthweight and short gestation and unsafe water, sanitation, and handwashing (WaSH) were among the leading risk factors, while for older age groups, metabolic risks such as high SBP, high body-mass index (BMI), high FPG, and high LDL cholesterol had a greater impact. 
From 2000 to 2021, there was an observable shift in global health challenges, marked by a decline in the number of all-age DALYs broadly attributable to behavioural risks (decrease of 20·7% [13·9–27·7]) and environmental and occupational risks (decrease of 22·0% [15·5–28·8]), coupled with a 49·4% (42·3–56·9) increase in DALYs attributable to metabolic risks, all reflecting ageing populations and changing lifestyles on a global scale. Age-standardised global DALY rates attributable to high BMI and high FPG rose considerably (15·7% [9·9–21·7] for high BMI and 7·9% [3·3–12·9] for high FPG) over this period, with exposure to these risks increasing annually at rates of 1·8% (1·6–1·9) for high BMI and 1·3% (1·1–1·5) for high FPG. By contrast, the global risk-attributable burden and exposure to many other risk factors declined, notably for risks such as child growth failure and unsafe water source, with age-standardised attributable DALYs decreasing by 71·5% (64·4–78·8) for child growth failure and 66·3% (60·2–72·0) for unsafe water source. We separated risk factors into three groups according to trajectory over time: those with a decreasing attributable burden, due largely to declining risk exposure (eg, diet high in trans-fat and household air pollution) but also to proportionally smaller child and youth populations (eg, child and maternal malnutrition); those for which the burden increased moderately in spite of declining risk exposure, due largely to population ageing (eg, smoking); and those for which the burden increased considerably due to both increasing risk exposure and population ageing (eg, ambient particulate matter air pollution, high BMI, high FPG, and high SBP). Interpretation: Substantial progress has been made in reducing the global disease burden attributable to a range of risk factors, particularly those related to maternal and child health, WaSH, and household air pollution. 
Maintaining efforts to minimise the impact of these risk factors, especially in low SDI locations, is necessary to sustain progress. Successes in moderating the smoking-related burden by reducing risk exposure highlight the need to advance policies that reduce exposure to other leading risk factors such as ambient particulate matter air pollution and high SBP. Troubling increases in high FPG, high BMI, and other risk factors related to obesity and metabolic syndrome indicate an urgent need to identify and implement interventions.
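The Methods above define the PAF as the proportional change in health risk if exposure were reduced to the TMREL, and attributable burden as the product of the PAF and the outcome's DALYs. A minimal sketch of the categorical form of that calculation, with invented illustrative inputs (GBD's full pipeline additionally handles continuous exposures, mediation, and uncertainty propagation, none of which is modelled here):

```python
def population_attributable_fraction(prevalence, rr):
    """Categorical PAF: proportional burden reduction if all exposure
    moved to the lowest-risk (TMREL) category.

    prevalence: exposure-category prevalences summing to 1.
    rr: relative risk of each category vs the TMREL category (RR = 1).
    """
    mean_rr = sum(p * r for p, r in zip(prevalence, rr))
    return (mean_rr - 1) / mean_rr

def attributable_dalys(paf, total_dalys):
    """Attributable burden: share of the outcome's DALYs due to the risk."""
    return paf * total_dalys

# Illustrative: 30% of the population exposed at RR = 2, 70% unexposed.
paf = population_attributable_fraction([0.7, 0.3], [1.0, 2.0])
print(round(paf, 3))                               # 0.231
print(round(attributable_dalys(paf, 1_000_000)))   # 230769
```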