18 research outputs found
Subsoil improvement for sustainable intensification : impact of loosening with straw incorporation or liming on subsoil properties, crop performance and water quality
Subsoil has a high capacity for nutrient and water retention, but arable subsoil is often nutrient-poor, carbon-deficient and compacted, affecting both root growth and yield. In field and lysimeter experiments, this thesis investigated the effects of subsoil loosening and of loosening with cereal straw incorporation at 24-60 Mg ha-1 (loosening + straw) on crop yield, soil properties (bulk density, penetration resistance, moisture characteristics) and leaching. A rectangular metal tube welded behind each tine of a deep loosener was used to inject straw as a slurry in the field, while subsoil was loosened and mixed manually with milled straw in the lysimeter studies. In laboratory experiments, subsoil was limed with different amounts of CaCO3 and CaO to increase soil pH from 7.0 to 7.5, 8.0 and 8.4 and incubated for 22 months to examine changes in soil structural stability and dissolved reactive phosphorus. Field subsoil loosening + straw significantly increased soil organic carbon, total nitrogen and water holding capacity. It also decreased bulk density, from around 1.5 Mg m-3 in the control to about 1.0 Mg m-3. The effects of loosening + straw persisted for at least three years, but loosening alone had weak and short-lived effects. Loosening + straw significantly increased grain yield in the first cropping season (6% higher than the control), but not in the following two years. Nitrogen balance calculations for the lysimeters showed that short-term nitrogen losses were lowest in the subsoil loosening + straw treatment and that nitrogen leaching was reduced by about 62%. In incubations, subsoil liming decreased clay dispersion. Wet aggregate stability and the concentration of dissolved reactive phosphorus increased with pH and peaked around pH 7.8 and 7.5, respectively. Combining loosening with straw incorporation into subsoil appeared to improve soil properties and water quality, but not crop yield, on the experimental soil. On other soil types, this practice may have more beneficial effects.
Evaluation of the process-based model 3-PG for simulation of net primary production of Picea abies in northern and southern regions of Sweden under climate change
The results of this evaluation reveal good performance of 3-PG, and it seems reasonable to use the model for simulating net primary production (NPP) in Sweden. Subsequently, the NPP of Picea abies was simulated using 3-PG for 110 years in northern and southern Sweden under climate change. RCA3-generated climate data for two emission scenarios (A2 and B2) were used as driving variables in the simulations. The initial stand data and site factors were taken from well-known sites in northern and southern Sweden to determine the fertility rating input factor of 3-PG and to provide input data for Heureka StandWise and 3-PG for simulation and validation. The outcomes of the 2071-2100 simulations under the A2 and B2 scenarios were summarized for 2071-75, 2076-80, 2081-85, 2086-90, 2091-95 and 2096-2100 and compared against the corresponding reference years (1961-1990). After 110 years, the average relative increment of NPP was 89.7% (A2) and 60.5% (B2) in northern Sweden, and 88.6% (A2) and 60.3% (B2) in southern Sweden. A higher relative increase of temperature in autumn, spring and winter in northern Sweden led to a higher relative increase of NPP in northern than in southern Sweden under both scenarios. Sensitivity testing of the model based on predicted NPP was carried out independently for temperature, rainfall and fertility rating. The results indicated that NPP from 3-PG was more sensitive to fertility rating than to temperature and rainfall; rainfall had almost no effect in the test, and the sensitivity of the factors considered was found to be site dependent. Total biomass outputs from the 3-PG and Heureka StandWise simulations were compared for validation, and there was no significant difference in total biomass between the two models. Modeling efficiency was 78.5% for northern and 89% for southern Sweden. The average model bias was 8.6% and -3.2%, the mean absolute difference was about 8.6% and 7%, and the root mean square error was 13% and 9.5% in the northern and southern regions, respectively. Overall, the results of this work suggest that 3-PG can be used for predicting NPP in Sweden, with due consideration of thinning operations and the determination of the fertility rating and leaf area index.
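The agreement statistics used in this validation (modeling efficiency, mean bias, mean absolute difference and root mean square error) can be sketched in a few lines. This is a minimal illustration with hypothetical observed/simulated biomass pairs, not the authors' code:

```python
import math

def evaluation_stats(observed, simulated):
    """Common model-evaluation statistics for paired observed/simulated values.

    Returns modeling efficiency (Nash-Sutcliffe), mean bias (simulated minus
    observed), mean absolute difference, and root mean square error.
    """
    n = len(observed)
    mean_obs = sum(observed) / n
    ss_res = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return {
        "efficiency": 1 - ss_res / ss_tot,  # 1 = perfect agreement
        "bias": sum(s - o for o, s in zip(observed, simulated)) / n,
        "mad": sum(abs(s - o) for o, s in zip(observed, simulated)) / n,
        "rmse": math.sqrt(ss_res / n),
    }
```

A positive bias means the model overpredicts on average; efficiency near 1 and low RMSE correspond to the kind of agreement reported above.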
Prevalence and associated factors of diabetic nephropathy at Tikur Anbessa Comprehensive Specialized University Hospital, Addis Ababa, Ethiopia
Introduction: Given the global prevalence of diabetes, diabetic nephropathy and its consequences are among the major causes of morbidity and mortality in diabetic populations. However, the prevalence and determinants of diabetic nephropathy in Ethiopia are little studied; assessing them was the main objective of this study.
Methods: A cross-sectional study was conducted among 340 randomly selected diabetic patients attending the national diabetes referral clinics at the diabetes centre of Tikur Anbessa Specialized Hospital, Addis Ababa, using an interviewer-administered structured questionnaire. Of the 340 patients, 200 (59%) were female and 256 (75%) had type 2 diabetes mellitus. Urine and blood samples were collected from the study population and the corresponding biochemical analyses were conducted at the Ethiopian Public Health Research Institute.
Results: The mean age of the participants was 51.6 years (range 18–94 years). The median duration of their diabetes was 11 years (range 1–40 years). Forty-eight percent of the patients were hypertensive. Only half of the hypertensive cases (53%) were using angiotensin-converting enzyme inhibitors, either alone or in combination with other antihypertensive medicines. Eighty-two percent of the participants had poorly controlled diabetes, with glycated haemoglobin >7%. None was using sodium–glucose cotransporter 2 (SGLT2) inhibitors or glucagon-like peptide-1 agonists. About one-third of the participants (109, 32%) were diagnosed with diabetic nephropathy on the basis of reduced estimated glomerular filtration rate and albuminuria. Age, dyslipidaemia, educational status, presence of diabetic retinopathy, and elevated triglyceride levels were found to be significant predictors of the condition (P < 0.05).
Conclusions: Diabetic nephropathy was present in nearly one-third of the diabetics in the study population. The management of diabetes with renoprotective agents, such as renin–angiotensin–aldosterone system inhibitors and SGLT2 inhibitors, is likely to be very important in this context.
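The diagnostic criteria mentioned above (reduced eGFR and albuminuria) amount to a simple screening rule. The thresholds below are the widely used KDIGO cut-offs and are an assumption for illustration, since the abstract does not state the study's exact criteria:

```python
def flags_diabetic_kidney_disease(egfr, acr_mg_per_g):
    """Simple screening rule for diabetic kidney disease.

    egfr          -- estimated glomerular filtration rate, mL/min/1.73 m^2
    acr_mg_per_g  -- urinary albumin-to-creatinine ratio, mg/g

    Thresholds follow common KDIGO practice (assumed here, not taken from
    the study): eGFR < 60 or ACR >= 30 suggests kidney disease.
    """
    return egfr < 60 or acr_mg_per_g >= 30
```

Either criterion alone triggers the flag, which matches the usual practice of screening on kidney function and kidney damage markers separately.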
Hypogonadism and associated risk factors in male patients with type 2 diabetes mellitus attending the diabetic clinic of Tikur Anbessa Specialized Teaching Hospital, Addis Ababa, Ethiopia
Background: A high prevalence of hypogonadism among men with type 2 diabetes mellitus (T2DM) has been reported worldwide. This in turn creates a substantial public health burden in terms of inadequate sexual function and potential infertility. However, the status of this health problem is not well established in Ethiopia. Therefore, this study aimed to assess hypogonadism and its associated risk factors among men with T2DM.
Methods: This cross-sectional study was conducted at Tikur Anbesa Specialized Teaching Hospital in Addis Ababa, Ethiopia from February to May 2017 on 115 male patients with T2DM aged 40–80 years. Symptoms of hypogonadism were assessed using the Androgen Deficiency in Aging Men (ADAM) questionnaire. Total testosterone (TT), luteinising hormone (LH), follicle stimulating hormone (FSH), fasting blood glucose (FBG) and lipid profiles were measured at the clinical chemistry laboratory of the Ethiopian Public Health Institute. Hypogonadism was defined as the presence of clinical symptoms and low TT (TT < 12.1 nmol/l) according to the International Society for the Study of the Aging Male.
Results: Hypogonadism was seen in 23.5% of the 115 study subjects, of whom 74.1% had secondary and 25.9% had primary hypogonadism. TT showed a significant negative correlation with waist circumference (WC) (r = −0.465, p < 0.001), BMI (r = −0.363, p < 0.001), FBG (rho = −0.328, p < 0.001) and triglycerides (TG) (rho = −0.357, p < 0.001), but a significant positive correlation with HDL-C (r = 0.339, p < 0.001). WC and FBG were independently associated with hypogonadism.
Conclusion: According to our study, visceral obesity and hyperglycaemia were found to be independent risk factors associated with hypogonadism.
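The case definition above (clinical symptoms plus TT < 12.1 nmol/l) and the primary/secondary split can be sketched as a classification rule. Primary hypogonadism shows elevated gonadotropins (LH/FSH), whereas secondary shows low or inappropriately normal levels; the upper reference limits below are hypothetical lab values, not taken from the study:

```python
def classify_hypogonadism(tt_nmol_l, has_symptoms, lh, fsh,
                          lh_upper=9.4, fsh_upper=12.4):
    """Classify hypogonadism from total testosterone and gonadotropins.

    Case definition from the study: clinical symptoms plus TT < 12.1 nmol/l.
    The LH/FSH upper reference limits are hypothetical illustration values;
    real cut-offs vary by laboratory.
    """
    if not (has_symptoms and tt_nmol_l < 12.1):
        return "eugonadal"
    if lh > lh_upper or fsh > fsh_upper:
        return "primary"    # testicular failure: gonadotropins elevated
    return "secondary"      # pituitary/hypothalamic: LH/FSH low or normal
```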
Effects of loosening combined with straw incorporation into the upper subsoil on soil properties and crop yield in a three-year field experiment
Subsoil management needs to be integrated into current tillage regimes in order to access additional resources of water and nutrients and sustain crop production. However, arable subsoil is often deficient in nutrients and carbon, and it is compacted, affecting root growth and yield. In this study, crop yield and soil responses to loosening of the upper subsoil, without and with straw injection below the plough layer (25-34 cm), were studied during three crop cycles (2016-2018) in a field experiment near Uppsala, Sweden. Responses to straw injection after loosening were studied after single and triple consecutive applications of 24-30 Mg ha-1 during 2015-2017 to spring-sown barley and oats. Subsoil loosening combined with one-time or repeated straw addition (LS treatments) significantly reduced soil bulk density (BD) and increased porosity, soil organic carbon (SOC) and total nitrogen (N) compared with loosening (L) alone (one-time or repeated annually) and the control. In treatment L, the soil re-compacted over time to a level similar to the control. Field inspections indicated a higher abundance of earthworms and biopores in and close to the straw incorporation strips. Aggregates readily crumbled/fragmented by hand, and casts (fine crumbs) were frequently observed in earthworm burrows. The LS treatment improved soil properties (SOC and porosity) and water holding capacity, but had no significant influence on crop yield compared with the control. Crop yield in all treatments was 6.5-6.8 Mg ha-1 in 2017 and 3.8-4.0 Mg ha-1 in 2018, and differences were non-significant. The absence of a yield effect could possibly be due to other confounding factors buffering the expression of treatment effects on yield. Lower relative chlorophyll content in leaves in the loosening + straw treatment during early growth stages did not affect final crop yield. Subsoil loosening performed three times gave no further improvement in soil properties and grain yield compared with one-time loosening. There was no difference in yield between repeated subsoil loosening + straw and the one-time treatment. It will be interesting to study the long-term effects of deep straw injection and to evaluate its impact under other soil and weather conditions.
Lysimeter deep N fertilizer placement reduced leaching and improved N use efficiency
Deep fertilization has been tested widely for nitrogen (N) use efficiency, but there is little evidence of its impact on N leaching and the interplay between climate factors and crop N use. In this study, we tested the effect of three fertilizer N placements on leaching, crop growth, and greenhouse gas (GHG) emissions in a lysimeter experiment over three consecutive years with spring-sown cereals (S1, S2, and S3). Leaching was additionally monitored in an 11-month fallow period (F1) preceding S1 and a 15-month fallow period (F2) following S3. In addition to a control with no N fertilizer (Control), 100 kg N ha-1 year-1 of ammonium nitrate was placed at 0.2 m (Deep), 0.07 m (Shallow), or halved between 0.07 m and 0.2 m (Mixed). Deep reduced leachate amount in each cropping period, with significant reductions (p < 0.05) in the drought year (S2) and cumulatively for S1-S3. Overall, Deep reduced leaching by 22, 25 and 34% compared to Shallow, Mixed and Control, respectively. Deep and Mixed reduced N leaching across S1-S3 compared with Shallow, but Deep further reduced N loads by 15% compared to Mixed and was significantly the lowest (p < 0.05) among the fertilized treatments in S1 and S2. In S3, Deep increased grain yields by 28 and 22% compared to Shallow and Mixed, respectively, while nearly doubling the agronomic efficiency of N (AEN) and the recovery efficiency of N (REN). Deep N placement is a promising mitigation practice that should be further investigated.
Measuring progress and projecting attainment on the basis of past trends of the health-related Sustainable Development Goals in 188 countries: an analysis from the Global Burden of Disease Study 2016
The UN’s Sustainable Development Goals (SDGs) are grounded in the global ambition of “leaving no one behind”. Understanding today’s gains and gaps for the health-related SDGs is essential for decision makers as they aim to improve the health of populations. As part of the Global Burden of Diseases, Injuries, and Risk Factors Study 2016 (GBD 2016), we measured 37 of the 50 health-related SDG indicators over the period 1990–2016 for 188 countries, and then on the basis of these past trends, we projected indicators to 2030.
Global, regional, and national disability-adjusted life-years (DALYs) for 333 diseases and injuries and healthy life expectancy (HALE) for 195 countries and territories, 1990–2016: a systematic analysis for the Global Burden of Disease Study 2016
BACKGROUND: Measurement of changes in health across locations is useful to compare and contrast changing epidemiological patterns against health system performance and identify specific needs for resource allocation in research, policy development, and programme decision making. Using the Global Burden of Diseases, Injuries, and Risk Factors Study 2016, we drew from two widely used summary measures to monitor such changes in population health: disability-adjusted life-years (DALYs) and healthy life expectancy (HALE). We used these measures to track trends and benchmark progress compared with expected trends on the basis of the Socio-demographic Index (SDI).
METHODS: We used results from the Global Burden of Diseases, Injuries, and Risk Factors Study 2016 for all-cause mortality, cause-specific mortality, and non-fatal disease burden to derive HALE and DALYs by sex for 195 countries and territories from 1990 to 2016. We calculated DALYs by summing years of life lost and years of life lived with disability for each location, age group, sex, and year. We estimated HALE using age-specific death rates and years of life lived with disability per capita. We explored how DALYs and HALE differed from expected trends when compared with the SDI: the geometric mean of income per person, educational attainment in the population older than age 15 years, and total fertility rate.
FINDINGS: The highest globally observed HALE at birth for both women and men was in Singapore, at 75·2 years (95% uncertainty interval 71·9-78·6) for females and 72·0 years (68·8-75·1) for males. The lowest for females was in the Central African Republic (45·6 years [42·0-49·5]) and for males was in Lesotho (41·5 years [39·0-44·0]). From 1990 to 2016, global HALE increased by an average of 6·24 years (5·97-6·48) for both sexes combined. Global HALE increased by 6·04 years (5·74-6·27) for males and 6·49 years (6·08-6·77) for females, whereas HALE at age 65 years increased by 1·78 years (1·61-1·93) for males and 1·96 years (1·69-2·13) for females. Total global DALYs remained largely unchanged from 1990 to 2016 (-2·3% [-5·9 to 0·9]), with decreases in communicable, maternal, neonatal, and nutritional (CMNN) disease DALYs offset by increased DALYs due to non-communicable diseases (NCDs). The exemplars, calculated as the five lowest ratios of observed to expected age-standardised DALY rates in 2016, were Nicaragua, Costa Rica, the Maldives, Peru, and Israel. The leading three causes of DALYs globally were ischaemic heart disease, cerebrovascular disease, and lower respiratory infections, comprising 16·1% of all DALYs. Total DALYs and age-standardised DALY rates due to most CMNN causes decreased from 1990 to 2016. Conversely, the total DALY burden rose for most NCDs; however, age-standardised DALY rates due to NCDs declined globally.
INTERPRETATION: At a global level, DALYs and HALE continue to show improvements. At the same time, we observe that many populations are facing growing functional health loss. Rising SDI was associated with increases in cumulative years of life lived with disability and decreases in CMNN DALYs offset by increased NCD DALYs. Relative compression of morbidity highlights the importance of continued health interventions, which has changed in most locations in pace with the gross domestic product per person, education, and family planning. The analysis of DALYs and HALE and their relationship to SDI represents a robust framework with which to benchmark location-specific health performance. Country-specific drivers of disease burden, particularly for causes with higher-than-expected DALYs, should inform health policies, health system improvement initiatives, targeted prevention efforts, and development assistance for health, including financial and research investments for all countries, regardless of their level of sociodemographic development. The presence of countries that substantially outperform others suggests the need for increased scrutiny for proven examples of best practices, which can help to extend gains, whereas the presence of underperforming countries suggests the need for devotion of extra attention to health systems that need more robust support.
FUNDING: Bill & Melinda Gates Foundation
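The summary measure defined in the methods above, DALYs as years of life lost plus years lived with disability, reduces to a simple sum. A minimal sketch with made-up numbers, not the GBD estimation machinery:

```python
def dalys(deaths_by_age, remaining_life_expectancy, yld):
    """DALYs = YLL + YLD.

    YLL (years of life lost) is the sum over age groups of deaths
    multiplied by the standard life expectancy remaining at that age;
    YLD (years lived with disability) is passed in directly here.
    """
    yll = sum(d * le for d, le in zip(deaths_by_age, remaining_life_expectancy))
    return yll + yld

# e.g. 10 deaths with 50 years of remaining life expectancy, 5 deaths
# with 20 years remaining, plus 30 YLD -> 10*50 + 5*20 + 30 = 630 DALYs
```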
Global burden of 288 causes of death and life expectancy decomposition in 204 countries and territories and 811 subnational locations, 1990–2021: a systematic analysis for the Global Burden of Disease Study 2021
BACKGROUND Regular, detailed reporting on population health by underlying cause of death is fundamental for public health decision making. Cause-specific estimates of mortality and the subsequent effects on life expectancy worldwide are valuable metrics to gauge progress in reducing mortality rates. These estimates are particularly important following large-scale mortality spikes, such as the COVID-19 pandemic. When systematically analysed, mortality rates and life expectancy allow comparisons of the consequences of causes of death globally and over time, providing a nuanced understanding of the effect of these causes on global populations. METHODS The Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) 2021 cause-of-death analysis estimated mortality and years of life lost (YLLs) from 288 causes of death by age-sex-location-year in 204 countries and territories and 811 subnational locations for each year from 1990 until 2021. The analysis used 56 604 data sources, including data from vital registration and verbal autopsy as well as surveys, censuses, surveillance systems, and cancer registries, among others. As with previous GBD rounds, cause-specific death rates for most causes were estimated using the Cause of Death Ensemble model (a modelling tool developed for GBD to assess the out-of-sample predictive validity of different statistical models and covariate permutations and combine those results to produce cause-specific mortality estimates), with alternative strategies adapted to model causes with insufficient data, substantial changes in reporting over the study period, or unusual epidemiology. YLLs were computed as the product of the number of deaths for each cause-age-sex-location-year and the standard life expectancy at each age. As part of the modelling process, uncertainty intervals (UIs) were generated using the 2·5th and 97·5th percentiles from a 1000-draw distribution for each metric.
We decomposed life expectancy by cause of death, location, and year to show cause-specific effects on life expectancy from 1990 to 2021. We also used the coefficient of variation and the fraction of population affected by 90% of deaths to highlight concentrations of mortality. Findings are reported in counts and age-standardised rates. Methodological improvements for cause-of-death estimates in GBD 2021 include the expansion of the under-5-years age group to include four new age groups, enhanced methods to account for stochastic variation of sparse data, and the inclusion of COVID-19 and other pandemic-related mortality, which includes excess mortality associated with the pandemic, excluding COVID-19, lower respiratory infections, measles, malaria, and pertussis. For this analysis, 199 new country-years of vital registration cause-of-death data, 5 country-years of surveillance data, 21 country-years of verbal autopsy data, and 94 country-years of other data types were added to those used in previous GBD rounds. FINDINGS The leading causes of age-standardised deaths globally were the same in 2019 as they were in 1990; in descending order, these were ischaemic heart disease, stroke, chronic obstructive pulmonary disease, and lower respiratory infections. In 2021, however, COVID-19 replaced stroke as the second-leading age-standardised cause of death, with 94·0 deaths (95% UI 89·2-100·0) per 100 000 population. The COVID-19 pandemic shifted the rankings of the leading five causes, lowering stroke to the third-leading and chronic obstructive pulmonary disease to the fourth-leading position. In 2021, the highest age-standardised death rates from COVID-19 occurred in sub-Saharan Africa (271·0 deaths [250·1-290·7] per 100 000 population) and Latin America and the Caribbean (195·4 deaths [182·1-211·4] per 100 000 population).
The lowest age-standardised death rates from COVID-19 were in the high-income super-region (48·1 deaths [47·4-48·8] per 100 000 population) and southeast Asia, east Asia, and Oceania (23·2 deaths [16·3-37·2] per 100 000 population). Globally, life expectancy steadily improved between 1990 and 2019 for 18 of the 22 investigated causes. Decomposition of global and regional life expectancy showed the positive effect that reductions in deaths from enteric infections, lower respiratory infections, stroke, and neonatal deaths, among others, have had on improved survival over the study period. However, a net reduction of 1·6 years occurred in global life expectancy between 2019 and 2021, primarily due to increased death rates from COVID-19 and other pandemic-related mortality. Life expectancy was highly variable between super-regions over the study period, with southeast Asia, east Asia, and Oceania gaining 8·3 years (6·7-9·9) overall, while having the smallest reduction in life expectancy due to COVID-19 (0·4 years). The largest reduction in life expectancy due to COVID-19 occurred in Latin America and the Caribbean (3·6 years). Additionally, 53 of the 288 causes of death were highly concentrated in locations with less than 50% of the global population as of 2021, and these causes of death became progressively more concentrated since 1990, when only 44 causes showed this pattern. The concentration phenomenon is discussed heuristically with respect to enteric and lower respiratory infections, malaria, HIV/AIDS, neonatal disorders, tuberculosis, and measles. INTERPRETATION Long-standing gains in life expectancy and reductions in many of the leading causes of death have been disrupted by the COVID-19 pandemic, the adverse effects of which were spread unevenly among populations. Despite the pandemic, there has been continued progress in combatting several notable causes of death, leading to improved global life expectancy over the study period.
Each of the seven GBD super-regions showed an overall improvement between 1990 and 2021, obscuring the negative effect of the pandemic years. Additionally, our findings regarding regional variation in causes of death driving increases in life expectancy hold clear policy utility. Analyses of shifting mortality trends reveal that several causes, once widespread globally, are now increasingly concentrated geographically. These changes in mortality concentration, alongside further investigation of changing risks, interventions, and relevant policy, present an important opportunity to deepen our understanding of mortality-reduction strategies. Examining patterns in mortality concentration might reveal areas where successful public health interventions have been implemented. Translating these successes to locations where certain causes of death remain entrenched can inform policies that work to improve life expectancy for people everywhere. FUNDING Bill & Melinda Gates Foundation
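The uncertainty intervals quoted throughout (2·5th and 97·5th percentiles of a 1000-draw distribution) can be computed with a simple percentile pick. A sketch using index rounding, not any particular GBD implementation (interpolation schemes differ between tools):

```python
def uncertainty_interval(draws, lower=2.5, upper=97.5):
    """95% uncertainty interval from a distribution of draws.

    Sorts the draws and picks the values nearest the requested
    percentiles by index rounding; with 1000 draws this approximates
    the 2.5th and 97.5th percentiles used for GBD-style UIs.
    """
    s = sorted(draws)
    n = len(s)

    def pick(p):
        k = round(p / 100 * (n - 1))  # index nearest the percentile
        return s[max(0, min(n - 1, k))]

    return pick(lower), pick(upper)
```

With 1000 draws, roughly the 25 smallest and 25 largest values fall outside the interval, leaving the central 95% of the distribution.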
Liming with CaCO3 or CaO affects aggregate stability and dissolved reactive phosphorus in a heavy clay subsoil
A 22-month incubation experiment was conducted to study the effect of lime on clay dispersion, wet aggregate stability (WAS) and dissolved reactive phosphorus (DRP), using a heavy clay subsoil with an initial pH of 7.0 and 7.3 g kg-1 of soil organic carbon. Lime was applied to achieve soil pH values of 7.5, 8.0 and 8.4. Clay dispersion decreased linearly with increased pH (corresponding to an increase in lime amount) for both lime types (R2 = 0.44 for CaO; R2 = 0.53 for CaCO3, P < 0.05), with a decrease of 2-16% (CaO) and 3-17% (CaCO3) compared with the control. Both WAS and DRP followed piece-wise linear functions, with an increase and peak around pH 7.5-7.8 and a decline at higher pH (WAS: R2 = 0.73 for CaO, R2 = 0.68 for CaCO3, P < 0.001; DRP: R2 = 0.84 for CaCO3, R2 = 0.33 for CaO, P < 0.001). Wet aggregate stability increased on average by 13% and 11% at the lowest and intermediate lime levels, respectively, compared with the control. At the highest lime application rate, WAS was 6% (CaO) and 8% (CaCO3) lower than in the control. These differences were probably caused by changes in electrical charge and in the concentrations of soluble calcium and dissolved organic carbon (DOC) as the pH increased. More studies are needed to understand the processes in detail and to draw more robust conclusions.
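The piece-wise linear response described above (a rise to a peak followed by a decline) is typically fitted by scanning candidate breakpoints and minimising total squared error. A minimal broken-stick sketch with invented pH data, not the authors' statistical procedure:

```python
def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def piecewise_fit(xs, ys):
    """Two-segment (broken-stick) fit.

    Tries each candidate breakpoint, fits a line to each side, and keeps
    the split with the lowest total squared error. Returns the breakpoint
    and the (slope, intercept) pairs for the left and right segments.
    """
    pts = sorted(zip(xs, ys))
    best = None
    for i in range(2, len(pts) - 2):  # at least 3 points per side
        bp = pts[i][0]
        segments = ([(x, y) for x, y in pts if x <= bp],
                    [(x, y) for x, y in pts if x > bp])
        fits, sse = [], 0.0
        for seg in segments:
            a, b = linear_fit([x for x, _ in seg], [y for _, y in seg])
            fits.append((a, b))
            sse += sum((y - (a * x + b)) ** 2 for x, y in seg)
        if best is None or sse < best[0]:
            best = (sse, bp, fits[0], fits[1])
    return best[1], best[2], best[3]
```

With hypothetical WAS-versus-pH data that rises to a peak at pH 7.8 and then falls, the scan recovers the breakpoint and the opposite-signed slopes on either side.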