Soil structural degradation and nutrient limitations across land use categories and climatic zones in Southern Africa
Although soil degradation is a major threat to food security and carbon sequestration, our knowledge of the spatial extent of the problem and its drivers is very limited in Southern Africa. Therefore, this study aimed to quantify the risk of soil structural degradation and determine the variation in soil stoichiometry and nutrient limitations with land use categories (LUCs) and climatic zones. Using data on soil clay, silt, organic carbon (SOC), total nitrogen (N), available phosphorus (P), and sulfur (S) concentrations collected from 4,468 plots on 29 sites across Angola, Botswana, Malawi, Mozambique, Zambia and Zimbabwe, this study presents novel insights into the variations in soil structural degradation and nutrient limitations. The analysis revealed strikingly consistent stoichiometric coupling of total N, P, and S concentrations with SOC across LUCs. The only exception was on cropland, where available P was decoupled from SOC. Across sample plots, the probability (φ) of severe soil structural degradation was 0.52. The probability of SOC concentrations falling below the critical value of 1.5% was 0.49. The probabilities of soil total N, available P, and S concentrations falling below their critical values were 0.95, 0.70, and 0.83, respectively. N limitation occurred with greater probability in woodland (φ = 0.99) and forestland (φ = 0.97) than in cropland (φ = 0.92) and grassland (φ = 0.90) soils. It is concluded that soil structural degradation, low SOC concentrations, and N and S limitations are widespread across Southern Africa. Therefore, significant changes in land management policies and practices are needed to reverse soil structural degradation and increase soil carbon storage.
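The abstract does not state how these threshold probabilities were estimated; as a minimal sketch, assuming φ is simply the empirical fraction of plots below a critical value, the snippet below computes it for the 1.5% SOC threshold. The SOC values are hypothetical placeholders, not the study's data.

```python
import numpy as np

# Hypothetical SOC concentrations (%) for a handful of plots; the study's
# actual measurements come from 4,468 plots across 29 sites.
soc = np.array([0.8, 1.2, 2.1, 0.9, 1.7, 1.4, 3.0, 1.1])

SOC_CRITICAL = 1.5  # critical SOC concentration (%) cited in the abstract

# Empirical probability that SOC falls below the critical value:
# the fraction of plots whose concentration is under the threshold.
phi = np.mean(soc < SOC_CRITICAL)
print(f"P(SOC < {SOC_CRITICAL}%) = {phi:.2f}")
```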
Spatial Variation in Tree Density and Estimated Aboveground Carbon Stocks in Southern Africa
Variability in woody plant species, vegetation assemblages and anthropogenic activities derails efforts to develop common approaches for estimating biomass and carbon stocks in Africa. In order to suggest management options, it is important to understand the vegetation dynamics and the major drivers governing the observed conditions. This study uses data from 29 sentinel landscapes (4640 plots) across southern Africa. We used the T-square distance method to sample trees. Allometric models were used to estimate aboveground tree biomass, from which aboveground biomass carbon stock (AGBCS) was derived for each site. Results show an average tree density of 502 trees·ha−1, with semi-arid areas having the highest (682 trees·ha−1) and arid regions the lowest (393 trees·ha−1). The overall AGBCS was 56.4 Mg·ha−1; however, significant site-to-site variability existed across the region. An over 60-fold difference was noted between the lowest AGBCS (2.2 Mg·ha−1) in the Musungwa plains of Zambia and the highest (138.1 Mg·ha−1) in the scrublands of Kenilworth in Zimbabwe. Semi-arid and humid sites had higher carbon stocks than sites in sub-humid and arid regions. Anthropogenic activities also influenced the observed carbon stocks. Repeated measurements would reveal future trends in tree cover and carbon stocks across different systems.
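The study's exact estimators and allometric models are not given in the abstract. As a hedged sketch of the general workflow, the code below combines Byth's (1982) compound T-square density estimator with a generic pantropical allometric model (Chave et al. 2014, Model 4) and an assumed carbon fraction of 0.47; all distances and tree measurements are hypothetical, and the paper may well have used different, locally calibrated allometries.

```python
import math

def t_square_density(x, z):
    """Byth (1982) compound T-square density estimator.

    x: point-to-nearest-tree distances (m)
    z: T-square tree-to-neighbour distances (m)
    Returns estimated density in trees per square metre.
    """
    n = len(x)
    return n ** 2 / (2.0 * sum(x) * math.sqrt(2.0) * sum(z))

def agb_chave2014(wood_density, dbh, height):
    """Generic pantropical aboveground biomass model (Chave et al. 2014):
    AGB (kg) = 0.0673 * (rho * D^2 * H)^0.976,
    with rho in g/cm^3, D (dbh) in cm and H in m."""
    return 0.0673 * (wood_density * dbh ** 2 * height) ** 0.976

# Hypothetical field measurements from a few sample points and trees.
x = [3.1, 4.5, 2.8, 5.2]   # m, random point to nearest tree
z = [4.0, 3.6, 5.1, 4.4]   # m, T-square neighbour distances
density_ha = t_square_density(x, z) * 10_000  # trees per hectare

trees = [(0.60, 25.0, 12.0), (0.55, 18.0, 9.0)]  # (rho, dbh, height)
mean_agb_kg = sum(agb_chave2014(*t) for t in trees) / len(trees)

# AGBCS: per-hectare biomass carbon, assuming a carbon fraction of 0.47.
agbcs_mg_ha = density_ha * mean_agb_kg * 0.47 / 1000.0
print(f"{density_ha:.0f} trees/ha, AGBCS ≈ {agbcs_mg_ha:.1f} Mg/ha")
```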
Effect of combining organic manure and inorganic fertilisers on maize–bush bean intercropping
In sub-Saharan Africa (SSA), farmers intercrop common beans with maize but apply inorganic or organic fertilisers targeting only the maize. The effects of this practice on bush bean yield have not been fully evaluated with respect to input use and compatibility when intercropped with maize. An on-farm trial managed by smallholder community members was conducted to assess the influence of various soil fertility management options and cropping systems on the yield of two bush bean genotypes (SER45 and SER83) in two agro-ecological zones of Malawi. The farmer-managed trials were laid out in a split-plot design, with the bean genotypes as main plots and combinations of soil fertility management options (i.e., no input, manure, fertiliser and fertiliser + manure) and cropping systems (i.e., sole crop and intercrop) as subplots. The trials were affected by terminal drought and dry spells, but the results show that manure and fertiliser application enhanced the resilience of the drought-tolerant bean genotypes. The genotype SER45 was responsive to manure application in the sole crop, giving a 44.4% yield increase over no-manure application. In sole cropping with fertiliser plus manure, bean yields improved by 40.1% for SER45 and 78.3% for SER83 relative to the no-input control. Although sole cropping gave higher bean yields, the manure plus fertiliser treatment produced higher land equivalent ratios in the intercrop (1.54 for SER45 and 1.32 for SER83) than sole cropping. These results show that, under smallholder farmer management, the climate adaptability of bush bean genotypes could be enhanced by the combined application of organic and inorganic fertilisers in a maize–bean intercrop. The combined application also enhances whole-farm productivity of the common maize–bean intercrop relative to monocropping, and is hence of benefit to most low-input smallholder farmers of SSA.
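The land equivalent ratio (LER) quoted above is the sum, over the component crops, of each crop's intercrop yield divided by its sole-crop yield; an LER above 1 means the intercrop uses land more efficiently than growing the crops separately. A minimal sketch with hypothetical yields (the trial's raw yields are not reported in the abstract):

```python
def land_equivalent_ratio(intercrop_yields, sole_yields):
    """LER = sum_i (intercrop yield_i / sole-crop yield_i).
    LER > 1: the intercrop needs less land for the same total output."""
    return sum(yi / ys for yi, ys in zip(intercrop_yields, sole_yields))

# Hypothetical maize and bean yields (t/ha) under manure + fertiliser.
ler = land_equivalent_ratio(intercrop_yields=[3.5, 0.7],
                            sole_yields=[4.0, 1.0])
print(f"LER = {ler:.2f}")  # 1.58: sole crops would need ~58% more land
```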
Characteristics and outcomes of adult Ethiopian patients enrolled in HIV care and treatment: a multi-clinic observational study
Background: We describe trends in characteristics and outcomes among adults initiating HIV care and treatment in Ethiopia from 2006 to 2011. Methods: We conducted a retrospective longitudinal analysis of HIV-positive adults (≥15 years) enrolling at 56 Ethiopian health facilities from 2006 to 2011. We investigated trends over time in the proportion enrolling through provider-initiated counseling and testing (PITC), baseline CD4+ cell counts and WHO stage. Additionally, we assessed outcomes (recorded death, loss to follow-up (LTF), transfer, and total attrition (recorded death plus LTF)) before and after ART initiation. Kaplan-Meier techniques estimated the cumulative incidence of these outcomes through 36 months after ART initiation. Factors associated with LTF and death after ART initiation were assessed using hazard ratios, accounting for within-clinic correlation. Results: 93,418 adults enrolled into HIV care; 53,300 (57%) initiated ART. The proportion enrolled through PITC increased from 27.6% (2006–2007) to 44.8% (2010–2011) (p < .0001). Concurrently, median enrollment CD4+ cell count increased from 158 to 208 cells/mm3 (p < .0001), and the proportion of patients initiating ART with advanced WHO stage decreased from 56.6% (stage III) and 15.0% (stage IV) in 2006–2007 to 47.6% (stage III) and 8.5% (stage IV) in 2010–2011. Median CD4+ cell count at ART initiation remained stable over time. 24% of patients were LTF before ART initiation. Among those initiating ART, attrition was 30% after 36 months, with most occurring within the first 6 months. Recorded death after ART initiation was 6.4% and 9.2% at 6 and 36 months, respectively, and decreased over time. Younger age, male gender, never being married, no formal education, low CD4+ cell count, and advanced WHO stage were associated with increased LTF. Recorded death was lower among younger adults, females, married individuals, and those with higher CD4+ cell counts and lower WHO stage at ART initiation. Conclusions: Over time, enrollment in HIV care through outpatient PITC increased, and patients enrolled into HIV care at earlier disease stages across all HIV testing points. However, median CD4+ cell count at ART initiation remained steady. Pre- and post-ART attrition (particularly in the first 6 months) have remained major challenges in ensuring prompt ART initiation and retention on ART.
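The abstract says Kaplan-Meier techniques estimated cumulative incidence through 36 months. As a minimal sketch (not the study's code), the function below computes a Kaplan-Meier cumulative-incidence estimate from right-censored follow-up data; the example times and event flags are hypothetical.

```python
import numpy as np

def km_cumulative_incidence(time, event, horizon):
    """Kaplan-Meier cumulative incidence (1 - survival) at `horizon`.

    time:  follow-up in months since ART initiation
    event: 1 if the outcome (e.g. attrition) occurred, 0 if censored
    """
    time, event = np.asarray(time, float), np.asarray(event, int)
    # Sort by time, processing events before censorings at tied times.
    order = np.lexsort((1 - event, time))
    time, event = time[order], event[order]
    at_risk, surv = len(time), 1.0
    for t, e in zip(time, event):
        if t > horizon:
            break
        if e:             # each event shrinks the survival curve
            surv *= 1.0 - 1.0 / at_risk
        at_risk -= 1      # events and censorings both leave the risk set
    return 1.0 - surv

# Hypothetical follow-up (months) and attrition flags for 8 patients.
t = [2, 5, 6, 6, 12, 30, 36, 36]
e = [1, 0, 1, 1, 0,  1,  0,  0]
print(f"Attrition by 36 months ≈ {km_cumulative_incidence(t, e, 36):.0%}")
```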
Rapid Scale-Up of Antiretroviral Treatment in Ethiopia: Successes and System-Wide Effects
Yibeltal Assefa and colleagues describe the successes and challenges of the scale-up of antiretroviral treatment across Ethiopia, including its impact on other health programs and the country's human resources for health.
Burden of malaria among adult patients attending general medical outpatient department and HIV care and treatment clinics in Oromia, Ethiopia: a comparative cross-sectional study
Background
Malaria and HIV/AIDS constitute major public health problems in Ethiopia, but the burden associated with malaria-HIV co-infection has not been well documented. In this study, the burden of malaria among HIV-positive and HIV-negative adult outpatients attending health facilities in Oromia National Regional State, Ethiopia was investigated.
Methods
A comparative cross-sectional study was conducted from September to November 2011, during the peak malaria transmission season, among HIV-positive patients making routine follow-up visits at HIV care and treatment clinics and HIV-seronegative patients attending the general medical outpatient departments in 12 health facilities. A total of 3638 patients (1819 from each group) were enrolled in the study. Provider-initiated HIV testing and counseling was performed for 1831 medical outpatients, of whom 1819 were negative and were enrolled into the study. Malaria blood microscopy and hemoglobin testing were performed for all 3638 patients. Data were analyzed using descriptive statistics, chi-square tests and multivariate logistic regression.
Results
Of the 3638 patients enrolled in the study, malaria parasitaemia was detected in 156 (4.3%); malaria parasitaemia prevalence was 0.7% (13/1819) among HIV-seropositive patients and 7.9% (143/1819) among HIV-seronegative patients. Among HIV-seropositive individuals, 65.4% slept under a mosquito bed net the night before data collection, compared to 59.4% of HIV-seronegative individuals. A significantly higher proportion of HIV-seropositive malaria-negative patients were on co-trimoxazole (CTX) prophylaxis compared with HIV-malaria co-infected patients: 82% (1481/1806) versus 46% (6/13) (P = 0.001). HIV-malaria co-infected patients were less likely to have the classical symptoms of malaria (fever, chills and headache) than their HIV-seronegative, malaria-positive counterparts. Multivariate logistic regression showed that HIV-seropositive patients attending routine follow-up visits were less likely to be infected by malaria (OR = 0.23, 95% CI = 0.09–0.74; see the sketch after this abstract for a crude odds-ratio computation from the raw counts).
Conclusion
The study documented lower malaria prevalence among HIV-seropositive patients attending routine follow-up visits. Clinical symptoms of malaria were more pronounced among HIV-seronegative than HIV-seropositive patients. This study also re-affirmed the importance of co-trimoxazole in preventing malaria symptoms and parasitaemia among HIV-positive patients.
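The OR = 0.23 reported above is adjusted for covariates in the multivariate model, which cannot be reproduced from the abstract alone. As a hedged sketch, a crude (unadjusted) odds ratio with a Woolf-type 95% CI can be computed directly from the reported 2x2 counts; it differs from the adjusted OR precisely because the latter accounts for covariates.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio with a Woolf (log-based) confidence interval.

    2x2 table: a = exposed cases,   b = exposed non-cases,
               c = unexposed cases, d = unexposed non-cases
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of ln(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Counts from the abstract: malaria parasitaemia in 13/1819 HIV-positive
# and 143/1819 HIV-negative patients.
or_, lo, hi = odds_ratio_ci(13, 1819 - 13, 143, 1819 - 143)
print(f"Crude OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```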
Behavioural intention and factors influencing intention of Ethiopian mothers to expose infants to sunshine
Ethiop. J. Health Dev. 2002;16(1):31-40
Drug susceptibility pattern of bacterial isolates from children with chronic suppurative otitis media
Background: In Ethiopia, as in most developing countries, there is little information on chronic suppurative otitis media (CSOM) in children. Awareness of the seriousness of the condition is limited among caretakers. Treatment is largely empirical owing to a lack of microbiological data.
Objectives: The study was conducted to identify bacterial isolates associated with chronic suppurative otitis media in children and to determine their antibiotic susceptibility patterns.
Methods: Children consecutively seen for chronic otitis media at the Department of Pediatrics and Child Health of Tikur Anbessa Hospital during January-May 2000 were included in the study. Clinical and demographic data were collected using a pre-formed questionnaire. Ear swabs were collected, transported, and cultured using standard methods. Biochemical tests were used to identify Gram-negative bacteria. All isolates were tested for their susceptibility to different antibiotics.
Results: A total of 158 bacterial agents were isolated from 120 ear swabs collected from 112 patients aged between 3 months and 12 years. The most frequent isolates were Proteus species (31%), Staphylococcus aureus (18%), Escherichia coli (16%), Klebsiella species (12%), and Pseudomonas species (6%). Most of the isolates were resistant to commonly used antibiotics but sensitive to kanamycin (72%), Augmentin (84%) and gentamicin (88%). Triple antibiotic resistance was the most common resistance pattern.
Conclusions: The high rate of multiple drug resistance, particularly to cheap and frequently used antibiotics, raises serious concern. More comprehensive studies are required to define the true magnitude of CSOM, determine the microbiological profile of isolates and produce data for policy decisions on optimal intervention modalities.
(Ethiopian Journal of Health Development, 2001, 15(2): 89-96)
Replication Data for: Spatial Variation in Tree Density and Estimated Aboveground Carbon Stocks in Southern Africa
Replication dataset for the paper that uses a large dataset to analyse the spatial variability of aboveground biomass carbon.