
    Impact of African Traditional Termite Control Methods on Conservation of Biodiversity: A Review

    Termites cause a wide array of damage to plantations, trees and man-made structures. Both industrial and food crops, as well as forest trees, are attacked by termites, and economic losses caused by termite infestations have been reported worldwide. Consequently, the need to eliminate and control termites has risen, especially among farmers and property owners seeking to secure their produce and property. The prevalence of poverty in Africa limits access to chemical control methods, which are expensive and require expertise to use, and hence are unaffordable. In response, African communities have devised and applied traditional methods to eliminate or control termites in farmlands and homes. However, these methods have not been scientifically validated and the extent of their efficacy is still questionable. On the other hand, they are thought to be environmentally friendly and safer for biodiversity conservation than conventional methods of termite control. To curb the loss of biodiversity experienced globally, sustainable methods of termite control should be employed. This loss is usually accelerated by the continuous clearing of natural vegetation to accommodate an ever-growing population, particularly in Africa. Depending on the methods used, biodiversity can either be conserved or lost in the process. This review therefore explores the traditional uses of termites in Africa and elaborates on the common traditional methods used to control them. It also details the advantages of these methods over conventional methods for biodiversity and environmental conservation. Key words: Termites, Biodiversity, Africa, Traditional

    Effect of an opt-out point-of-care HIV-1 nucleic acid testing intervention to detect acute and prevalent HIV infection in symptomatic adult outpatients and reduce HIV transmission in Kenya: a randomized controlled trial

    Background: In sub-Saharan Africa, adult outpatients with symptoms of acute infectious illness are not routinely tested for prevalent or acute HIV infection (AHI) when seeking healthcare. Methods: Adult symptomatic outpatients aged 18–39 years were evaluated by a consensus AHI risk score. Patients with a risk score ≄2 and no previous HIV diagnosis were enrolled in a stepped-wedge trial of opt-out delivery of point-of-care (POC) HIV-1 nucleic acid testing (NAAT), compared with standard provider-initiated HIV testing using rapid tests in the observation period. The primary outcome was the number of new diagnoses in each study period. Generalized estimating equations with a log-binomial link and robust variance estimates were used to account for clustering by health facility. The trial is registered with ClinicalTrials.gov NCT03508908. Results: Between 2017 and 2020, 13 (0.9%) out of 1374 participants in the observation period and 37 (2.5%) out of 1500 participants in the intervention period were diagnosed with HIV infection. Of the 37 newly diagnosed cases in the intervention period, two (5.4%) had AHI. Participants in the opt-out intervention had two-fold greater odds of being diagnosed with HIV (odds ratio = 2.2, 95% confidence interval: 1.39–3.51) after adjustment for factors imbalanced across study periods. Conclusions: Among symptomatic adults aged 18–39 years targeted by our POC NAAT intervention, we identified one chronic HIV infection for every 40 patients and one AHI patient for every 750 patients tested. Although AHI yield was low in this population, routinely offered opt-out testing could diagnose twice as many patients as an approach relying on provider discretion.
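The yields quoted in the conclusion follow directly from the counts reported in the results; a quick arithmetic check using only the abstract's figures:

```python
# Counts reported for the intervention period of the trial abstract.
tested = 1500          # participants tested in the intervention period
new_diagnoses = 37     # newly diagnosed HIV infections
acute = 2              # of which acute HIV infections (AHI)

# Chronic infections = all new diagnoses minus the acute cases.
chronic = new_diagnoses - acute

# "One chronic infection for every ~40 patients tested" and
# "one AHI patient for every 750 patients tested".
patients_per_chronic = tested / chronic   # ~43, reported as roughly 1 in 40
patients_per_acute = tested / acute       # exactly 750

print(round(patients_per_chronic), round(patients_per_acute))
```

The chronic-infection yield works out to about 1 in 43, consistent with the abstract's rounded "1 in 40".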

    Farmers' knowledge and perceptions of the stunting disease of Napier grass in Western Kenya

    Production of Napier grass, Pennisetum purpureum, the most important livestock fodder in Western Kenya, is severely constrained by Napier stunt (Ns) disease. Understanding farmers' knowledge, perceptions and practices is a prerequisite to establishing an effective disease management approach. Using a random sample of 150 farmers drawn from Bungoma, Busia and Teso districts of Western Kenya, this study sought to: (i) assess farmers' perceptions and knowledge of Ns disease, including its effects on the smallholder dairy industry; (ii) record farmers' current practices in managing Ns disease; and (iii) identify Ns disease management challenges and intervention opportunities in order to develop an efficient integrated disease management approach. The majority (86.7%) of the farmers were aware of Ns disease and observed that it was spreading rapidly in the region, which was perfectly predicted by farmers' access to agricultural information (marginal effect = 0.164), indicating a need for extension platforms for knowledge sharing among the industry stakeholders. The disease had reduced Napier grass yields, so that most farmers could not feed their livestock on the amounts they produced and had to buy Napier grass. Those who relied on income from selling Napier grass earned less due to the loss in productivity. Milk production had fallen by over 35%. The cause of the disease was unknown to the farmers, and no effective disease management strategy was available to them. An integrated disease management approach needs to be developed to fit within the mixed farming systems, supported by simple decision aids.

    Assessing Tuberculosis Case Fatality Ratio: A Meta-Analysis

    Background: Recently, the tuberculosis (TB) Task Force Impact Measurement acknowledged the need to review the assumptions underlying the TB mortality estimates published annually by the World Health Organization (WHO). TB mortality is indirectly measured by multiplying estimated TB incidence with the estimated case fatality ratio (CFR). We conducted a meta-analysis to estimate the TB case fatality ratio in TB patients having initiated TB treatment. Methods: We searched for eligible studies in the PubMed and Embase databases through March 4th, 2011 and by reference listing of relevant review articles. Main analyses included the estimation of the pooled percentages of: a) TB patients dying due to TB after having initiated TB treatment and b) TB patients dying during TB treatment. Pooled percentages were estimated using random effects regression models on the combined patient population from all studies. Main Results: We identified 69 relevant studies, of which 22 provided data on mortality due to TB and 59 provided data on mortality during TB treatment. Among HIV-infected persons the pooled percentage of TB patients dying due to TB was 9.2% (95% Confidence Interval (CI): 3.7%-14.7%) and among HIV-uninfected persons 3.0% (95% CI: −1.2%-7.4%), based on the results of eight and three studies respectively providing data for these analyses. The pooled percentage of TB patients dying during TB treatment was 18.8% (95% CI: 14.8%-22.8%) among HIV-infected patients and 3.5% (95% CI: 2.0%-4.92%) among HIV-uninfected patients, based on the results of 27 and 19 studies respectively. Conclusion: The results of the literature review are useful in generating prior distributions of CFR in countries with vital registration systems and have contributed towards revised estimates of TB mortality. This literature review did not provide us with all data needed for a valid estimation of TB CFR in TB patients initiating TB treatment.
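The abstract pools study-level case fatality proportions under a random-effects model. A minimal sketch of one standard approach, DerSimonian-Laird pooling, using hypothetical study-level CFRs and variances (the paper's actual analysis was a random-effects regression on the combined patient population, so this is illustrative only):

```python
import math

def dersimonian_laird(estimates, variances):
    """Pool study-level estimates with DerSimonian-Laird random effects.

    estimates, variances: per-study point estimates (e.g. CFR proportions)
    and their within-study variances. Returns (pooled, (lo, hi)) with a
    normal-approximation 95% CI.
    """
    k = len(estimates)
    w = [1.0 / v for v in variances]                        # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, estimates)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, estimates))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                      # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]          # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_star, estimates)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical CFRs (8%, 12%, 5%) and variances for three studies:
pooled, ci = dersimonian_laird([0.08, 0.12, 0.05], [0.0004, 0.0009, 0.0002])
```

Note that with few, heterogeneous studies the normal-approximation CI can extend below zero for small proportions, which is consistent with the slightly negative lower bound quoted above for HIV-uninfected patients.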

    The cost‐effectiveness of prophylaxis strategies for individuals with advanced HIV starting treatment in Africa

    Introduction Many HIV‐positive individuals in Africa have advanced disease when initiating antiretroviral therapy (ART) and so have high risks of opportunistic infections and death. The REALITY trial found that an enhanced‐prophylaxis package including fluconazole reduced mortality by 27% in individuals starting ART with CD4 <100 cells/mm3. We investigated the cost‐effectiveness of this enhanced‐prophylaxis package versus other strategies, including using cryptococcal antigen (CrAg) testing, in individuals with CD4 <200 cells/mm3 or <100 cells/mm3 at ART initiation and in all individuals regardless of CD4 count. Methods The REALITY trial enrolled from June 2013 to April 2015. A decision‐analytic model was developed to estimate the cost‐effectiveness of six management strategies in individuals initiating ART in the REALITY trial countries. Strategies included standard‐prophylaxis; enhanced‐prophylaxis; standard‐prophylaxis with fluconazole; and three CrAg testing strategies: the first stratifying individuals to enhanced‐prophylaxis (CrAg‐positive) or standard‐prophylaxis (CrAg‐negative); the second to enhanced‐prophylaxis (CrAg‐positive) or enhanced‐prophylaxis without fluconazole (CrAg‐negative); and the third to standard‐prophylaxis with fluconazole (CrAg‐positive) or without fluconazole (CrAg‐negative). The model estimated costs, life‐years and quality‐adjusted life‐years (QALY) over 48 weeks using three competing mortality risks: cryptococcal meningitis; tuberculosis, serious bacterial infection or other known cause; and unknown cause.
Results Enhanced‐prophylaxis was cost‐effective at cost‐effectiveness thresholds of US$300 and US$500 per QALY, with an incremental cost‐effectiveness ratio (ICER) of US$157 per QALY in the CD4 <200 cells/mm3 population, provided enhanced‐prophylaxis components are sourced at the lowest available prices. The ICER reduced in more severely immunosuppressed individuals (US$113 per QALY in the CD4 <100 cells/mm3 population) and increased in all individuals regardless of CD4 count (US$722 per QALY). Results were sensitive to prices of the enhanced‐prophylaxis components. Enhanced‐prophylaxis was more effective and less costly than all CrAg testing strategies, as enhanced‐prophylaxis still conveyed health gains in CrAg‐negative patients and savings from targeting prophylaxis based on CrAg status did not compensate for the costs of CrAg testing. CrAg testing strategies did not become cost‐effective unless the price of CrAg testing fell below US$2.30. Conclusions The REALITY enhanced‐prophylaxis package in individuals with advanced HIV starting ART reduces morbidity and mortality, is practical to administer and is cost‐effective. Efforts should continue to ensure that components are accessed at the lowest available prices.
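The comparison above rests on the incremental cost-effectiveness ratio (ICER): the extra cost of a strategy divided by the extra QALYs it yields, judged against a willingness-to-pay threshold. A toy sketch with hypothetical per-patient figures (not the trial's actual model outputs):

```python
# Hypothetical per-patient cost and QALY figures, chosen only to
# illustrate the calculation (NOT the REALITY model's outputs):
cost_standard, qaly_standard = 120.0, 0.80   # standard prophylaxis
cost_enhanced, qaly_enhanced = 151.4, 1.00   # enhanced prophylaxis

# ICER = incremental cost per incremental QALY gained.
icer = (cost_enhanced - cost_standard) / (qaly_enhanced - qaly_standard)

# A strategy is deemed cost-effective when its ICER falls below the
# chosen willingness-to-pay threshold (e.g. US$300 or US$500 per QALY).
cost_effective_at_300 = icer < 300
```

With these assumed inputs the ICER is US$157 per QALY, below both thresholds, mirroring the structure (though not the inputs) of the analysis above.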

    Late Presentation With HIV in Africa: Phenotypes, Risk, and Risk Stratification in the REALITY Trial.

    This article has been accepted for publication in Clinical Infectious Diseases, published by Oxford University Press. Background: Severely immunocompromised human immunodeficiency virus (HIV)-infected individuals have high mortality shortly after starting antiretroviral therapy (ART). We investigated predictors of early mortality and "late presenter" phenotypes. Methods: The Reduction of EArly MortaLITY (REALITY) trial enrolled ART-naive adults and children ≄5 years of age with CD4 counts <100 cells/ÎŒL. Results: Among 1711 included participants, 203 (12%) died. Mortality was independently higher with older age; lower CD4 count, albumin, hemoglobin, and grip strength; presence of World Health Organization stage 3/4 weight loss, fever, or vomiting; and problems with mobility or self-care at baseline (all P < .04). Receiving enhanced antimicrobial prophylaxis independently reduced mortality (P = .02). Of five late-presenter phenotypes, Group 1 (n = 355) had the highest mortality (25%; median CD4 count, 28 cells/ÎŒL), with high symptom burden, weight loss, poor mobility, and low albumin and hemoglobin. Group 2 (n = 394; 11% mortality; 43 cells/ÎŒL) also had weight loss, with high white cell, platelet, and neutrophil counts suggesting underlying inflammation/infection. Group 3 (n = 218; 10% mortality) had low CD4 counts (27 cells/ÎŒL), but a low symptom burden and maintained fat mass. The remaining groups had 4%-6% mortality. Conclusions: Clinical and laboratory features identified groups with the highest mortality following ART initiation. A screening tool could identify patients with low CD4 counts for prioritizing same-day ART initiation, enhanced prophylaxis, and intensive follow-up. Clinical Trials Registration: ISRCTN43622374. REALITY was funded by the Joint Global Health Trials Scheme (JGHTS) of the UK Department for International Development, the Wellcome Trust, and the Medical Research Council (MRC) (grant number G1100693).
Additional funding support was provided by the PENTA Foundation and core support to the MRC Clinical Trials Unit at University College London (grant numbers MC_UU_12023/23 and MC_UU_12023/26). Cipla Ltd, Gilead Sciences, ViiV Healthcare/GlaxoSmithKline, and Merck Sharp & Dohme donated drugs for REALITY, and ready-to-use supplementary food was purchased from Valid International. A. J. P. is funded by the Wellcome Trust (grant number 108065/Z/15/Z). J. A. B. is funded by the JGHTS (grant number MR/M007367/1). The Malawi-Liverpool–Wellcome Trust Clinical Research Programme, University of Malawi College of Medicine (grant number 101113/Z/13/Z) and the Kenya Medical Research Institute (KEMRI)/Wellcome Trust Research Programme, Kilifi (grant number 203077/Z/16/Z) are supported by strategic awards from the Wellcome Trust, United Kingdom. Permission to publish was granted by the Director of KEMRI. This supplement was supported by funds from the Bill & Melinda Gates Foundation.

    Changes in hydro-meteorological conditions over tropical West Africa (1980–2015) and links to global climate

    © 2018 Elsevier B.V. The role of global sea surface temperature (SST) anomalies in modulating rainfall in the African region has been widely studied and is now less debated. However, their impacts on, and links to, terrestrial water storage (TWS) in general have not been studied. This study presents pioneering results of canonical correlation analysis (CCA) of TWS derived from both global reanalysis data (1980–2015) and GRACE (Gravity Recovery and Climate Experiment) observations (2002–2014) with SST fields. The main issues discussed include: (i) oceanic hot spots that impact on TWS over tropical West Africa (TWA) based on CCA; (ii) long-term changes in model and global reanalysis data (soil moisture, TWS, and groundwater) and the influence of climate variability on these hydrological indicators; and (iii) the hydrological characteristics of the Equatorial region of Africa (i.e., the Congo basin) based on GRACE-derived TWS, river discharge, and precipitation. Results of the CCA diagnostics show that El Niño-Southern Oscillation-related equatorial Pacific SST fluctuations, a major index of climate variability identified in the main portion of the CCA procedure, show a significant association with long-term TWS reanalysis data over TWA (r = 0.50, p < 0.05). Based on Mann-Kendall statistics, the study found fairly large long-term declines (p < 0.05) in TWS and soil moisture (1982–2015), mostly over the Congo basin, which coincided with warming of the land surface and the surrounding oceans. Meanwhile, some parts of the Sahel show significant wetting (rainfall, soil moisture, groundwater, and TWS) trends during the same period (1982–2015), which aligns with the ongoing narrative of rainfall recovery in the region. Results of singular spectrum analysis and regression confirm that multi-annual changes in the Congo River discharge explain a considerable proportion of the variability in the GRACE hydrological signal over the Congo basin (r = 0.86 and RÂČ = 0.70, p < 0.05). Finally, leading orthogonal modes of MERRA and GRACE-TWS over TWA show significant association with global SST anomalies.
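The long-term trend results above rely on Mann-Kendall statistics. A minimal sketch of the classic (tie-free) form of the test, applied to a toy declining series; real hydrological series would also need tie and autocorrelation corrections:

```python
import math

def mann_kendall(series):
    """Classic Mann-Kendall trend test (no tie correction).

    Returns (S, z): S > 0 suggests an increasing trend, S < 0 a decline;
    |z| > 1.96 indicates significance at p < 0.05 (two-sided).
    """
    n = len(series)
    # S sums the signs of all pairwise forward differences.
    s = sum(
        (series[j] > series[i]) - (series[j] < series[i])
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18.0   # variance of S under H0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)          # continuity correction
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# Toy example: a steadily declining 10-point series.
s, z = mann_kendall([10.0, 9.5, 9.1, 8.7, 8.0, 7.6, 7.1, 6.4, 6.0, 5.5])
```

For this strictly decreasing series every pair contributes −1, so S = −45 and z falls well below −1.96, a significant decline.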

    Consistency of agricultural drought characterization over Upper Greater Horn of Africa (1982–2013): Topographical, gauge density, and model forcing influence.

    The negative impact of the Upper Greater Horn of Africa's (UGHA) complex topography on drought characterization, exacerbated by gauge density and model forcing parameters, has not been investigated. To fill this gap, this study employs a combination of remotely sensed, in situ, and model products (1982–2013): precipitation (CHIRPS, GPCC, and CHIRP), soil moisture (ERA-Interim, MERRA-2, CPC, GLDAS, and FLDAS), the vegetation condition index (VCI), and total water storage products (GRACE and MERRA-2), to (i) characterize drought, (ii) explore the inconsistencies in areas under drought due to topographical variations, gauge density, and model forcing parameters, and (iii) assess the effectiveness of various drought indicators over Ethiopia (a selected UGHA region with unique topographical variation). A 3-month time scale that sufficiently captures agricultural drought is employed to provide an indirect link to the food security situation in this rain-dependent region. The spatio-temporal drought patterns across all the products are found to depend on the topography of the region. At the same time, the inconsistencies in characterizing drought are found to be driven mainly by topographical variability (directly) and gauge density (inversely) for precipitation products, while for soil moisture products, precipitation forcing parameters play a major role. In addition, the inconsistencies are found to be higher under extreme and moderate droughts than under severe droughts. The mean differences in the percentage of areas under drought and under different drought intensities over the region average 15.87% and 6.16% (precipitation products) and 12.65% and 5.20% (soil moisture products), respectively. On the effectiveness of the various indicators, the following were found to be most suitable over Ethiopia for the period under study: VCI, GPCC, ERA, CPC, and FLDAS. These results are critical in putting into perspective drought analysis outcomes from the various products.

    Assessing multi-satellite remote sensing, reanalysis, and land surface models’ products in characterizing agricultural drought in East Africa

    © 2017 Elsevier Inc. Heavy reliance of East Africa (EA) on rain-fed agriculture makes it vulnerable to drought-induced famine. Yet most research on EA drought focuses on meteorological aspects, with little attention paid to agricultural drought impacts. The inadequacy of in-situ rainfall data across EA has also hampered detailed agricultural drought impact analysis. Recently, however, there has been increased data availability from remote sensing (rainfall, vegetation condition index – VCI, terrestrial water storage – TWS), reanalysis (soil moisture and TWS), and land surface models (soil moisture). Here, these products were employed to characterise EA droughts between 1983 and 2013 in terms of severity, duration, and spatial extent. Furthermore, the capability of these products to capture agricultural drought impacts was assessed using maize and wheat production data. Our results show that while all products were similar in drought characterisation in dry areas, the similarity of CHIRPS and GPCC extended over the whole of EA. CHIRPS and GPCC also identified the highest proportion of areas under drought, followed closely by the soil moisture products, whereas VCI had the least coverage. Drought onset was marked first by a decline in or lack of rainfall, followed by VCI/soil moisture, and then TWS. VCI indicated drought at a lag of 0–4 months following rainfall, while soil moisture and TWS products had variable lags vis-à-vis rainfall. GLDAS mischaracterised the 2005–2006 drought vis-à-vis the other soil moisture products. Based on the annual crop production variabilities explained, we identified CHIRPS, GPCC, FLDAS, and VCI as suitable for agricultural drought monitoring/characterisation in the region for the study period. Finally, GLDAS explained the lowest percentages of the Kenyan and Ugandan annual crop production variances. These findings are important for the gauge-data-deficient EA region as they provide alternatives for monitoring agricultural drought.
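The vegetation condition index used in these drought studies is, in Kogan's widely used formulation, a simple rescaling of NDVI against its historical range for the same pixel and period; a minimal sketch:

```python
def vegetation_condition_index(ndvi, ndvi_min, ndvi_max):
    """Vegetation Condition Index (Kogan): scales the current NDVI against
    the historical min/max for the same pixel and time of year, yielding a
    value from 0 (historically worst vegetation state) to 100 (best).
    """
    return 100.0 * (ndvi - ndvi_min) / (ndvi_max - ndvi_min)

# Toy pixel: historical NDVI range 0.2-0.8, current observation 0.35.
vci = vegetation_condition_index(0.35, 0.2, 0.8)   # approximately 25
# Values below roughly 35 are commonly read as drought-stressed vegetation.
```

Because VCI responds to vegetation state rather than rainfall itself, it lags precipitation deficits, consistent with the 0–4 month lag reported above.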