186 research outputs found

    HIV-associated anemia after 96 weeks on therapy: determinants across age ranges in Uganda and Zimbabwe.

    Given the detrimental effects of HIV-associated anemia on morbidity, we determined factors associated with anemia after 96 weeks of antiretroviral therapy (ART) across age groups. An HIV-positive cohort (n=3,580) of children aged 5-14 years, reproductive-age adults aged 18-49, and older adults aged ≥50, drawn from two randomized trials in Uganda and Zimbabwe, was evaluated from initiation of therapy through 96 weeks. We conducted logistic and multinomial regression to evaluate common and differential determinants of anemia at 96 weeks on therapy. Prior to initiation of ART, the prevalence of anemia (age 5-11 <10.5 g/dl, age 12-14 <11 g/dl, adult females <11 g/dl, adult males <12 g/dl) was 43%, which decreased to 13% at week 96 (p<0.001). Older adults had a significantly higher likelihood of anemia than reproductive-age adults (OR 2.60, 95% CI 1.44-4.70, p=0.002). Reproductive-age females had significantly higher odds of anemia than males at week 96 (OR 2.56, 95% CI 1.92-3.40, p<0.001), and in particular greater odds of microcytic anemia than males in the same age group (p=0.001). Other factors commonly associated with anemia included low body mass index (BMI) and microcytosis; greater increases in CD4 count to week 96 were protective. Thus, although ART significantly reduced the prevalence of anemia at 96 weeks, 13% of the population remained anemic. Specific groups, such as reproductive-age females and older adults, have greater odds of anemia, and this may guide clinicians to pursue further evaluation and management.
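
    A minimal sketch of the age- and sex-specific haemoglobin cut-offs quoted above, purely for illustration (the function name and structure are not from the paper):

```python
def is_anemic(hb_g_dl: float, age_years: int, sex: str) -> bool:
    """Classify anemia using the cut-offs quoted in the abstract:
    age 5-11 <10.5 g/dl, age 12-14 <11 g/dl,
    adult females <11 g/dl, adult males <12 g/dl."""
    if 5 <= age_years <= 11:
        return hb_g_dl < 10.5
    if 12 <= age_years <= 14:
        return hb_g_dl < 11.0
    # Adults (the cohort enrolled ages 18-49 and >=50)
    return hb_g_dl < (11.0 if sex.lower() == "female" else 12.0)
```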

    A mixed methods and triangulation model for increasing the accuracy of adherence and sexual behaviour data: the Microbicides Development Programme.

    Background: The collection of accurate data on adherence and sexual behaviour is crucial in microbicide (and other HIV-related) research. In the absence of a "gold standard", the collection of such data relies largely on participant self-reporting. After reviewing available methods, this paper describes a mixed method/triangulation model for generating more accurate data on adherence and sexual behaviour in a multi-centre vaginal microbicide clinical trial. In a companion paper, some of the results from this model are presented [1]. Methodology/Principal Findings: Data were collected from a random subsample of 725 women (7.7% of the trial population) using structured interviews, coital diaries, in-depth interviews, counting returned gel applicators, focus group discussions, and ethnography. The core of the model was a customised, semi-structured in-depth interview. There were two levels of triangulation: first, discrepancies between data from the questionnaires, diaries, in-depth interviews and applicator returns were identified, discussed with participants and, to a large extent, resolved; second, results from individual participants were related to more general data emerging from the focus group discussions and ethnography. A democratic and equitable collaboration between clinical trialists and qualitative social scientists facilitated the success of the model, as did the preparatory studies preceding the trial. The process revealed some of the underlying assumptions and routinised practices in "clinical trial culture" that are potentially detrimental to the collection of accurate data, as well as some of the shortcomings of large qualitative studies, and pointed to some potential solutions. Conclusions/Significance: The integration of qualitative social science and the use of mixed methods and triangulation in clinical trials are feasible, and can reveal (and resolve) inaccuracies in data on adherence and sensitive behaviours, as well as illuminating aspects of "trial culture" that may also affect data accuracy.

    Cost effectiveness analysis of clinically driven versus routine laboratory monitoring of antiretroviral therapy in Uganda and Zimbabwe.

    BACKGROUND: Despite funding constraints for treatment programmes in Africa, the costs and economic consequences of routine laboratory monitoring for efficacy and toxicity of antiretroviral therapy (ART) have rarely been evaluated. METHODS: Cost-effectiveness analysis was conducted in the DART trial (ISRCTN13968779). Adults in Uganda/Zimbabwe starting ART were randomised to clinically-driven monitoring (CDM) or laboratory and clinical monitoring (LCM); individual patient data on healthcare resource utilisation and outcomes were valued with primary economic costs and utilities. Total costs of first/second-line ART, routine 12-weekly CD4 and biochemistry/haematology tests, additional diagnostic investigations, clinic visits, concomitant medications and hospitalisations were considered from the public healthcare sector perspective. A Markov model was used to extrapolate costs and benefits 20 years beyond the trial. RESULTS: 3316 (1660 LCM; 1656 CDM) symptomatic, immunosuppressed ART-naive adults (median (IQR) age 37 (32, 42); CD4 86 (31, 139) cells/mm³) were followed for a median of 4.9 years. LCM had a mean 0.112-year (41-day) survival benefit at an additional mean cost of $765 [95% CI: 685, 845], translating into an adjusted incremental cost of $7386 [3277, dominated] per life-year gained and $7793 [4442, 39179] per quality-adjusted life-year gained. Routine toxicity tests were prominent cost-drivers and had no benefit. With 12-weekly CD4 monitoring from year 2 on ART and low-cost second-line ART, but without toxicity monitoring, CD4 test costs need to fall below $3.78 to become cost-effective (<3x per-capita GDP, following WHO benchmarks). CD4 monitoring at current costs as undertaken in DART was not cost-effective in the long term. CONCLUSIONS: There is no rationale for routine toxicity monitoring, which did not affect outcomes and was costly. Even though beneficial, there is little justification for routine 12-weekly CD4 monitoring of ART at current test costs in low-income African countries. CD4 monitoring, restricted to the second year on ART onwards, could be cost-effective with lower-cost second-line therapy and the development of a cheaper, ideally point-of-care, CD4 test.
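
    The headline trial figures imply roughly the following crude incremental cost-effectiveness arithmetic; note that the published $7386 per life-year is an adjusted estimate, so this unadjusted ratio is illustrative only:

```python
# Crude incremental cost-effectiveness ratio (ICER) from the headline figures.
# The trial's published $7386/life-year is an *adjusted* estimate, so this
# simple ratio will not match it exactly.
incremental_cost_usd = 765.0       # additional mean cost of LCM vs CDM
incremental_life_years = 0.112     # mean survival benefit (about 41 days)

icer = incremental_cost_usd / incremental_life_years
print(f"Crude ICER: ${icer:,.0f} per life-year gained")  # roughly $6,830
```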

    Selection of cooking banana genotypes for yield and black Sigatoka resistance in different locations in Uganda

    It is imperative to systematically evaluate new banana genotypes in different locations before national release. This enables selection and recommendation of superior genotypes as new varieties for a wider range of environments. The objective of the present study was to select banana genotypes with stable and high performance for bunch yield and resistance to black Sigatoka leaf disease. Eleven cooking banana genotypes developed by the Uganda National Agricultural Research Organization in collaboration with Bioversity International, plus two check varieties, were evaluated in multi-location preliminary yield trials in Uganda. Data collected were analyzed using the Additive Main Effects and Multiplicative Interaction (AMMI) model, the AMMI Stability Value (ASV), and the Genotype Selection Index (GSI). Genotype × location interaction was significant for all the traits assessed. Most of the new genotypes had low interaction effects with locations for bunch yield (69.2%) and black Sigatoka (92.3%). The most stable genotypes for bunch yield were NABIO815, NABIO1117, NABIO216 and NABIO306, whereas the most stable for black Sigatoka resistance were NABIO1011, NABIO815, NABIO1009 and NABIO216. Using the GSI, which defines the most desirable genotypes as those combining high agronomic performance and stability across environments, four genotypes (NABIO306, NABIO1011, NABIO808 and NABIO1009) were selected. These genotypes, in addition to their high agronomic performance and stability, had soft and yellow fruit pulp on cooking, and will be advanced on-farm for further evaluation.
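
    For readers unfamiliar with the stability statistics used, the AMMI Stability Value (ASV) and Genotype Selection Index (GSI) are commonly computed as sketched below; the formulas follow their usual definitions rather than anything specific to this trial, and all inputs are placeholders:

```python
import math

def ammi_stability_value(ipca1_score: float, ipca2_score: float,
                         ss_ipca1: float, ss_ipca2: float) -> float:
    """ASV: distance from the origin in IPCA1/IPCA2 space, with the IPCA1
    axis weighted by its share of the interaction sum of squares."""
    weight = ss_ipca1 / ss_ipca2
    return math.sqrt((weight * ipca1_score) ** 2 + ipca2_score ** 2)

def genotype_selection_index(asv_rank: int, yield_rank: int) -> int:
    """GSI: sum of a genotype's ASV rank and mean-yield rank (lower is better)."""
    return asv_rank + yield_rank
```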

    Willingness to pay for quality traits and implications for sweetpotato variety breeding: case of Mozambique

    Despite decades of research and dissemination of improved sweetpotato varieties, uptake at scale remains low, and the envisaged development goals of food security and improved livelihoods remain elusive. This is despite the demonstrated impact of such technologies in combating food and nutrition insecurity amidst global challenges like climate change. Growing evidence shows that end-user acceptance of improved varieties is critical to their widespread adoption, and the inclusion of the heterogeneous preferences of diverse sets of end-users in the variety development process is therefore essential. With global changes in weather and consumption patterns, end-users are now demanding varieties that are more suitable to their unique consumption needs, production environments and new market demands, and that have desired processing characteristics. Such dynamics in demand have necessitated a rethinking of breeding programs, from the traditional focus on agronomic gains such as increased yields and yield protection to the consideration of more nuanced quality-related traits that appeal to targeted populations. Against this background, this study explores the decision-making behavior of Mozambican sweetpotato producers in variety selection and the implicit value placed on different sweetpotato traits, including the often ignored but crucial quality traits. The aim of the study is to identify the economic valuation of such traits and how they are traded off in variety selection decisions, to allow for prioritization in breeding efforts. To achieve this, an exploratory sequential, predominantly quantitative mixed-methods design was adopted. First, based on insights from a gender-disaggregated qualitative assessment among sweetpotato growers and consumers, and in consultation with breeding experts from Mozambique, the most preferred sweetpotato variety traits in the regions of study were established. These traits were then used in the design of a choice experiment implemented among 860 sweetpotato producers spread across four sweetpotato-growing regions of the country. Finally, a generalized multinomial logit model was used to estimate the implicit economic valuation of each of the considered traits, as well as heterogeneous valuation of such traits across gender, education and age groups of respondents. Results show that producers have a high preference for quality-related traits, with preference for Vitamin A content higher than that for drought tolerance, while dry matter content is valued about the same as drought tolerance. Although scoring significantly lower than Vitamin A, drought tolerance and dry matter content, other quality-related traits such as root size and sweet taste also have significant positive values, implying their importance in informing sweetpotato variety choice. In terms of gender heterogeneity, flesh color is highly valued among the women sub-sample. The study identifies Vitamin A, dry matter content, sweet taste, and medium-to-big root size, in that order, as the key preferred quality traits in Mozambique. The results imply that these quality traits should be pursued as a suite in breeding objectives, in combination with essential agronomic traits such as high yield and drought tolerance, for higher acceptance of and demand for improved sweetpotato varieties across the country.
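
    In choice-experiment studies of this kind, the implicit value (willingness to pay) of a trait is typically recovered as the ratio of the trait's utility coefficient to the price coefficient; the sketch below uses placeholder coefficients, not the study's estimates:

```python
def willingness_to_pay(beta_trait: float, beta_price: float) -> float:
    """Implicit value of a trait: WTP = -(trait coefficient) / (price coefficient)."""
    return -beta_trait / beta_price

# Placeholder coefficients for illustration only (not estimates from this study)
trait_betas = {"vitamin_a": 1.2, "drought_tolerance": 0.9, "dry_matter": 0.85}
beta_price = -0.004
for trait, beta in trait_betas.items():
    print(trait, round(willingness_to_pay(beta, beta_price), 1))
```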

    Monitoring changes in malaria epidemiology and effectiveness of interventions in Ethiopia and Uganda: Beyond Garki Project baseline survey.

    BACKGROUND: Scale-up of malaria interventions seems to have contributed to a decline in the disease, but other factors may also have played a role. Understanding changes in transmission and their determinant factors will help to adapt control strategies accordingly. METHODS: Four sites in Ethiopia and Uganda were set up to monitor epidemiological changes and the effectiveness of interventions over time. Here, results of a survey during the peak transmission season of 2012 are reported; these will be used as a baseline for subsequent surveys and may support adaptation of control strategies. Data on malariometric and entomological variables, socio-economic status (SES) and control coverage were collected. RESULTS: Malaria prevalence varied from 1.4 % in Guba (Ethiopia) to 9.9 % in Butemba (Uganda). The dominant species was Plasmodium vivax in Ethiopia and Plasmodium falciparum in Uganda. The majority of human-vector contact occurred indoors in Uganda, ranging from 83 % (Anopheles funestus sensu lato) to 93 % (Anopheles gambiae s.l.), an important factor for the effectiveness of insecticide-treated nets (ITNs) and indoor residual spraying (IRS). A high kdr-L1014S (resistance genotype) frequency was observed in A. gambiae sensu stricto in Uganda. Too few mosquitoes were collected in Ethiopia to assess vector habits and insecticide resistance levels. ITN ownership did not vary by SES; 56-98 % and 68-78 % of households owned at least one ITN in Ethiopia and Uganda, respectively. In Uganda, 7 % of nets were purchased by households, but these nets were untreated. In three of the four sites, 69-76 % of people with access to ITNs used them. IRS coverage ranged from 84 to 96 % in the three sprayed sites. Half of febrile children in Uganda, and three-quarters in Ethiopia, for whom treatment was sought received diagnostic tests. High levels of child undernutrition were detected in both countries, carrying important implications for child development. In Uganda, 7-8 % of pregnant women took the recommended minimum of three doses of intermittent preventive treatment. CONCLUSION: Malaria epidemiology appears to be changing compared with earlier published data, and more data are essential to understand how much of this change is attributable to interventions and other factors. Regular monitoring will help to better interpret changes, identify determinants, modify strategies and improve targeting to address transmission heterogeneity.

    Retrospective Review of Positive Newborn Screening Results for Isovaleric Acidemia and Development of a Strategy to Improve the Efficacy of Newborn Screening in the UK

    Since the UK commenced newborn screening for isovaleric acidemia in 2015, changes in prescribing have increased the incidence of false positive (FP) results due to pivaloylcarnitine. A review of screening results between 2015 and 2022 identified 24 true positive (TP) and 84 FP cases, with pivalate interference confirmed in 76/84. Initial C5 carnitine (C5C) did not discriminate between FP and TP cases, with median (range) C5C of 2.9 (2.0–9.6) and 4.0 (1.8–>70) µmol/L, respectively; neither did Precision Newborn Screening via Collaborative Laboratory Integrated Reports (CLIR), which identified only 1/47 FP cases. However, among the TP cases, disease severity correlated with initial C5C: ‘asymptomatic’ individuals (n = 17) had a median (range) C5C of 3.0 (1.8–7.1) µmol/L, whilst ‘clinically affected’ patients (n = 7) had a median (range) C5C of 13.9 (7.7–70) µmol/L. These findings allowed the introduction of dual cut-off values into the screening algorithm to reduce the incidence of FPs, with initial C5C results ≥5 µmol/L triggering urgent referral, and those >2.0 and <5.0 µmol/L prompting second-tier C5-isobar testing. This will avoid delayed referral in babies at particular risk whilst reducing the FP rate for the remainder.
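
    The dual cut-off triage described above can be summarised as a simple decision rule; a minimal sketch (the function and labels are illustrative, not part of the published algorithm):

```python
def triage_initial_c5_carnitine(c5c_umol_per_l: float) -> str:
    """Apply the dual cut-offs quoted in the abstract to an initial C5C result."""
    if c5c_umol_per_l >= 5.0:
        return "urgent referral"                # higher risk of clinically significant IVA
    if c5c_umol_per_l > 2.0:
        return "second-tier C5-isobar testing"  # distinguish pivaloylcarnitine interference
    return "screen negative"
```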