146 research outputs found

    The effect of AIDS defining conditions on immunological recovery among patients initiating antiretroviral therapy at Joint Clinical Research Centre, Uganda

    Background: Many HIV-infected patients only access health care once they have developed advanced symptomatic disease resulting from AIDS-defining conditions (ADCs). We carried out a study to establish the effect of ADCs on immunological recovery among patients initiated on antiretroviral therapy (ART). Methods: A retrospective cohort of 427 HIV-1 patients initiated on ART between January 2002 and December 2006 was studied. Data on ADCs were retrieved from the Joint Clinical Research Centre (JCRC) database and backed up by chart reviews. We used Kaplan-Meier survival curves to estimate the median time to gaining 50 CD4 cells/μl over the baseline value, taken to indicate a good immunological recovery process. Cox proportional hazards models were used for multivariate analysis. Results: The median time to gaining 50 CD4 cells/μl from the baseline value after ART initiation was longer in the ADC group (9.3 months) than in the non-ADC group (6.9 months) (log-rank test, p = 0.027). In multivariate analysis, after adjusting for age, sex, baseline CD4 count, baseline HIV viral load, total lymphocyte count and adherence level, factors that shortened the median time to immunological recovery after ART initiation were belonging to the non-ADC group (HR = 1.31; 95% CI: 1.03–1.28, p = 0.028), adherence to ART of ≥95% (HR = 2.22; 95% CI: 1.57–3.15, p = 0.001) and a total lymphocyte count ≥1200 cells/mm3 (HR = 1.84; 95% CI: 1.22–2.78, p = 0.003). A low baseline CD4 count of ≤200 cells/μl (HR = 0.52; 95% CI: 0.37–0.77, p = 0.001) was associated with a longer time to immunological recovery. There was no interaction between low CD4 counts and ADC group. Conclusion: Patients with ADCs take longer to regain their CD4 counts because of the underlying defect in the immune system, which may prolong their risk of morbidity and mortality.
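The "median time to gaining 50 CD4 cells/μl" reported above is the time at which the Kaplan-Meier survival curve first drops to 0.5. A minimal sketch of that estimator is below; the follow-up times and event indicators are hypothetical illustration data, not the JCRC cohort.

```python
# Minimal Kaplan-Meier median sketch (illustrative only).
# Data are hypothetical: follow-up times in months, with
# event = 1 if the patient gained 50 CD4 cells/ul, 0 if censored.

def kaplan_meier_median(times, events):
    """Return the first time at which the estimated survival
    function S(t) drops to 0.5 or below, or None if not reached."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    survival = 1.0
    i = 0
    while i < len(data):
        t = data[i][0]
        # events and total subjects (events + censored) at this time
        deaths = sum(1 for tt, e in data[i:] if tt == t and e)
        total_at_t = sum(1 for tt, _ in data[i:] if tt == t)
        if deaths:
            survival *= (at_risk - deaths) / at_risk
        if survival <= 0.5:
            return t
        at_risk -= total_at_t
        i += total_at_t
    return None

times = [3, 5, 6, 7, 8, 9, 10, 12, 14, 15]
events = [1, 1, 0, 1, 1, 1, 0, 1, 1, 0]
print(kaplan_meier_median(times, events))  # -> 9
```

Censored observations (event = 0) leave the survival curve unchanged but shrink the risk set, which is why they still affect later steps of the estimate.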

    CD4 T cell activation as a predictor for treatment failure in Ugandans with Plasmodium falciparum malaria.

    Host immunity plays an important role in the response to antimalarial therapy but is poorly understood. To test whether T cell activation is a risk factor for antimalarial treatment failure, we studied CD4+ and CD8+ T cell activation in 31 human immunodeficiency virus-negative Ugandan patients 5-37 years of age who were treated for uncomplicated Plasmodium falciparum malaria. Increased CD4+ T cell activation, as indicated by co-expression of HLA-DR and CD38, was an independent risk factor for treatment failure (hazard ratio = 2.45, 95% confidence interval = 1.02-5.89, P = 0.05) in multivariate analysis controlling for age, baseline temperature, and pre-treatment parasite density. The results provide insight into the role of cellular immunity in the response to antimalarial therapy and underscore the need to investigate the mechanisms behind immune activation.

    HIV-associated anemia after 96 weeks on therapy: determinants across age ranges in Uganda and Zimbabwe.

    Given the detrimental effects of HIV-associated anemia on morbidity, we determined factors associated with anemia after 96 weeks of antiretroviral therapy (ART) across age groups. An HIV-positive cohort (n=3,580) of children aged 5-14, reproductive-age adults aged 18-49, and older adults aged ≥50 years from two randomized trials in Uganda and Zimbabwe was evaluated from initiation of therapy through 96 weeks. We conducted logistic and multinomial regression to evaluate common and differential determinants of anemia at 96 weeks on therapy. Prior to initiation of ART, the prevalence of anemia (age 5-11: <10.5 g/dl; age 12-14: <11 g/dl; adult females: <11 g/dl; adult males: <12 g/dl) was 43%, which decreased to 13% at week 96 (p<0.001). Older adults had a significantly higher likelihood of anemia than reproductive-age adults (OR 2.60, 95% CI 1.44-4.70, p=0.002). Reproductive-age females had significantly higher odds of anemia than males at week 96 (OR 2.56, 95% CI 1.92-3.40, p<0.001), and in particular greater odds of microcytic anemia than males in the same age group (p=0.001). Other common factors associated with anemia included low body mass index (BMI) and microcytosis; greater increases in CD4 count to week 96 were protective. Thus, while ART significantly reduced the prevalence of anemia at 96 weeks, 13% of the population remained anemic. Specific groups, such as reproductive-age females and older adults, have greater odds of anemia; this may guide clinicians to pursue further evaluation and management.
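The odds ratios quoted above (e.g. OR 2.56 for anemia in reproductive-age women versus men) come from regression models, but the underlying quantity is easy to illustrate from a 2x2 table. The sketch below uses the standard Woolf log-odds formula for a 95% confidence interval; the counts are hypothetical, not the trial data.

```python
import math

def odds_ratio_ci(a, b, c, d):
    """Odds ratio with a Woolf 95% CI from a 2x2 table:
    a/b = anemic/non-anemic in the exposed group,
    c/d = anemic/non-anemic in the unexposed group."""
    odds_ratio = (a * d) / (b * c)
    # standard error of log(OR): sqrt of summed reciprocal cell counts
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(odds_ratio) - 1.96 * se)
    hi = math.exp(math.log(odds_ratio) + 1.96 * se)
    return odds_ratio, lo, hi

# Hypothetical counts: 80/320 anemic/non-anemic women, 30/300 men.
oratio, lo, hi = odds_ratio_ci(80, 320, 30, 300)
print(round(oratio, 2))  # -> 2.5
```

A multivariable logistic regression, as used in the study, adjusts such crude ratios for covariates like BMI and CD4 count; the crude 2x2 version is only the starting point.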

    Limited antigenic diversity of Plasmodium falciparum apical membrane antigen 1 supports the development of effective multi-allele vaccines

    Background: Polymorphism in antigens is a common mechanism for immune evasion used by many important pathogens, and presents major challenges in vaccine development. In malaria, many key immune targets and vaccine candidates show substantial polymorphism. However, knowledge of the antigenic diversity of key antigens, the impact of polymorphism on potential vaccine escape, and how sequence polymorphism relates to antigenic differences is very limited, yet crucial for vaccine development. Plasmodium falciparum apical membrane antigen 1 (AMA1) is an important target of naturally-acquired antibodies in malaria immunity and a leading vaccine candidate. However, AMA1 has extensive allelic diversity, with more than 60 polymorphic amino acid residues and more than 200 haplotypes in a single population. AMA1 therefore serves as an excellent model to assess antigenic diversity in malaria vaccine antigens and the feasibility of multi-allele vaccine approaches. While most previous research has focused on sequence diversity and antibody responses in laboratory animals, little has been done on the cross-reactivity of human antibodies. Methods: We aimed to determine the extent of antigenic diversity of AMA1, defined by reactivity with human antibodies, and to aid the identification of specific alleles for potential inclusion in a multi-allele vaccine. We developed an approach using a multiple-antigen-competition enzyme-linked immunosorbent assay (ELISA) to examine cross-reactivity of naturally-acquired antibodies in Papua New Guinea and Kenya, and related this to differences in AMA1 sequence. Results: We found that adults had greater antibody cross-reactivity than children, although the patterns of cross-reactivity to alleles were the same. Patterns of antibody cross-reactivity were very similar between populations (Papua New Guinea and Kenya) and over time. Further, our results show that the antigenic diversity of AMA1 alleles is surprisingly restricted, despite extensive sequence polymorphism. Our findings suggest that a combination of three different alleles, if selected appropriately, may be sufficient to cover the majority of antigenic diversity in polymorphic AMA1 antigens. Antigenic properties were not strongly related to existing haplotype groupings based on sequence analysis. Conclusions: The antigenic diversity of AMA1 is limited, and a vaccine including a small number of alleles might be sufficient for coverage against naturally-circulating strains, supporting a multi-allele approach to developing polymorphic antigens as malaria vaccines.

    Diminishing Availability of Publicly Funded Slots for Antiretroviral Initiation among HIV-Infected ART-Eligible Patients in Uganda

    Background: The impact of flat-line funding on the global scale-up of antiretroviral therapy (ART) for HIV-infected patients in Africa has not yet been well described. Methods: We evaluated ART-eligible patients and patients starting ART at a prototypical scale-up ART clinic in Mbarara, Uganda between April 1, 2009 and May 14, 2010, where four stakeholders sponsor treatment: two PEPFAR implementing organizations, the Ugandan Ministry of Health-Global Fund (MOH-GF), and a private foundation, the Family Treatment Fund (FTF). We assessed temporal trends in the number of eligible patients and the number starting ART, and tabulated the distribution of the stakeholders supporting ART initiation by month and by quartile of time during this interval. We used survival analyses to assess changes in the rate of ART initiation over calendar time. Findings: A total of 1309 patients eligible for ART made visits over the 14-month study period, and of these 819 started ART. The median number of ART-eligible patients each month was 88 (IQR: 74 to 115). By quartile of calendar time, PEPFAR and MOH-GF sponsored 290, 192, 180, and 49 ART initiations, whereas the FTF started 1, 2, 1 and 104 patients, respectively. By May 2010 (the last calendar month of observation), the FTF sponsored 88% of all ART initiations. Becoming eligible for ART in the 3rd (HR = 0.58, 95% CI: 0.45–0.74) and 4th quartiles (HR = 0.49, 95% CI: 0.36–0.65) was associated with delay in ART initiation compared to the first quartile in multivariable analyses. Interpretation: During a period of flat-line funding from multinational donors for ART programs, reductions in the number of ART initiations by public programs (i.e., PEPFAR and MOH-GF) and delays in ART initiation became apparent at a large prototypical scale-up ART clinic in Uganda.

    Cost effectiveness analysis of clinically driven versus routine laboratory monitoring of antiretroviral therapy in Uganda and Zimbabwe.

    BACKGROUND: Despite funding constraints for treatment programmes in Africa, the costs and economic consequences of routine laboratory monitoring for efficacy and toxicity of antiretroviral therapy (ART) have rarely been evaluated. METHODS: A cost-effectiveness analysis was conducted in the DART trial (ISRCTN13968779). Adults in Uganda/Zimbabwe starting ART were randomised to clinically-driven monitoring (CDM) or laboratory and clinical monitoring (LCM); individual patient data on healthcare resource utilisation and outcomes were valued with primary economic costs and utilities. Total costs of first/second-line ART, routine 12-weekly CD4 and biochemistry/haematology tests, additional diagnostic investigations, clinic visits, concomitant medications and hospitalisations were considered from the public healthcare sector perspective. A Markov model was used to extrapolate costs and benefits 20 years beyond the trial. RESULTS: 3316 (1660 LCM; 1656 CDM) symptomatic, immunosuppressed, ART-naive adults (median (IQR) age 37 (32, 42); CD4 86 (31, 139) cells/mm3) were followed for a median of 4.9 years. LCM had a mean 0.112-year (41-day) survival benefit at an additional mean cost of $765 [95% CI: $685, $845], translating into an adjusted incremental cost of $7386 [$3277, dominated] per life-year gained and $7793 [$4442, $39179] per quality-adjusted life-year gained. Routine toxicity tests were prominent cost-drivers and had no benefit. With 12-weekly CD4 monitoring from year 2 on ART and low-cost second-line ART, but without toxicity monitoring, CD4 test costs need to fall below $3.78 to become cost-effective (<3x per-capita GDP, following WHO benchmarks). CD4 monitoring at current costs as undertaken in DART was not cost-effective in the long term.
CONCLUSIONS: There is no rationale for routine toxicity monitoring, which did not affect outcomes and was costly. Although beneficial, routine 12-weekly CD4 monitoring of ART is hard to justify at current test costs in low-income African countries. CD4 monitoring restricted to the second year on ART onwards could be cost-effective with lower-cost second-line therapy and development of a cheaper, ideally point-of-care, CD4 test.
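The headline figure in this abstract is an incremental cost-effectiveness ratio (ICER): the cost difference between strategies divided by the effect difference. The sketch below redoes that arithmetic with the abstract's unadjusted numbers; note the trial's published $7386 per life-year is an adjusted estimate, so the crude ratio here is close to, but not the same as, the reported value.

```python
# Back-of-envelope ICER sketch using the abstract's headline inputs.
# ICER = (incremental cost) / (incremental effect).

delta_cost = 765      # additional mean cost of LCM vs CDM, USD
delta_effect = 0.112  # mean survival benefit in life-years (~41 days)

icer = delta_cost / delta_effect
print(round(icer))  # -> 6830 (USD per life-year, before adjustment)
```

The WHO benchmark mentioned in the abstract (cost-effective if the ICER is below roughly three times per-capita GDP) is then applied to ratios like this one to decide whether a monitoring strategy is worth funding.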

    The impact of different CD4 monitoring and switching strategies on mortality in HIV-infected African adults on antiretroviral therapy; an application of dynamic marginal structural models

    In Africa, antiretroviral therapy (ART) is delivered with limited laboratory monitoring, often none. In 2003–2004, investigators in the Development of Antiretroviral Therapy in Africa (DART) Trial randomized persons initiating ART in Uganda and Zimbabwe to either laboratory and clinical monitoring (LCM) or clinically driven monitoring (CDM). CD4 cell counts were measured every 12 weeks in both groups but were only returned to treating clinicians for management in the LCM group. Follow-up continued through 2008. In observational analyses, dynamic marginal structural models on pooled randomized groups were used to estimate survival under different monitoring-frequency and clinical/immunological switching strategies. Assumptions included no direct effect of randomized group on mortality or confounders, and no unmeasured confounders influencing both treatment switch and mortality, or both treatment switch and time-dependent covariates. After 48 weeks of first-line ART, 2,946 individuals contributed 11,351 person-years of follow-up, 625 switches, and 179 deaths. The estimated survival probability after a further 240 weeks for post-48-week switch at the first CD4 cell count less than 100 cells/mm3 or non-Candida World Health Organization stage 4 event (with CD4 count <250) was 0.96 (95% confidence interval (CI): 0.94, 0.97) with 12-weekly CD4 testing, 0.96 (95% CI: 0.95, 0.97) with 24-weekly CD4 testing, 0.95 (95% CI: 0.93, 0.96) with a single CD4 test at 48 weeks (baseline), and 0.92 (95% CI: 0.91, 0.94) with no CD4 testing. Comparing randomized groups by 48-week CD4 count, the mortality risk associated with CDM versus LCM was greater in persons with CD4 counts of <100 (hazard ratio = 2.4, 95% CI: 1.3, 4.3) than in those with CD4 counts of ≥100 (hazard ratio = 1.1, 95% CI: 0.8, 1.7; interaction P = 0.04). These findings support a benefit from identifying patients immunologically failing first-line ART at 48 weeks.

    Transmission of HIV-1 infection in sub-Saharan Africa and effect of elimination of unsafe injections

    During the past year, a group has argued that unsafe injections are a major, if not the main, mode of HIV-1 transmission in sub-Saharan Africa. We review the main arguments used to question the epidemiological interpretations on the lead role of unsafe sex in HIV-1 transmission, and conclude there is no compelling evidence that unsafe injections are a predominant mode of HIV-1 transmission in sub-Saharan Africa. Conversely, though there is a clear need to eliminate all unsafe injections, epidemiological evidence indicates that sexual transmission continues to be by far the major mode of spread of HIV-1 in the region. Increased efforts are needed to reduce sexual transmission of HIV-1.

    Barriers to Initiation of Pediatric HIV Treatment in Uganda: A Mixed-Method Study

    Although the advantages of early infant HIV diagnosis and treatment initiation are well established, children often present late to HIV programs in resource-limited settings. We aimed to assess factors related to the timing of treatment initiation among HIV-infected children attending three clinical sites in Uganda. Clinical and demographic determinants associated with early disease stage (WHO clinical stages 1-2) or late disease stage (stages 3-4) at presentation were assessed using multilevel logistic regression. Additionally, semi-structured interviews with caregivers and health workers were conducted to qualitatively explore determinants of late disease stage at presentation. Of 306 children initiating first-line regimens, 72% presented late. Risk factors for late presentation were age below 2 years (OR 2.83, P = 0.014), living without parents (OR 3.93, P = 0.002), unemployment of the caregiver (OR 4.26, P = 0.001), lack of perinatal HIV prophylaxis (OR 5.66, P = 0.028), and high transportation costs to the clinic (OR 2.51, P = 0.072). Forty-nine interviews were conducted, confirming the identified risk factors and additionally pointing to inconsistent referral from perinatal care, caregivers' unawareness of HIV symptoms, fear, and stigma as important barriers. The problem of late disease stage at presentation requires a multifactorial approach, addressing both health-system and individual-level factors.