
    HIV-associated anemia after 96 weeks on therapy: determinants across age ranges in Uganda and Zimbabwe.

    Given the detrimental effects of HIV-associated anemia on morbidity, we determined factors associated with anemia after 96 weeks of antiretroviral therapy (ART) across age groups. An HIV-positive cohort (n=3,580) of children aged 5-14, reproductive-age adults 18-49, and older adults ≥50 from two randomized trials in Uganda and Zimbabwe was evaluated from initiation of therapy through 96 weeks. We conducted logistic and multinomial regression to evaluate common and differential determinants of anemia at 96 weeks on therapy. Prior to initiation of ART, the prevalence of anemia (age 5-11: <10.5 g/dl; 12-14: <11 g/dl; adult females: <11 g/dl; adult males: <12 g/dl) was 43%, which decreased to 13% at week 96 (p<0.001). Older adults had significantly higher odds of anemia than reproductive-age adults (OR 2.60, 95% CI 1.44-4.70, p=0.002). Reproductive-age females had significantly higher odds of anemia than males at week 96 (OR 2.56, 95% CI 1.92-3.40, p<0.001), and in particular greater odds of microcytic anemia than males in the same age group (p=0.001). Other common factors associated with anemia included low body mass index (BMI) and microcytosis; greater increases in CD4 count to week 96 were protective. Thus, while ART significantly reduced the prevalence of anemia at 96 weeks, 13% of the population remained anemic. Specific groups, such as reproductive-age females and older adults, have greater odds of anemia; these findings may guide clinicians to pursue further evaluation and management.
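Odds ratios like those above are estimated on the log scale, so their confidence intervals should be symmetric in log space. As a quick consistency check (a sketch using only the figures reported in the abstract, not the trial data), the standard error of log(OR) can be back-calculated from the CI bounds and used to reproduce them:

```python
import math

def se_from_ci(lower, upper, z=1.96):
    """Back-calculate the SE of log(OR) from a 95% CI's bounds."""
    return (math.log(upper) - math.log(lower)) / (2 * z)

def or_ci(or_point, se, z=1.96):
    """Recompute a 95% CI for an odds ratio from its log-scale SE."""
    log_or = math.log(or_point)
    return math.exp(log_or - z * se), math.exp(log_or + z * se)

# Reported: females vs. males at week 96, OR 2.56 (95% CI 1.92-3.40)
se = se_from_ci(1.92, 3.40)
low, high = or_ci(2.56, se)
print(round(se, 3), round(low, 2), round(high, 2))  # 0.146 1.92 3.41
```

The recomputed bounds (1.92, 3.41) match the reported interval to within rounding, as expected for a Wald interval on the log-odds scale.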

    Cost effectiveness analysis of clinically driven versus routine laboratory monitoring of antiretroviral therapy in Uganda and Zimbabwe.

    BACKGROUND: Despite funding constraints for treatment programmes in Africa, the costs and economic consequences of routine laboratory monitoring for efficacy and toxicity of antiretroviral therapy (ART) have rarely been evaluated. METHODS: Cost-effectiveness analysis was conducted in the DART trial (ISRCTN13968779). Adults in Uganda/Zimbabwe starting ART were randomised to clinically-driven monitoring (CDM) or laboratory and clinical monitoring (LCM); individual patient data on healthcare resource utilisation and outcomes were valued with primary economic costs and utilities. Total costs of first/second-line ART, routine 12-weekly CD4 and biochemistry/haematology tests, additional diagnostic investigations, clinic visits, concomitant medications and hospitalisations were considered from the public healthcare sector perspective. A Markov model was used to extrapolate costs and benefits 20 years beyond the trial. RESULTS: 3316 (1660 LCM; 1656 CDM) symptomatic, immunosuppressed ART-naive adults (median (IQR) age 37 (32, 42); CD4 86 (31, 139) cells/mm3) were followed for a median of 4.9 years. LCM had a mean 0.112-year (41-day) survival benefit at an additional mean cost of $765 [95% CI: 685, 845], translating into an adjusted incremental cost of $7386 [3277, dominated] per life-year gained and $7793 [4442, 39179] per quality-adjusted life-year gained. Routine toxicity tests were prominent cost-drivers and had no benefit. With 12-weekly CD4 monitoring from year 2 on ART and low-cost second-line ART, but without toxicity monitoring, CD4 test costs need to fall below $3.78 to become cost-effective (<3x per-capita GDP, following WHO benchmarks). CD4 monitoring at current costs as undertaken in DART was not cost-effective in the long term. CONCLUSIONS: There is no rationale for routine toxicity monitoring, which did not affect outcomes and was costly. Even though beneficial, there is little justification for routine 12-weekly CD4 monitoring of ART at current test costs in low-income African countries. CD4 monitoring, restricted to the second year on ART onwards, could be cost-effective with lower-cost second-line therapy and development of a cheaper, ideally point-of-care, CD4 test.
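The headline figures can be reproduced directly: a crude incremental cost-effectiveness ratio (ICER) is the mean cost difference divided by the mean survival benefit. A minimal sketch (the paper's $7386 per life-year is covariate-adjusted, so the crude value below differs; the GDP figure is purely hypothetical):

```python
# Trial-level figures from the abstract
mean_cost_diff = 765.0          # additional mean cost of LCM vs. CDM (USD)
survival_benefit_years = 0.112  # mean life-years gained (about 41 days)

# Crude (unadjusted) ICER: cost per life-year gained
icer = mean_cost_diff / survival_benefit_years

def who_cost_effective(icer, gdp_per_capita):
    """WHO benchmark: an ICER below 3x per-capita GDP is 'cost-effective'."""
    return icer < 3 * gdp_per_capita

print(round(icer))                      # 6830
print(who_cost_effective(icer, 500.0))  # False for a hypothetical $500 GDP
```

The crude value of about $6830 per life-year sits in the same range as the adjusted estimate, and the WHO threshold check shows why the abstract concludes the strategy was not cost-effective at low per-capita incomes.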

    Virological outcomes of second-line protease inhibitor-based treatment for human immunodeficiency virus type 1 in a high-prevalence rural South African setting: a competing-risks prospective cohort analysis

    Background. Second-line antiretroviral therapy (ART) based on ritonavir-boosted protease inhibitors (bPIs) represents the only available option after first-line failure for the majority of individuals living with human immunodeficiency virus (HIV) worldwide. Maximizing their effectiveness is imperative. Methods. This cohort study was nested within the French National Agency for AIDS and Viral Hepatitis Research (ANRS) 12249 Treatment as Prevention (TasP) cluster-randomized trial in rural KwaZulu-Natal, South Africa. We prospectively investigated risk factors for virological failure (VF) of bPI-based ART in the combined study arms. VF was defined by a plasma viral load >1000 copies/mL ≥6 months after initiating bPI-based ART. Cumulative incidence of VF was estimated and competing risk regression was used to derive the subdistribution hazard ratio (SHR) of the associations between VF and patient clinical and demographic factors, taking into account death and loss to follow-up. Results. One hundred one participants contributed 178.7 person-years of follow-up. Sixty-five percent were female; the median age was 37.4 years. Second-line ART regimens were based on ritonavir-boosted lopinavir, combined with zidovudine or tenofovir plus lamivudine or emtricitabine. The incidence of VF on second-line ART was 12.9 per 100 person-years (n = 23), and prevalence of VF at censoring was 17.8%. Thirteen of these 23 (56.5%) virologic failures resuppressed after a median of 8.0 months (interquartile range, 2.8-16.8 months) in this setting where viral load monitoring was available. Tuberculosis treatment was associated with VF (SHR, 11.50 [95% confidence interval, 3.92-33.74]; P < .001). Conclusions. Second-line VF was frequent in this setting. Resuppression occurred in more than half of failures, highlighting the value of viral load monitoring of second-line ART. 
Tuberculosis was associated with VF; therefore, novel approaches to optimize the effectiveness of PI-based ART in high-tuberculosis-burden settings are needed.
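The reported incidence can be checked from the raw counts: 23 failures over 178.7 person-years. A minimal sketch (the normal-approximation confidence interval is an illustration only; the paper does not report one for this rate):

```python
import math

events = 23            # virological failures observed
person_years = 178.7   # total follow-up time

rate = events / person_years * 100      # failures per 100 person-years
# Rough 95% CI via a normal approximation to the Poisson count
se = math.sqrt(events) / person_years * 100
ci = (rate - 1.96 * se, rate + 1.96 * se)
print(round(rate, 1))  # 12.9, matching the reported incidence
```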

    The impact of different CD4 monitoring and switching strategies on mortality in HIV-infected African adults on antiretroviral therapy; an application of dynamic marginal structural models

    In Africa, antiretroviral therapy (ART) is delivered with limited laboratory monitoring, often none. In 2003–2004, investigators in the Development of Antiretroviral Therapy in Africa (DART) Trial randomized persons initiating ART in Uganda and Zimbabwe to either laboratory and clinical monitoring (LCM) or clinically driven monitoring (CDM). CD4 cell counts were measured every 12 weeks in both groups but were returned to treating clinicians for management only in the LCM group. Follow-up continued through 2008. In observational analyses, dynamic marginal structural models on pooled randomized groups were used to estimate survival under different monitoring-frequency and clinical/immunological switching strategies. Assumptions included no direct effect of randomized group on mortality or confounders, and no unmeasured confounders that influenced both treatment switch and mortality or treatment switch and time-dependent covariates. After 48 weeks of first-line ART, 2,946 individuals contributed 11,351 person-years of follow-up, 625 switches, and 179 deaths. The estimated survival probability after a further 240 weeks, for post-48-week switch at the first CD4 cell count less than 100 cells/mm3 or a non-Candida World Health Organization stage 4 event (with CD4 count <250), was 0.96 (95% confidence interval (CI): 0.94, 0.97) with 12-weekly CD4 testing, 0.96 (95% CI: 0.95, 0.97) with 24-weekly CD4 testing, 0.95 (95% CI: 0.93, 0.96) with a single CD4 test at 48 weeks (baseline), and 0.92 (95% CI: 0.91, 0.94) with no CD4 testing. Comparing randomized groups by 48-week CD4 count, the mortality risk associated with CDM versus LCM was greater in persons with CD4 counts of <100 (hazard ratio = 2.4, 95% CI: 1.3, 4.3) than in those with CD4 counts of ≥100 (hazard ratio = 1.1, 95% CI: 0.8, 1.7; interaction P = 0.04). These findings support a benefit from identifying patients immunologically failing first-line ART at 48 weeks.

    High Rate of HIV Re-suppression After Viral Failure on First Line Antiretroviral Therapy in the Absence of Switch to Second Line.

    In a randomised comparison of nevirapine or abacavir with zidovudine + lamivudine, routine viral load monitoring was not performed, yet 27% of individuals with viral failure at week 48 re-suppressed by week 96 without switching. This supports WHO recommendations that suspected viral failure should trigger adherence counselling and repeat measurement before considering a treatment switch.

    Utility of total lymphocyte count as a surrogate marker for CD4 counts in HIV-1 infected children in Kenya

    Background: In resource-limited settings, such as Kenya, access to CD4 testing is limited. Therefore, evaluation of less expensive laboratory diagnostics is urgently needed to diagnose immunosuppression in children. Objectives: To evaluate the utility of total lymphocyte count (TLC) as a surrogate marker for CD4 count in HIV-infected children. Methods: This was a hospital-based retrospective study conducted in three HIV clinics in Kisumu and Nairobi in Kenya. TLC, CD4 count and CD4 percent data were abstracted from hospital records of 487 antiretroviral-naïve HIV-infected children aged 1 month - 12 years. Results: TLC and CD4 count were positively correlated (r = 0.66, p < 0.001), with the highest correlation seen in children with severe immunosuppression (r = 0.72, p < 0.001) and children >59 months of age (r = 0.68, p < 0.001). Children were considered to have severe immunosuppression if they met the following WHO CD4 count thresholds: age below 12 months, CD4 count < 1500 cells/mm3; age 12-35 months, CD4 count < 750 cells/mm3; age 36-59 months, CD4 count < 350 cells/mm3; and age above 59 months, CD4 count < 200 cells/mm3. WHO-recommended TLC threshold values for severe immunosuppression of 4000, 3000, 2500 and 2000 cells/mm3 for age categories <12, 12-35, 36-59 and >59 months had low sensitivity of 25%, 23%, 33% and 62% respectively in predicting severe immunosuppression, using CD4 count as the gold standard. Raising the TLC thresholds to 7000, 6000, 4500 and 3000 cells/mm3 for each of the stated age categories increased sensitivity to 71%, 64%, 56% and 86%, with positive predictive values of 85%, 61%, 37% and 68% respectively, but reduced specificity to 73%, 62%, 54% and 68%, with negative predictive values of 54%, 65%, 71% and 87% respectively. Conclusion: TLC is positively correlated with absolute CD4 count in children, but current WHO age-specific thresholds had low sensitivity to identify severely immunosuppressed Kenyan children. Sensitivity, and therefore the utility of TLC to identify immunosuppressed children, may be improved by raising the TLC cut-off levels across the various age categories.
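The raised age-specific rule the study proposes can be written as a simple screening function (an illustrative sketch; the function name and interface are ours, and TLC is in cells/mm3):

```python
# Raised TLC thresholds proposed by the study: 7000/6000/4500/3000 cells/mm3
# for age categories <12, 12-35, 36-59 and >59 months respectively.
def severe_immunosuppression_by_tlc(age_months: int, tlc: float) -> bool:
    """Flag possible severe immunosuppression from total lymphocyte count."""
    if age_months < 12:
        return tlc < 7000
    elif age_months < 36:
        return tlc < 6000
    elif age_months < 60:
        return tlc < 4500
    return tlc < 3000

print(severe_immunosuppression_by_tlc(24, 5500))  # True: below the 6000 cut-off
```

Even with the raised cut-offs, the reported specificities (54-73%) mean a positive screen should be confirmed with a CD4 count where available.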

    Complication characteristics between young-onset type 2 versus type 1 diabetes in a UK population.

    BACKGROUND: In the UK, the care of young people with diabetes has focused predominantly on type 1 diabetes (T1D). However, young-onset type 2 diabetes (T2D) has become increasingly prevalent. At present, it is unclear which type of diabetes represents the more adverse phenotype for developing complications. This study aims to determine the complication burden and its predictive factors in young-onset T2D compared with T1D. METHODS: A cross-sectional study using a hospital diabetes register to identify patients with young-onset T2D and T1D. Young-onset T2D was defined as an age of diagnosis below 40 years. A T1D cohort with a similar age of diagnosis was used as a comparator. Data from the last clinic visit were used for analysis. Clinical characteristics and diabetes complications were evaluated at diabetes durations of up to 20 years and beyond. Predictive factors for diabetes complications (age, sex, glycated hemoglobin, creatinine, diabetes duration, hypertension, dyslipidemia, and body mass index >25) were determined by logistic regression analysis. RESULTS: Data were collected on 1287 patients, of whom 760 and 527 had T1D and T2D, respectively. Across all diabetes durations, the T2D cohort had an older age of onset (p<0.0005) and a higher prevalence of obesity, hypertension, and dyslipidemia (all p<0.0005), while glycemic control was similar in both groups. Cardiovascular disease (p<0.005) and neuropathy (p<0.05) were more prevalent in the young-onset T2D cohort across all diabetes durations. There was no difference in retinopathy. Cardiovascular disease was predominantly due to ischemic heart disease. Stroke and peripheral vascular disease became significantly more common in T2D after 20 years' duration. After controlling for traditional risk factors, young-onset T2D was an independent predictor for cardiovascular disease (p<0.005) and neuropathy (p<0.05) but not for retinopathy. CONCLUSIONS: Young-onset T2D is a more aggressive phenotype than T1D for developing diabetes complications, particularly ischemic heart disease and neuropathy.

    Risk factors for virological failure and subtherapeutic antiretroviral drug concentrations in HIV-positive adults treated in rural northwestern Uganda

    BACKGROUND: Little is known about immunovirological treatment outcomes and adherence in HIV/AIDS patients on antiretroviral therapy (ART) treated using a simplified management approach in rural areas of developing countries, or about the main factors influencing those outcomes in clinical practice. METHODS: Cross-sectional immunovirological, pharmacological, and adherence outcomes were evaluated in all patients alive and on fixed-dose ART combinations for 24 months, and in a random sample of those treated for 12 months. Risk factors for virological failure (>1,000 copies/mL) and subtherapeutic antiretroviral (ARV) concentrations were investigated with multiple logistic regression. RESULTS: At 12 and 24 months of ART, 72% (n=701) and 70% (n=369) of patients, respectively, were alive and in care. About 8% and 38% of patients, respectively, were diagnosed with immunological failure, and 75% and 72% of patients, respectively, had undetectable HIV RNA (<400 copies/mL). Risk factors for virological failure (>1,000 copies/mL) were poor adherence, tuberculosis diagnosed after ART initiation, subtherapeutic NNRTI concentrations, general clinical symptoms, and lower weight than at baseline. About 14% of patients had low ARV plasma concentrations. Digestive symptoms and poor adherence to ART were risk factors for low ARV plasma concentrations. CONCLUSIONS: Efforts to improve both access to care and patient management are necessary to achieve better immunological and virological outcomes on ART and to maximize the duration of first-line therapy.

    Modelling imperfect adherence to HIV induction therapy

    Background: Induction-maintenance therapy is a treatment regimen in which patients are prescribed an intense course of treatment for a short period of time (the induction phase), followed by a simplified long-term regimen (maintenance). Since induction therapy carries a significantly higher chance of pill fatigue than maintenance therapy, patients might take drug holidays during this period. Without guidance, patients who choose to stop therapy will each be making individual decisions, with no scientific basis. Methods: We use mathematical modelling to investigate the effect of imperfect adherence during the induction phase. We address the following research questions: (1) Can we theoretically determine the maximal length of a possible drug holiday and the minimal number of doses that must subsequently be taken while still avoiding resistance? (2) How many drug holidays can be taken during the induction phase? Results: For a 180-day therapeutic program, a patient can take several drug holidays, but must then follow each drug holiday with a strict, but fairly straightforward, drug-taking regimen. Since the results depend upon the drug regimen, we calculated the length and number of drug holidays for all fifteen protease-sparing triple-drug cocktails that have been approved by the US Food and Drug Administration. Conclusions: Induction therapy with partial adherence is tolerable, but the outcome depends on the drug cocktail. Our theoretical predictions are in line with recent results from pilot studies of short-cycle treatment interruption strategies and may be useful in guiding the design of future clinical trials.
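The core induction-phase question (how long can a drug holiday last before drug levels become ineffective?) can be illustrated with a much simpler model than the paper's: single-compartment exponential decay of drug concentration. This is a hedged sketch, not the authors' model (which tracks viral dynamics and resistance); all parameter values below are hypothetical:

```python
import math

def max_holiday_days(c0: float, c_min: float, half_life_days: float) -> float:
    """Days until concentration c0 decays below the effective level c_min,
    assuming first-order elimination: c(t) = c0 * exp(-k * t)."""
    k = math.log(2) / half_life_days   # elimination rate constant
    return math.log(c0 / c_min) / k    # solve c0 * exp(-k * t) = c_min

# Hypothetical numbers: start at 4x the minimum effective concentration
# with a 1-day half-life -> the holiday can last two half-lives, 2 days
print(max_holiday_days(4.0, 1.0, 1.0))  # 2.0
```

In this toy framing, the subsequent "strict drug-taking regimen" the paper describes corresponds to the doses needed to rebuild concentration well above c_min before the next holiday; the real bound additionally depends on the resistance dynamics of each drug cocktail.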