
    Reduced renal function is associated with progression to AIDS but not with overall mortality in HIV-infected Kenyan adults not initially requiring combination antiretroviral therapy

    BACKGROUND: The World Health Organization (WHO) has recently recommended that antiretrovirals be initiated in all individuals with CD4 counts of less than 350 cells/mm³. For countries with resources too limited to expand care to all such patients, it would be of value to be able to identify and target populations at highest risk of HIV progression. Renal disease has been identified as a risk factor for disease progression or death in some populations. METHODS: Times to meeting combination antiretroviral therapy (cART) initiation criteria (developing either a CD4 count < 200 cells/mm³ or WHO stage 3 or 4 disease) and overall mortality were evaluated in cART-naïve, HIV-infected Kenyan adults with CD4 cell counts ≥200/mm³ and with WHO stage 1 or 2 disease. Cox proportional hazards regression models were used to evaluate the associations between renal function and these endpoints. RESULTS: We analyzed data from 7383 subjects with a median follow-up time of 59 (interquartile range, 27-97) weeks. In Cox regression analyses adjusted for age, sex, WHO disease stage, CD4 cell count and haemoglobin, estimated creatinine clearance (CrCl) < 60 mL/min was significantly associated with shorter times to meeting cART initiation criteria (HR 1.34; 95% CI, 1.23-1.52) and overall mortality (HR 1.73; 95% CI, 1.19-2.51) compared with CrCl ≥60 mL/min. Estimated glomerular filtration rate (eGFR) < 60 mL/min/1.73 m² was associated with shorter times to meeting cART initiation criteria (HR 1.39; 95% CI, 1.22-1.58), but not with overall mortality. CrCl and eGFR remained associated with shorter times to meeting cART initiation criteria, but neither was associated with mortality, in weight-adjusted analyses. CONCLUSIONS: In this large natural history study, reduced renal function was strongly associated with faster HIV disease progression in adult Kenyans not initially meeting cART initiation criteria. As such, renal function measurement in resource-limited settings may be an inexpensive way to identify those most in need of cART to prevent progression to AIDS. The initial association between reduced CrCl, but not reduced eGFR, and greater mortality was explained by the low weights in this population.
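
    The abstract does not give its analysis code or name the creatinine-clearance formula used; as a rough sketch, the adjusted Cox model it describes could be fit in R along the following lines, assuming Cockcroft-Gault CrCl and hypothetical variable names (d, time_weeks, event, etc.):

```r
# Illustrative sketch only, not the authors' code. Assumes Cockcroft-Gault
# creatinine clearance and a hypothetical data frame `d`.
library(survival)

cockcroft_gault <- function(age, weight_kg, scr_mgdl, female) {
  crcl <- (140 - age) * weight_kg / (72 * scr_mgdl)  # mL/min
  ifelse(female, 0.85 * crcl, crcl)                  # 0.85 correction for women
}

d$crcl_lt60 <- cockcroft_gault(d$age, d$weight_kg, d$scr_mgdl, d$female) < 60

# Time to meeting cART initiation criteria, adjusted as in the abstract
fit <- coxph(Surv(time_weeks, event) ~ crcl_lt60 + age + female +
               who_stage + cd4 + hb, data = d)
summary(fit)  # hazard ratio and 95% CI for CrCl < 60 vs >= 60 mL/min
```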

    Development and Use of a Web-based Data Management System for a Randomized Clinical Trial of Adolescents and Young Adults

    Recent advances in technology provide support for multi-site, web-based data entry systems and the storage of data in a centralized location, resulting in immediate access to data for investigators, reduced participant burden and human entry error, and improved integrity of clinical trial data. The purpose of this paper is to describe the development of a comprehensive, web-based data management system for a multi-site randomized behavioral intervention trial. Strategies used to create this study-specific data management system included interdisciplinary collaboration, design mapping, feasibility assessments, and input from an advisory board of former patients with characteristics similar to the targeted population. The resulting data management system and development strategies provide a template for other behavioral intervention studies.

    Viewpoint: A Pragmatic Approach to Constructing a Minimum Data Set for Care of Patients with HIV in Developing Countries

    Providing quality health care requires access to continuous patient data that developing countries often lack. A panel of medical informatics specialists, clinical human immunodeficiency virus (HIV) specialists, and program managers suggests a minimum data set for supporting the management and monitoring of patients with HIV and their care programs in developing countries. The proposed minimum data set consists of data for registration and scheduling, monitoring and improving practice management, and describing clinical encounters and clinical care. Data should be numeric or coded using standard definitions and minimal free text. To enhance accuracy, efficiency, and availability, data should be recorded electronically by those generating them. Data elements must be sufficiently detailed to support clinical algorithms/guidelines and aggregation into broader categories for consumption by higher-level users (e.g., national and international health care agencies). The proposed minimum data set will evolve over time as funding increases, care protocols change, and additional tests and treatments become available for HIV-infected patients in developing countries.
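
    As a purely illustrative sketch (field names and code lists are hypothetical, not the panel's specification), a single coded clinical-encounter record in such a minimum data set might look like this in R:

```r
# Hypothetical example of one coded encounter record: numeric or coded fields,
# minimal free text, covering registration, scheduling and clinical care.
encounter <- data.frame(
  patient_id   = "KE-000123",                # registration
  visit_date   = as.Date("2006-03-14"),
  who_stage    = factor(2, levels = 1:4),    # coded, not free text
  cd4_count    = 287L,                       # numeric laboratory value
  on_art       = TRUE,
  regimen_code = factor("AZT-3TC-NVP"),      # standard regimen code
  next_visit   = as.Date("2006-04-11")       # scheduling / practice management
)
str(encounter)
```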

    Viral suppression among children and their caregivers living with HIV in western Kenya

    INTRODUCTION: Despite the central role of caregivers in managing HIV treatment for children living with HIV, viral suppression within caregiver-child dyads in which both members are living with HIV is not well described. METHODS: We conducted a retrospective analysis of children living with HIV <15 years of age and their caregivers living with HIV attending HIV clinics affiliated with the Academic Model Providing Access to Healthcare (AMPATH) in Kenya between 2015 and 2017. To be included in the analysis, children and caregivers must have had ≥1 viral load (VL) during the study period while receiving antiretroviral therapy (ART) for ≥6 months, and the date of the caregiver's VL must have occurred ±90 days from the date of the child's VL. The characteristics of children, caregivers and dyads were descriptively summarized. Multivariable logistic regression was used to estimate the odds of viral non-suppression (≥1000 copies/mL) in children, adjusting for caregiver and child characteristics. RESULTS: Of 7667 children who received care at AMPATH during the study period, 1698 were linked to a caregiver living with HIV and included as caregiver-child dyads. Among caregivers, 94% were mothers, median age at ART initiation was 32.8 years, median CD4 count at ART initiation was 164 cells/mm³, and 23% were not virally suppressed. Among children, 52% were female, median age at ART initiation was 4.2 years, median CD4 values at ART initiation were 15% (age <5 years) and 396 cells/mm³ (age ≥5 years), and 38% were not virally suppressed. In the multivariable model, children were more likely to be virally non-suppressed if their caregivers were not suppressed compared with children whose caregivers were suppressed (aOR = 2.40, 95% CI: 1.86 to 3.10). Other characteristics associated with child viral non-suppression included caregiver ART regimen change prior to the VL, caregiver receipt of a non-NNRTI-based regimen at the time of the VL, younger child age at ART initiation and child tuberculosis treatment at the time of the VL. CONCLUSIONS: Children were at higher risk of viral non-suppression if their caregivers were not virally suppressed compared with children whose caregivers were suppressed. A child's viral suppression status should be closely monitored if his or her caregiver is not suppressed.
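
    A minimal sketch of the multivariable model described above, using a hypothetical dyad-level data frame and variable names (not the AMPATH dataset):

```r
# Logistic regression for child viral non-suppression (VL >= 1000 copies/mL),
# adjusting for caregiver and child characteristics; names are hypothetical.
fit <- glm(child_nonsuppressed ~ caregiver_nonsuppressed +
             caregiver_regimen_change + caregiver_non_nnrti_regimen +
             child_age_at_art + child_tb_treatment,
           family = binomial, data = dyads)
exp(cbind(aOR = coef(fit), confint(fit)))  # adjusted odds ratios with 95% CIs
```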

    Alternative antiretroviral monitoring strategies for HIV-infected patients in east Africa: opportunities to save more lives?

    BACKGROUND: Updated World Health Organization guidelines have amplified debate about how resource constraints should impact monitoring strategies for HIV-infected persons on combination antiretroviral therapy (cART). We estimated the incremental benefit and cost effectiveness of alternative monitoring strategies for east Africans with known HIV infection. METHODS: Using a validated HIV computer simulation based on resource-limited data (USAID and AMPATH) and circumstances (east Africa), we compared alternative monitoring strategies for HIV-infected persons newly started on cART. We evaluated clinical, immunologic and virologic monitoring strategies, including combinations and conditional logic (e.g., only perform virologic testing if immunologic testing is positive). We calculated incremental cost-effectiveness ratios (ICER) in units of cost per quality-adjusted life year (QALY), using a societal perspective and a lifetime horizon. Costs were measured in 2008 US dollars, and costs and benefits were discounted at 3%. We compared the ICER of monitoring strategies with those of other resource-constrained decisions, in particular earlier cART initiation (at CD4 counts of 350 cells/mm³ rather than 200 cells/mm³). RESULTS: Monitoring strategies employing routine CD4 testing without virologic testing never maximized health benefits, regardless of budget or societal willingness to pay for additional health benefits. Monitoring strategies employing virologic testing conditional upon particular CD4 results delivered the most benefit at willingness-to-pay levels similar to the cost of earlier cART initiation (approximately $2600/QALY). Monitoring strategies employing routine virologic testing alone only maximized health benefits at willingness-to-pay levels (> $4400/QALY) that greatly exceeded the ICER of earlier cART initiation. CONCLUSIONS: CD4 testing alone never maximized health benefits regardless of resource limitations. Programmes routinely performing virologic testing but deferring cART initiation may increase health benefits by reallocating monitoring resources towards earlier cART initiation.
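
    The ICER arithmetic the abstract relies on is simple to state; the sketch below shows it with 3% discounting, using arbitrary placeholder cost and QALY streams rather than the simulation's outputs:

```r
# Sketch of an incremental cost-effectiveness ratio (ICER) with 3% discounting.
# All numbers are placeholders, not results from the HIV simulation model.
discount <- function(x, rate = 0.03) sum(x / (1 + rate)^(seq_along(x) - 1))

icer <- function(cost_new, qaly_new, cost_ref, qaly_ref) {
  (cost_new - cost_ref) / (qaly_new - qaly_ref)      # 2008 US$ per QALY gained
}

cost_cd4  <- rep(400, 10); qaly_cd4  <- rep(0.80, 10)  # CD4-only monitoring
cost_cond <- rep(480, 10); qaly_cond <- rep(0.83, 10)  # conditional viral load

icer(discount(cost_cond), discount(qaly_cond),
     discount(cost_cd4),  discount(qaly_cd4))
```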

    Influencing functional outcomes: a look at role performance and satisfaction with life following liver transplant

    Abstract 572. The success of orthotopic liver transplantation (OLT), originally measured as survival, now extends to the quality of the life saved. Return to work (RTW) is also a desired outcome. Our AIM was to explore the relationship between 5 pre-OLT factors and 5 post-OLT quality of life (QOL) domains with life satisfaction and primary productive role, to better understand how to improve both. METHODS: Patients (pts) 1-3 yrs post-OLT completed a QOL form during follow-up clinic visits between 7/04 and 6/05. The Liver Transplantation Database-Quality of Life (LTD-QOL) form yielded data on 5 domains: measure of disease (MOD), psychological distress/well-being (PDW), personal function (PF), social/role function (SRF) and general health perception (GHP). RESULTS: 229 pts were first categorized as satisfied overall with life (79%) or dissatisfied, and then assigned to groups based on primary productive role (51%), no primary productive role, or retired. Pre-OLT variables were age, gender, marital status, education, and etiology of liver disease: HCV (33%), alcoholic liver disease (ALD) (11%), HCV+ALD (10%), and others (46%). Marital status and age were not significantly related to the outcome variables. Etiology of liver disease, education, time since OLT and the 5 post-OLT QOL domains were significantly associated with both outcome variables, satisfaction and primary productive role (p<.0001). To understand the differences, the 5 physical and mental QOL domains were regressed on primary productive role and satisfaction. Pts (mean age 54 yrs, range 19-74 yrs; 70% male) had a primary productive role rate of 51%. Pts transplanted for ALD were significantly (p<.05) more likely to be satisfied with life, whereas individuals with HCV±ALD had the lowest satisfaction and were most likely to be unable or uninterested in work. Stepwise logistic regression analysis of satisfaction demonstrated that GHP and SRF correlated most highly. Although satisfaction was significant in bivariate analysis, regression analysis of the influence of the QOL domains, as well as employment, demonstrated that SRF and GHP correlated most highly with life satisfaction. CONCLUSIONS: SRF and GHP correlate with good QOL post-OLT. HCV patients have low levels of satisfaction, whereas the highest level of satisfaction is in the ALD group. Further studies should address methods to improve satisfaction in those with HCV.
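
    A hedged sketch of the stepwise logistic regression of satisfaction on the five LTD-QOL domains (data frame and variable names are hypothetical):

```r
# Stepwise (AIC-based) logistic regression of life satisfaction on the five
# LTD-QOL domains; `olt` and its columns are hypothetical names.
full <- glm(satisfied ~ MOD + PDW + PF + SRF + GHP,
            family = binomial, data = olt)
step(full, direction = "both")  # the abstract reports GHP and SRF retained
```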

    Evaluating the Impact of a HIV Low-Risk Express Care Task-Shifting Program: A Case Study of the Targeted Learning Roadmap

    In conducting studies on an exposure of interest, a systematic roadmap should be applied for translating causal questions into statistical analyses and interpreting the results. In this paper we describe an application of one such roadmap to estimating the joint effect of both time to availability of a nurse-based triage system (low-risk express care, LREC) and individual enrollment in the program among HIV patients in East Africa. Our study population comprises 16,513 subjects found eligible for this task-shifting program within 15 clinics in Kenya between 2006 and 2009, with each clinic starting the LREC program between 2007 and 2008. After discretizing follow-up into 90-day time intervals, we targeted the population mean counterfactual outcome (i.e. the counterfactual probability of either dying or being lost to follow-up) at up to 450 days after initial LREC eligibility under three fixed treatment interventions: (i) no program availability during the entire follow-up, (ii) immediate program availability at initial eligibility, but non-enrollment during the entire follow-up, and (iii) immediate program availability and enrollment at initial eligibility. We further estimated the controlled direct effect of immediate program availability compared to no program availability, under a hypothetical intervention to prevent individual enrollment in the program. Targeted minimum loss-based estimation was used to estimate the mean outcome, while Super Learning was implemented to estimate the required nuisance parameters. Analyses were conducted with the ltmle R package; analysis code is available at an online repository as an R package. Results showed that at 450 days, the probability of in-care survival was 0.93 (95% CI: 0.91, 0.95) for subjects with immediate availability and enrollment, and 0.87 (95% CI: 0.86, 0.87) for subjects with immediate availability who never enrolled. For subjects without LREC availability, it was 0.91 (95% CI: 0.90, 0.92). Immediate program availability without individual enrollment, compared to no program availability, was estimated to slightly albeit significantly decrease survival, by 4% (95% CI: 0.03, 0.06; p < 0.01). Immediate availability and enrollment resulted in a 7% higher in-care survival compared to immediate availability with non-enrollment after 450 days (95% CI: -0.08, -0.05; p < 0.01). The results are consistent with a fairly small impact of both availability and enrollment in the LREC program on in-care survival.
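
    The released analysis code is an R package; as a simplified sketch of the kind of ltmle call involved (collapsed to a single binary exposure node per 90-day interval, whereas the actual analysis intervenes jointly on availability and enrollment, and with hypothetical wide-format column names):

```r
# Simplified TMLE sketch with the ltmle package; not the published analysis code.
library(ltmle)

# d_wide: one row per subject; A1..A5 = exposure at each 90-day interval,
# L1..L5 = time-varying covariates, Y1..Y5 = death/loss-to-follow-up indicator.
result <- ltmle(
  data            = d_wide,
  Anodes          = paste0("A", 1:5),
  Lnodes          = paste0("L", 1:5),
  Ynodes          = paste0("Y", 1:5),
  survivalOutcome = TRUE,
  abar            = rep(1, 5),              # "always exposed" static regime
  SL.library      = c("SL.glm", "SL.mean")  # Super Learner candidate learners
)
summary(result)  # counterfactual event probability by 450 days
```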
