
    Development and Validation of a Composite Programmatic Assessment Tool for HIV Therapy

    Background We developed and validated a new and simple metric, the Programmatic Compliance Score (PCS), based on the IAS-USA antiretroviral therapy management guidelines for HIV-infected adults, as a predictor of all-cause mortality at a program-wide level. We hypothesized that non-compliance would be associated with the highest probability of mortality. Methods and Findings 3543 antiretroviral-naive HIV-infected patients aged ≥19 years who initiated antiretroviral therapy between January 1, 2000 and August 31, 2009 in British Columbia (BC), Canada, were followed until August 31, 2010. The PCS is composed of six non-performance indicators based on the IAS-USA guidelines: (1) having <3 CD4 count tests in the first year after starting antiretroviral therapy; (2) having <3 plasma viral load tests in the first year after starting antiretroviral therapy; (3) not having drug resistance testing done prior to starting antiretroviral therapy; (4) starting on a non-recommended antiretroviral therapy regimen; (5) starting therapy with CD4 <200 cells/mm3; and (6) not achieving viral suppression within 6 months of antiretroviral therapy initiation. The sum of these six indicators forms the PCS; a higher score indicates poorer performance. The main outcome was all-cause mortality. Each PCS component was independently associated with mortality. In the mortality analysis, the odds ratio (OR) for PCS ≥4 versus 0 was 22.37 (95% CI 10.46-47.84). Conclusions The PCS was strongly associated with all-cause mortality. These results lend independent validation to the IAS-USA treatment guidelines for HIV-infected adults. Further efforts are warranted to enhance the PCS as a means to further improve clinical outcomes; these should be specifically evaluated and targeted at healthcare providers and patients.
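The PCS described above is simply the sum of six binary non-performance indicators. A minimal sketch of that scoring rule follows; the dictionary field names are hypothetical, since the abstract defines only the six criteria themselves, not a data schema.

```python
# Illustrative sketch of the Programmatic Compliance Score (PCS): the sum of
# six binary non-performance indicators (0 = best, 6 = worst performance).
# Field names are invented for this example.

def pcs(patient: dict) -> int:
    """Return the PCS (0-6); a higher score indicates poorer performance."""
    indicators = [
        patient["cd4_tests_first_year"] < 3,          # (1) <3 CD4 tests in year 1
        patient["viral_load_tests_first_year"] < 3,   # (2) <3 viral load tests in year 1
        not patient["resistance_test_before_art"],    # (3) no baseline resistance test
        not patient["recommended_regimen"],           # (4) non-recommended regimen
        patient["baseline_cd4"] < 200,                # (5) CD4 <200 cells/mm3 at start
        not patient["suppressed_within_6_months"],    # (6) no suppression by 6 months
    ]
    return sum(indicators)

example = {
    "cd4_tests_first_year": 2,
    "viral_load_tests_first_year": 4,
    "resistance_test_before_art": False,
    "recommended_regimen": True,
    "baseline_cd4": 150,
    "suppressed_within_6_months": True,
}
print(pcs(example))  # three indicators met -> 3
```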

    A Comparison of Initial Antiretroviral Therapy in the Swiss HIV Cohort Study and the Recommendations of the International AIDS Society-USA

    BACKGROUND: In order to facilitate and improve the use of antiretroviral therapy (ART), international recommendations are released and updated regularly. We aimed to study whether adherence to the recommendations is associated with better treatment outcomes in the Swiss HIV Cohort Study (SHCS). METHODS: Initial ART regimens prescribed to participants between 1998 and 2007 were classified according to IAS-USA recommendations. Baseline characteristics of patients who received regimens in violation of these recommendations (violation ART) were compared to those of other patients. Multivariable logistic and linear regression analyses were performed to identify associations between violation ART and (i) virological suppression and (ii) CD4 cell count increase, after one year. RESULTS: Between 1998 and 2007, 4189 SHCS participants started 241 different ART regimens. A violation ART was started in 5% of patients. Female patients (adjusted odds ratio aOR 1.83, 95% CI 1.28-2.62), those with a high education level (aOR 1.49, 95% CI 1.07-2.06) and those with a high CD4 count (aOR 1.53, 95% CI 1.02-2.30) were more likely to receive violation ART. The proportion of patients with an undetectable viral load (<400 copies/mL) after one year was significantly lower with violation ART than with recommended regimens (aOR 0.54, 95% CI 0.37-0.80), whereas CD4 count increase after one year of treatment was similar in both groups. CONCLUSIONS: Although more than 240 different initial regimens were prescribed, violations of the IAS-USA recommendations were uncommon. Patients receiving these regimens were less likely to have an undetectable viral load after one year, which strengthens the validity of these recommendations.

    Genotypic tropism testing by massively parallel sequencing: qualitative and quantitative analysis

    Background Inferring viral tropism from genotype is a fast and inexpensive alternative to phenotypic testing. While highly predictive when performed on clonal samples, the sensitivity of predicting CXCR4-using (X4) variants drops substantially in clinical isolates. This is mainly attributed to minor variants not detected by standard bulk sequencing. Massively parallel sequencing (MPS) detects single clones and is therefore much more sensitive. Using this technology, we aimed to improve genotypic prediction of coreceptor usage. Methods Plasma samples from 55 antiretroviral-treated patients tested for coreceptor usage with the Monogram Trofile Assay were sequenced with standard population-based approaches. Fourteen of these samples were selected for further analysis with MPS. Tropism was predicted from each sequence with geno2pheno[coreceptor]. Results Prediction based on bulk sequencing yielded 59.1% sensitivity and 90.9% specificity compared to the Trofile assay. With MPS, 7600 reads were generated on average per isolate. Minorities of sequences with high confidence in CXCR4 usage were found in all samples, irrespective of phenotype. When using the default false-positive rate of geno2pheno[coreceptor] (10%) and defining a minority cutoff of 5%, the results were concordant in all but one isolate. Conclusions The combination of MPS and coreceptor-usage prediction is a fast and accurate alternative to phenotypic assays. The detection of X4 viruses in all isolates suggests that coreceptor usage as well as fitness of minorities is important for therapy outcome. The high sensitivity of this technology, combined with a quantitative description of the viral population, may allow implementing meaningful cutoffs for predicting response to CCR5 antagonists in the presence of X4 minorities.
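The decision rule described above can be sketched in a few lines: each MPS read receives a geno2pheno[coreceptor] false-positive rate (FPR), reads below the 10% default FPR are called X4, and a sample is flagged as X4-containing when such reads exceed the 5% minority cutoff. The FPR values below are invented for illustration; this is not the geno2pheno implementation itself.

```python
# Hedged sketch of a minority-variant tropism call: per-read FPR values and
# the sample composition are hypothetical examples.

def call_tropism(read_fprs, fpr_cutoff=10.0, minority_cutoff=0.05):
    """Label a sample 'X4' if the fraction of reads with FPR below the
    cutoff (i.e. confidently predicted CXCR4-using) exceeds the minority
    cutoff; otherwise 'R5'."""
    x4_reads = sum(1 for fpr in read_fprs if fpr < fpr_cutoff)
    x4_fraction = x4_reads / len(read_fprs)
    return "X4" if x4_fraction >= minority_cutoff else "R5"

fprs = [45.0] * 92 + [2.5] * 8   # 8% of reads confidently predicted X4
print(call_tropism(fprs))        # -> X4 (8% exceeds the 5% minority cutoff)
```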

    Cost-Effectiveness of Strategies to Improve HIV Testing and Receipt of Results: Economic Analysis of a Randomized Controlled Trial

    The CDC recommends routine voluntary HIV testing of all patients 13-64 years of age. Despite this recommendation, HIV testing rates are low even among those at identifiable risk, and many patients do not return to receive their results. To examine the costs and benefits of strategies to improve HIV testing and receipt of results. Cost-effectiveness analysis based on a Markov model. Acceptance of testing, return rates, and related costs were derived from a randomized trial of 251 patients; long-term costs and health outcomes were derived from the literature. Primary-care patients with unknown HIV status. Comparison of three intervention models for HIV counseling and testing: Model A = traditional HIV counseling and testing; Model B = nurse-initiated routine screening with traditional HIV testing and counseling; Model C = nurse-initiated routine screening with rapid HIV testing and streamlined counseling. Life-years, quality-adjusted life-years (QALYs), costs and incremental cost-effectiveness. Without consideration of the benefit from reduced HIV transmission, Model A resulted in per-patient lifetime discounted costs of $48,650 and benefits of 16.271 QALYs. Model B increased lifetime costs by $53 and benefits by 0.0013 QALYs (corresponding to 0.48 quality-adjusted life days). Model C cost $66 more than Model A, with an increase of 0.0018 QALYs (0.66 quality-adjusted life days) and an incremental cost-effectiveness of $36,390/QALY. When we included the benefit from reduced HIV transmission, Model C cost $10,660/QALY relative to Model A. The cost-effectiveness of Model C was robust in sensitivity analyses.
    In a primary-care population, nurse-initiated routine screening with rapid HIV testing and streamlined counseling increased rates of testing and receipt of test results and was cost-effective compared with traditional HIV testing strategies.
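The incremental cost-effectiveness ratio (ICER) behind the figures above is the extra cost divided by the extra QALYs of one strategy over another. With the rounded numbers reported for Model C versus Model A ($66 and 0.0018 QALYs), the ratio comes out near $36,700/QALY; the paper's $36,390/QALY reflects unrounded inputs.

```python
# Sketch of the ICER calculation; the inputs are the rounded deltas quoted in
# the abstract, so the result only approximates the published $36,390/QALY.

def icer(delta_cost, delta_qaly):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY."""
    return delta_cost / delta_qaly

print(round(icer(66, 0.0018)))  # -> 36667, close to the reported $36,390/QALY
```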

    The Ratio 1660/1690 cm−1 Measured by Infrared Microspectroscopy Is Not Specific of Enzymatic Collagen Cross-Links in Bone Tissue

    In postmenopausal osteoporosis, an impairment in enzymatic cross-links (ECL) occurs, contributing to a decline in bone biomechanical properties. Biochemical methods based on high performance liquid chromatography (HPLC) are currently used to measure ECL. An alternative method, Fourier transform infrared imaging (FTIRI), has been proposed to measure the ratio of mature PYD to immature DHLNL cross-links, using the 1660/1690 cm−1 area ratio in the amide I band. However, in bone, the composition of the amide I band is complex (collagens, non-collagenous proteins, water vibrations), and the 1660/1690 cm−1 ratio by FTIRI has never been directly correlated with the PYD/DHLNL ratio by HPLC. A study design using lathyritic rats, characterized by decreased formation of ECL due to the inhibition of lysyl oxidase, was used to determine the evolution of the 1660/1690 cm−1 ratio by FTIR microspectroscopy in bone tissue and compare it to the ECL quantified by HPLC. The actual amount of ECL was quantified by HPLC on cortical bone from control and lathyritic rats. The lathyritic group exhibited a 78% decrease in pyridinoline content compared to the control group. The 1660/1690 cm−1 area ratio was higher in central bone than in inner bone, and this was also correlated with an increase in both mineral maturity and mineralization index. However, no difference in the 1660/1690 cm−1 ratio was found between control and lathyritic rats. These results were confirmed by principal component analysis performed on multispectral infrared images. In bovine bone, in which PYD was physically destroyed by UV photolysis, the PYD/DHLNL ratio (measured by HPLC) was strongly decreased, whereas the 1660/1690 cm−1 ratio was unmodified. In conclusion, the 1660/1690 cm−1 ratio is not related to the PYD/DHLNL ratio, but increases with the age of bone mineral, suggesting that a modification of this ratio could be mainly due to a modification of the collagen secondary structure related to the mineralization process.

    Assessing the Performance of a Computer-Based Policy Model of HIV and AIDS

    BACKGROUND. Model-based analyses, conducted within a decision analytic framework, provide a systematic way to combine information about the natural history of disease and the effectiveness of clinical management strategies with demographic and epidemiological characteristics of the population. Challenges of disease-specific modeling include the need to identify influential assumptions and to assess the face validity and internal consistency of the model. METHODS AND FINDINGS. We describe a series of exercises involved in adapting a computer-based simulation model of HIV disease to the Women's Interagency HIV Study (WIHS) cohort, and we assess model performance as we re-parameterized the model to address U.S. policy questions relevant to HIV-infected women using data from the WIHS. Empiric calibration targets included 24-month survival curves stratified by treatment status and CD4 cell count. The most influential assumptions in untreated women included chronic HIV-associated mortality following an opportunistic infection, and in treated women, the 'clinical effectiveness' of HAART and the ability of HAART to prevent HIV complications independent of virologic suppression. Good-fitting parameter sets required reductions in the clinical effectiveness of 1st- and 2nd-line HAART and improvements in 3rd- and 4th-line regimens. Projected rates of treatment regimen switching using the calibrated cohort-specific model closely approximated independent analyses published using data from the WIHS. CONCLUSIONS. The model demonstrated good internal consistency and face validity, and supported cohort heterogeneities that have been reported in the literature. Iterative assessment of model performance can provide information about the relative influence of uncertain assumptions and provide insight into heterogeneities within and between cohorts.
    Description of calibration exercises can enhance the transparency of disease-specific models. Funding: National Institute of Allergy and Infectious Diseases (R37 AI042006, K24 AI062476).
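Calibration targets such as the 24-month survival curves mentioned above come from cohort simulations in which a fraction of the cohort transitions to death each cycle. The following is a deliberately minimal two-state sketch of that kind of projection; the monthly mortality probability is invented and has nothing to do with the actual WIHS-calibrated model, which tracks CD4, viral load, opportunistic infections and treatment lines.

```python
# Illustrative two-state (alive/dead) Markov cohort projection. The 1%/month
# mortality probability is a made-up example, not a WIHS parameter.

def survival_curve(monthly_mortality, months=24):
    """Fraction of the cohort alive at the end of each month."""
    alive, curve = 1.0, []
    for _ in range(months):
        alive *= (1 - monthly_mortality)
        curve.append(alive)
    return curve

curve = survival_curve(0.01)  # hypothetical 1%/month mortality
print(round(curve[-1], 3))    # -> 0.786 of the cohort alive at 24 months
```

In a real calibration exercise, parameters like this mortality probability are varied until the projected curve matches the empirical survival curve for each treatment/CD4 stratum.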

    Modelling imperfect adherence to HIV induction therapy

    Abstract Background Induction-maintenance therapy is a treatment regimen in which patients are prescribed an intense course of treatment for a short period of time (the induction phase), followed by a simplified long-term regimen (maintenance). Since induction therapy carries a significantly higher chance of pill fatigue than maintenance therapy, patients might take drug holidays during this period. Without guidance, patients who choose to stop therapy will each be making individual decisions with no scientific basis. Methods We use mathematical modelling to investigate the effect of imperfect adherence during the induction phase. We address the following research questions: 1. Can we theoretically determine the maximal length of a possible drug holiday and the minimal number of doses that must subsequently be taken while still avoiding resistance? 2. How many drug holidays can be taken during the induction phase? Results For a 180-day therapeutic program, a patient can take several drug holidays, but must then follow each drug holiday with a strict, but fairly straightforward, drug-taking regimen. Since the results depend on the drug regimen, we calculated the length and number of drug holidays for all fifteen protease-sparing triple-drug cocktails that have been approved by the US Food and Drug Administration. Conclusions Induction therapy with partial adherence is tolerable, but the outcome depends on the drug cocktail. Our theoretical predictions are in line with recent results from pilot studies of short-cycle treatment interruption strategies and may be useful in guiding the design of future clinical trials.

    Readiness in HIV Treatment Adherence: A Matter of Confidence. An Exploratory Study

    Adherence to treatment is recognized as the essence of a successful HIV combination therapy. Optimal adherence implies a readiness on the part of the patient to begin the treatment. A better understanding of this "readiness phenomenon" would be an asset for optimizing HIV treatment. However, few studies have focused on understanding the process underlying the choice to adhere. The aim of this study is to understand the readiness process that leads to adherence to HIV treatment, from both patient and professional perspectives. Twenty-seven in-depth interviews, with a qualitative exploratory design, were the source of our data. Participants were recruited in two hospitals in Paris. Throughout the data-collection process, analysed data were supplied to all participants and the research team, thus allowing for shared constructions. Four themes, interrelated with a constitutive pattern, emerged from the data we collected. Being ready to begin and adhere to treatment is a matter of confidence in oneself, as well as in relatives, in the treatment and in the health professional team. These themes are not constant and unvarying; instead, they constitute a picture moving across time and life events. The results of this study show that adherence goes beyond "complying with" medical instructions; it depends on how active a role the patient plays in the choice to adhere.

    Cost-Effectiveness of Genotypic Antiretroviral Resistance Testing in HIV-Infected Patients with Treatment Failure

    BACKGROUND: Genotypic antiretroviral resistance testing (GRT) in HIV infection with drug-resistant virus is recommended to optimize antiretroviral therapy, in particular in patients with virological failure. We estimated the clinical effect, cost and cost-effectiveness of using GRT as compared to expert opinion in patients with antiretroviral treatment failure. METHODS: We developed a mathematical model of HIV disease to describe disease progression in HIV-infected patients with treatment failure and compared the incremental impact of GRT versus expert opinion to guide antiretroviral therapy. The analysis was conducted from the health care (discount rate 4%) and societal (discount rate 2%) perspectives. Outcome measures included life expectancy, quality-adjusted life expectancy, health care costs, productivity costs and cost-effectiveness in US dollars per quality-adjusted life-year (QALY) gained. Clinical and economic data were extracted from the large Swiss HIV Cohort Study and from clinical trials. RESULTS: Patients whose treatment was optimized with GRT rather than expert opinion had an increase in discounted life expectancy and quality-adjusted life expectancy of three and two weeks, respectively. Health care costs with and without GRT were US$ 421,000 and US$ 419,000, leading to an incremental cost-effectiveness ratio of US$ 35,000 per QALY gained. In the analysis from the societal perspective, GRT versus expert opinion led to an increase in discounted life expectancy and quality-adjusted life expectancy of three and four weeks, respectively. Health care costs with and without GRT were US$ 551,000 and US$ 549,000, respectively. When productivity changes were included in the analysis, GRT was cost-saving.
    CONCLUSIONS: GRT for treatment optimization in HIV-infected patients with treatment failure is a cost-effective use of scarce health care resources and is beneficial to society at large.
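The "discounted" life expectancy and costs above come from the standard practice of dividing future values by (1 + r)^t, with r = 4% in the health care perspective and r = 2% in the societal perspective. A minimal sketch, with invented annual cost figures (the study's actual cash flows are not reported in the abstract):

```python
# Hedged sketch of discounting a stream of annual values; the first year is
# taken as undiscounted (t = 0). The cost figures are hypothetical.

def discounted_total(annual_values, rate):
    """Present value of a stream of annual values at the given discount rate."""
    return sum(v / (1 + rate) ** t for t, v in enumerate(annual_values))

costs = [20_000] * 10  # hypothetical annual treatment costs over 10 years
print(round(discounted_total(costs, 0.04)))  # 4%: health care perspective
print(round(discounted_total(costs, 0.02)))  # 2%: societal perspective
```

The lower societal discount rate shrinks future values less, which is one reason the societal analysis shows a larger discounted quality-adjusted life-expectancy gain.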