    Frequency of HIV-1 Viral Load Monitoring of Patients Initially Successfully Treated with Combination Antiretroviral Therapy

    BACKGROUND: Although considered an essential tool for monitoring the effect of combination antiretroviral treatment (CART), HIV-1 RNA (viral load, VL) testing is greatly influenced by cost and availability of resources. OBJECTIVES: To examine whether HIV-infected patients who were initially successfully treated with CART have less frequent monitoring of VL over time, and whether CART failure and other HIV-disease and sociodemographic characteristics are associated with less frequent VL testing. METHODS: The study included patients who started CART in the period 1999-2004, were older than 18 years, were CART-naive, had two consecutive viral load measurements of <400 copies/ml after 5 months of treatment, and had continuous CART during the first 15 months. The time between two consecutive visits (days) was the outcome, and associated factors were assessed using linear mixed models. RESULTS: We analyzed a total of 128 patients with 1683 visits through December 2009. CART failure was observed in 31 (24%) patients. When adjusted for follow-up time, the mean interval between two consecutive VL tests in patients before CART failure (155.2 days) was almost identical to the interval in patients who did not fail CART (155.3 days). On multivariable analysis, the adjusted estimated time between visits was 150.9 days before 2003 and 177.6 days in 2008/2009. A longer time between visits was observed in seafarers compared to non-seafarers (mean difference, 30.7 days; 95% CI, 14.0 to 47.4; p<0.001) and in individuals who lived more than 160 kilometers from the HIV treatment center (mean difference, 16 days; p=0.010). CONCLUSIONS: Less frequent monitoring of VL became common in recent years and was not associated with treatment failure. We identified seafarers as a population with special needs for CART monitoring and delivery.
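
    As a rough illustration of the modelling step described above, the sketch below fits a random-intercept linear mixed model to inter-visit times. The data file, column names, and covariate coding are hypothetical, not taken from the study.

```python
# Sketch: linear mixed model for days between consecutive VL tests.
# Hypothetical long-format data: one row per interval, with columns
# patient_id, interval_days, period, seafarer, distance_km.
import pandas as pd
import statsmodels.formula.api as smf

visits = pd.read_csv("visits.csv")  # hypothetical dataset

# A random intercept per patient accounts for repeated measures;
# the fixed effects mirror the factors examined in the abstract.
model = smf.mixedlm(
    "interval_days ~ C(period) + seafarer + I(distance_km > 160)",
    data=visits,
    groups=visits["patient_id"],
)
result = model.fit()
print(result.summary())
```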

    Increasing the frequency of hand washing by healthcare workers does not lead to commensurate reductions in staphylococcal infection in a hospital ward

    Hand hygiene is generally considered to be the most important measure that can be applied to prevent the spread of healthcare-associated infection (HAI). Continuous emphasis on this intervention has led to the widespread opinion that HAI rates can be greatly reduced by increased hand hygiene compliance alone. However, this assumes that the effectiveness of hand hygiene is not constrained by other factors and that improved compliance in excess of a given level will, in itself, result in a commensurate reduction in the incidence of HAI. Several researchers have found the law of diminishing returns to apply to hand hygiene, with the greatest benefits occurring in the first 20% or so of compliance, and others have demonstrated that poor cohorting of nursing staff profoundly influences the effectiveness of hand hygiene measures. Collectively, these findings raise intriguing questions about the extent to which increasing compliance alone can further reduce rates of HAI. In order to investigate these issues further, we constructed a deterministic Ross-Macdonald model and applied it to a hypothetical general medical ward. In this model the transmission of staphylococcal infection was assumed to occur after contact with the transiently colonized hands of HCWs, who, in turn, acquire contamination only by touching colonized patients. The aim of the study was to evaluate the impact of imperfect hand cleansing on the transmission of staphylococcal infection and to identify whether there is a limit above which further hand hygiene compliance is unlikely to be of benefit. The model demonstrated that if transmission is solely via the hands of HCWs, it should, under most circumstances, be possible to prevent outbreaks of staphylococcal infection from occurring at hand cleansing frequencies <50%, even with imperfect hand hygiene. The analysis also indicated that the relationship between hand cleansing efficacy and frequency is not linear: as efficacy decreases, the hand cleansing frequency required to ensure R0<1 increases disproportionately. Although our study confirmed hand hygiene to be an effective control measure, it demonstrated that the law of diminishing returns applies, with the greatest benefit derived from the first 20% or so of compliance. Indeed, our analysis suggests that there is little benefit to be accrued from very high levels of hand cleansing and that in most situations compliance >40% should be enough to prevent outbreaks of staphylococcal infection, if transmission is solely via the hands of HCWs. Furthermore, we identified a non-linear relationship between hand cleansing efficacy and frequency, suggesting that it is important to maximise the efficacy of the hand cleansing process.
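
    The disproportionate trade-off between cleansing efficacy and frequency can be illustrated with a toy Ross-Macdonald-style calculation, where HCW hands play the role of the vector. The functional form mirrors the classic vector-borne R0, and every parameter value below is an illustrative assumption, not an estimate from the study.

```python
# Sketch: Ross-Macdonald-style R0 for handborne staphylococcal transmission.
# All parameter values are illustrative assumptions.

def r0(contacts_per_hour, p_patient_to_hand, p_hand_to_patient,
       hcw_per_patient, cleanse_freq, cleanse_efficacy, patient_discharge_rate):
    """Basic reproduction number when HCW hands are the only vector.

    Hands are cleared of contamination at a rate set by how often HCWs
    cleanse (cleanse_freq, per contact) and how well each cleansing works
    (cleanse_efficacy, fraction of contamination removed).
    """
    hand_clearance = contacts_per_hour * cleanse_freq * cleanse_efficacy
    return (contacts_per_hour ** 2
            * p_patient_to_hand * p_hand_to_patient
            * hcw_per_patient
            / (hand_clearance * patient_discharge_rate))

# The cleansing frequency needed to keep R0 below 1 rises
# disproportionately as efficacy falls (the non-linearity noted above).
for eff in (1.0, 0.8, 0.6, 0.4):
    freq = next((f / 100 for f in range(1, 101)
                 if r0(4, 0.1, 0.025, 0.25, f / 100, eff, 0.005) < 1), None)
    label = f"~{freq:.2f}" if freq is not None else "not achievable"
    print(f"efficacy={eff:.1f} -> minimum cleansing frequency {label}")
```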

    How does healthcare worker hand hygiene behaviour impact upon the transmission of MRSA between patients?: an analysis using a Monte Carlo model

    BACKGROUND: Good hand hygiene has for many years been considered to be the most important measure that can be applied to prevent the spread of healthcare-associated infection (HAI). Continuous emphasis on this intervention has led to the widespread opinion that HAI rates can be greatly reduced by increased hand hygiene compliance alone. However, this assumes that the effectiveness of hand hygiene is not constrained by other factors and that improved compliance in excess of a given level will, in itself, result in a commensurate reduction in the incidence of HAI. However, there is evidence that the law of diminishing returns applies to hand hygiene, with the greatest benefits occurring in the first 20% or so of compliance. While this raises intriguing questions about the extent to which increasing compliance alone can further reduce rates of HAI, analysis of this subject has been hampered by a lack of quantifiable data relating to the risk of transmission between patients on wards. METHODS: In order to gain a greater understanding of the transmission of infection between patients via the hands of healthcare workers (HCWs), we constructed a stochastic Monte Carlo model to simulate the spread of methicillin-resistant Staphylococcus aureus (MRSA) between patients. We used the model to calculate the risk of transmission occurring, firstly between two patients in adjacent beds, and then between patients in a four-bedded bay. The aim of the study was to quantify the probability of transmission under a variety of conditions and thus to gain an understanding of the contribution made by the various factors which influence transmission. RESULTS: The study revealed that on a four-bedded bay, the average probability of transmitting an infection by the handborne route is generally low (i.e. in the region of 0.002-0.013, depending on the hand hygiene behaviour of HCWs and other factors). However, because transmission is strongly influenced by stochastic events, it is the frequency with which 'high-risk events' occur, rather than the average probability, that governs whether or not transmission will take place. The study revealed that increased hand hygiene compliance has a dramatic impact on the frequency with which 'high-risk events' occur. As compliance increases, the rate at which 'high-risk events' occur decreases rapidly, until a point is reached beyond which further hand hygiene is unlikely to yield any greater benefit. CONCLUSION: The findings of the study confirm those of other researchers and suggest that the greatest benefits derived from hand hygiene occur as a result of the first tranche of compliance, with higher levels (>50%) of compliance yielding only marginal benefits. This suggests that in most situations relatively little benefit is accrued from seeking to achieve very high levels of hand hygiene compliance.
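
    A minimal sketch of this kind of simulation is given below, assuming a single HCW alternating contacts between a colonized patient and a susceptible neighbour. All probabilities and contact counts are illustrative assumptions, not the model's fitted values; the point is how the estimated transmission risk falls with compliance.

```python
# Sketch: Monte Carlo estimate of handborne MRSA transmission between
# two adjacent patients. Parameter values are illustrative assumptions.
import random

def simulate_shift(n_contacts=40, p_contaminate=0.4, p_transmit=0.1,
                   compliance=0.5, efficacy=0.8, rng=random):
    """Return True if the susceptible patient acquires MRSA during one shift.

    A single HCW alternates touches between a colonized patient and a
    susceptible neighbour; hands may be cleansed between contacts.
    """
    hands_contaminated = False
    for contact in range(n_contacts):
        # Hand hygiene is attempted with probability `compliance` and
        # removes contamination with probability `efficacy`.
        if hands_contaminated and rng.random() < compliance and rng.random() < efficacy:
            hands_contaminated = False
        if contact % 2 == 0:  # touch the colonized patient
            if rng.random() < p_contaminate:
                hands_contaminated = True
        else:                 # touch the susceptible patient
            if hands_contaminated and rng.random() < p_transmit:
                return True
    return False

runs = 20_000
for compliance in (0.2, 0.4, 0.6, 0.8):
    p = sum(simulate_shift(compliance=compliance) for _ in range(runs)) / runs
    print(f"compliance={compliance:.1f} -> P(transmission per shift) ~ {p:.4f}")
```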

    Sampling-Based Approaches to Improve Estimation of Mortality among Patient Dropouts: Experience from a Large PEPFAR-Funded Program in Western Kenya

    Monitoring and evaluation (M&E) of HIV care and treatment programs is impacted by losses to follow-up (LTFU) in the patient population. The severity of this effect is undeniable but its extent unknown. Tracing all lost patients addresses this, but census methods are not feasible in programs involving rapid scale-up of HIV treatment in the developing world. Sampling-based approaches and statistical adjustment are the only scalable methods permitting accurate estimation of M&E indices. In a large antiretroviral therapy (ART) program in western Kenya, we assessed the impact of LTFU on estimating patient mortality among 8,977 adult clients, of whom 3,624 were LTFU. Overall, dropouts were more likely to be male (36.8% versus 33.7%; p = 0.003) and younger than non-dropouts (35.3 versus 35.7 years old; p = 0.020), with lower median CD4 count at enrollment (160 versus 189 cells/ml; p<0.001) and more WHO stage 3-4 disease (47.5% versus 41.1%; p<0.001). Urban clinic clients were 75.0% of non-dropouts but 70.3% of dropouts (p<0.001). Of the 3,624 dropouts, 1,143 were sought and 621 had their vital status ascertained. Statistical techniques were used to adjust mortality estimates based on information obtained from located LTFU patients. Observed mortality estimates one year after enrollment were 1.7% (95% CI 1.3%-2.0%), revised to 2.8% (2.3%-3.1%) when deaths discovered through outreach were added, and adjusted to 9.2% (7.8%-10.6%) and 9.9% (8.4%-11.5%) through statistical modeling, depending on the method used. The corresponding estimates 12 months after ART initiation were 1.7% (1.3%-2.2%), 3.4% (2.9%-4.0%), 10.5% (8.7%-12.3%) and 10.7% (8.9%-12.6%), respectively. CONCLUSIONS/SIGNIFICANCE: Assessment of the impact of LTFU is critical in program M&E, as estimated mortality based on passive monitoring may underestimate true mortality by up to 80%. This bias can be ameliorated by tracing a sample of dropouts and statistically adjusting the mortality estimates to properly evaluate and guide large HIV care and treatment programs.
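
    The simplest form of such a correction weights the vital status found in the traced sample up to all dropouts, as in the sketch below. The cohort sizes come from the abstract, but the death counts are illustrative, and this is a crude proportion rather than the survival-analysis estimates the program actually used.

```python
# Sketch: adjusting a naive mortality estimate with outcomes from a traced
# sample of dropouts. Death counts below are illustrative, not the
# program's data; cohort sizes are taken from the abstract.

def adjusted_mortality(n_total, deaths_observed, n_ltfu,
                       n_traced, deaths_among_traced):
    """Weight deaths found in the traced sample up to all dropouts.

    Assumes traced dropouts are representative of all dropouts (the
    simplest sampling-based correction; model-based versions further
    adjust for covariates such as CD4 count and WHO stage).
    """
    death_rate_ltfu = deaths_among_traced / n_traced
    estimated_deaths = deaths_observed + death_rate_ltfu * n_ltfu
    return estimated_deaths / n_total

naive = 150 / 8977                                  # passive monitoring only
adjusted = adjusted_mortality(8977, 150, 3624, 621, 93)
print(f"naive: {naive:.1%}, sample-adjusted: {adjusted:.1%}")
```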

    Virological efficacy and emergence of drug resistance in adults on antiretroviral treatment in rural Tanzania

    Background: Virological response to antiretroviral treatment (ART) in rural Africa is poorly described. We examined virological efficacy and the emergence of drug resistance in adults receiving first-line ART for up to 4 years in rural Tanzania. Methods: Haydom Lutheran Hospital has provided ART to HIV-infected patients since October 2003. A combination of stavudine or zidovudine with lamivudine and either nevirapine or efavirenz is the standard first-line regimen. Nested in a longitudinal cohort study of patients consecutively starting ART, we carried out a cross-sectional virological efficacy survey between November 2007 and June 2008. HIV viral load was measured in all adults who had completed at least 6 months of first-line ART, and genotypic resistance was determined in patients with viral load >1000 copies/mL. Results: Virological response was measured in 212 patients, of whom 158 (74.5%) were women; median age was 35 years (interquartile range [IQR] 29–43). Median follow-up time was 22.3 months (IQR 14.0–29.9). Virological suppression, defined as <400 copies/mL, was observed in 187 patients (88.2%). Overall, the prevalence of ≥1 clinically significant resistance mutation was 3.9, 8.4, 16.7 and 12.5% in patients receiving ART for 1, 2, 3 and 4 years, respectively. Among those successfully genotyped, the most frequent mutations were M184I/V (64%), conferring resistance to lamivudine, and K103N (27%), Y181C (27%) and G190A (27%), conferring resistance to non-nucleoside reverse transcriptase inhibitors (NNRTIs), whereas 23% had thymidine analogue mutations (TAMs), associated with cross-resistance to all nucleoside reverse transcriptase inhibitors (NRTIs). Dual-class resistance, i.e. resistance to both NRTIs and NNRTIs, was found in 64%. Conclusion: Virological suppression rates were good up to 4 years after initiating ART in a rural Tanzanian hospital. However, drug resistance increased with time, and dual-class resistance was common, raising concerns about exhaustion of future antiretroviral drug options. This study might provide a useful forecast of drug resistance and demand for second-line antiretroviral drugs in rural Africa in the coming years.

    Errors in ‘BED’-Derived Estimates of HIV Incidence Will Vary by Place, Time and Age

    The BED Capture Enzyme Immunoassay, believed to distinguish recent HIV infections, is being used to estimate HIV incidence, although an important property of the test, namely how specificity changes with time since infection, has not been measured. We construct hypothetical scenarios for the performance of the BED test, consistent with current knowledge, and explore how this could influence errors in BED estimates of incidence using a mathematical model of six African countries. The model is also used to determine the conditions and the sample sizes required for the BED test to reliably detect trends in HIV incidence. If the chance of misclassification by BED increases with time since infection, the overall proportion of individuals misclassified could vary widely between countries, over time, and across age-groups, in a manner determined by the historic course of the epidemic and the age-pattern of incidence. Under some circumstances, changes in BED estimates over time can approximately track actual changes in incidence, but large sample sizes (50,000+) will be required for recorded changes to be statistically significant. The relationship between BED test specificity and time since infection has not been fully measured, but, if specificity decreases, errors in estimates of incidence could vary by place, time and age-group. This means that post-assay adjustment procedures using parameters from different populations or at different times may not be valid. Further research is urgently needed into the properties of the BED test, and into the rate of misclassification in a wide range of populations.
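
    The sensitivity of such estimates to the misclassification (false-recent) rate can be shown with a back-of-the-envelope adjustment. The estimator below is a standard textbook form, not the paper's model, and all survey counts and assay parameters are illustrative assumptions.

```python
# Sketch: adjusting a BED-based incidence estimate for misclassification of
# long-term infections as 'recent'. All inputs are illustrative; the point
# is how strongly the estimate depends on the assumed false-recent rate.

def adjusted_incidence(n_neg, n_pos, n_recent, window_years,
                       sensitivity, false_recent_rate):
    """Estimate annual incidence among those at risk.

    Observed recents = sensitivity * true_recents
                       + false_recent_rate * (n_pos - true_recents),
    solved for true_recents, then annualized over the assay window.
    """
    true_recent = ((n_recent - false_recent_rate * n_pos)
                   / (sensitivity - false_recent_rate))
    return max(true_recent, 0.0) / (window_years * n_neg)

# The same survey data under different assumed false-recent rates:
for eps in (0.00, 0.02, 0.05):
    inc = adjusted_incidence(n_neg=9000, n_pos=1000, n_recent=80,
                             window_years=0.5, sensitivity=0.8,
                             false_recent_rate=eps)
    print(f"false-recent rate {eps:.0%} -> incidence ~{inc:.2%} per year")
```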

    Integration of modeling and simulation into hospital-based decision support systems guiding pediatric pharmacotherapy

    Background: Decision analysis in hospital-based settings is becoming more commonplace. The application of modeling and simulation approaches has likewise become more prevalent in order to support decision analytics. With respect to clinical decision making at the level of the patient, modeling and simulation approaches have been used to study and forecast treatment options, examine and rate caregiver performance, and assign resources (staffing, beds, patient throughput). There is a great need to facilitate pharmacotherapeutic decision making in pediatrics given the often limited data available to guide dosing and manage patient response. We have employed nonlinear mixed-effect models and Bayesian forecasting algorithms, coupled with data summary and visualization tools, to create drug-specific decision support systems that utilize individualized patient data from our electronic medical records systems. Methods: Pharmacokinetic and pharmacodynamic nonlinear mixed-effect models of specific drugs are generated based on historical data in relevant pediatric populations, or from adults when no pediatric data are available. These models are re-executed with individual patient data, allowing for patient-specific guidance via a Bayesian forecasting approach. The models are called and executed in an interactive manner through our web-based dashboard environment, which interfaces with the hospital's electronic medical records system. Results: The methotrexate dashboard utilizes a two-compartment, population-based PK mixed-effect model to project patient response to specific dosing events. Projected plasma concentrations are viewable against protocol-specific nomograms to provide dosing guidance for potential rescue therapy with leucovorin. These data are also viewable against common biomarkers used to assess patient safety (e.g., vital signs and plasma creatinine levels). As additional data become available via therapeutic drug monitoring, the model is re-executed and projections are revised. Conclusion: The management of pediatric pharmacotherapy can be greatly enhanced via the immediate feedback provided by decision analytics which incorporate the current, best-available knowledge pertaining to dose-exposure and exposure-response relationships, especially for narrow-therapeutic-index agents that are difficult to manage.
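
    As a sketch of the projection step such a dashboard re-executes, the following simulates plasma concentrations from a two-compartment model after an intravenous bolus. The rate constants, dose, and volume are illustrative population values, not those of the methotrexate model; in the dashboard, measured levels from therapeutic drug monitoring would first update the individual parameters (e.g., via Bayesian estimation) before re-running the projection.

```python
# Sketch: two-compartment concentration projection after an IV bolus.
# Parameters are illustrative assumptions, not the dashboard's model.
import numpy as np
from scipy.integrate import odeint

def two_compartment(amounts, t, k10, k12, k21):
    """Drug amounts in central (a1) and peripheral (a2) compartments."""
    a1, a2 = amounts
    return [-(k10 + k12) * a1 + k21 * a2,
            k12 * a1 - k21 * a2]

dose_mg, v1_litres = 1000.0, 20.0
k10, k12, k21 = 0.25, 0.10, 0.05            # hourly rate constants (assumed)
times = np.linspace(0, 48, 97)               # 48 h on a half-hour grid
amounts = odeint(two_compartment, [dose_mg, 0.0], times, args=(k10, k12, k21))
concentrations = amounts[:, 0] / v1_litres   # plasma concentration, mg/L

for t in (6, 24, 48):
    print(f"t={t:>2} h: ~{concentrations[np.searchsorted(times, t)]:.2f} mg/L")
```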

    Phylogenetic Approach Reveals That Virus Genotype Largely Determines HIV Set-Point Viral Load

    HIV virulence, i.e. the time of progression to AIDS, varies greatly among patients. As for other rapidly evolving pathogens of humans, it is difficult to know whether this variance is controlled by the genotype of the host or that of the virus, because the transmission chain is usually unknown. We apply the phylogenetic comparative approach (PCA) to estimate the heritability of a trait from one infection to the next, which indicates the control of the virus genotype over this trait. The idea is to use viral RNA sequences obtained from patients infected by HIV-1 subtype B to build a phylogeny, which approximately reflects the transmission chain. Heritability is measured statistically as the propensity for patients close in the phylogeny to exhibit similar infection trait values. The approach reveals that up to half of the variance in set-point viral load, a trait associated with virulence, can be heritable. Our estimate is significant and robust to noise in the phylogeny. We also check the consistency of our approach by showing that a trait related to drug resistance is almost entirely heritable. Finally, we show the importance of taking the transmission chain into account when estimating correlations between infection traits. The fact that HIV virulence is, at least partially, heritable from one infection to the next has clinical and epidemiological implications. The difference between earlier studies and ours comes from the quality of our dataset and from the power of the PCA, which can be applied to large datasets and accounts for within-host evolution. The PCA opens new perspectives for approaches linking clinical data and evolutionary biology because it can be extended to study other traits or other infectious diseases.
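
    The paper's formal comparative methods are not reproduced here, but the underlying idea, that patients close in the phylogeny should exhibit similar set-point viral loads, can be tested with a simple permutation sketch. The nearest-neighbour statistic below is our own illustrative choice, not the paper's estimator; `dist` would be a patristic distance matrix over patients and `trait` their log set-point viral loads.

```python
# Sketch: permutation test for phylogenetic signal in an infection trait.
# A small p-value means phylogenetic neighbours have unusually similar
# trait values, the signature of heritability across the transmission chain.
import random

def neighbour_statistic(dist, trait):
    """Mean absolute trait difference between each tip and its nearest tip."""
    n = len(trait)
    total = 0.0
    for i in range(n):
        j = min((k for k in range(n) if k != i), key=lambda k: dist[i][k])
        total += abs(trait[i] - trait[j])
    return total / n

def signal_p_value(dist, trait, n_perm=2000, rng=random.Random(0)):
    observed = neighbour_statistic(dist, trait)
    shuffled = list(trait)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(shuffled)            # break any tip-trait association
        if neighbour_statistic(dist, shuffled) <= observed:
            hits += 1
    return hits / n_perm                 # small p => neighbours look alike
```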