71 research outputs found
The long-term durability of combination antiretroviral therapy in HIV-positive patients across Europe
Despite dramatic improvements in the quantity and quality of life for Human Immunodeficiency Virus (HIV) positive people with the introduction of combination antiretroviral therapy (cART), all-cause mortality rates remain higher than in the general population. Furthermore, treatment is a lifelong commitment and places a substantial burden on patients' lives. The aim of this thesis was therefore to assess the long-term durability of cART through various clinical, virological and immunological outcomes, including mortality, in HIV-positive patients across Europe. The analyses were based on data from the EuroSIDA cohort, an observational cohort of more than 16,000 HIV-positive patients from Europe, Israel and Argentina. Results showed that HIV-positive patients on a well-tolerated and fully suppressive cART regimen have a small risk of treatment failure over the next 6 months and could therefore be monitored less frequently. In contrast, patients who have spent a low percentage of time with a suppressed viral load whilst on cART, or whose viral load has recently rebounded, may require more intensive monitoring after making a treatment switch. In patients who have achieved an initial response and tolerated the first three months of treatment, nevirapine-, efavirenz- and lopinavir-based cART regimens all have similar durability, based on the risk of all-cause discontinuation and the development of serious clinical events. Starting cART earlier, to reduce the proportion of patients with a low CD4 count, may decrease the rate of many common non-AIDS-related malignancies. Individuals in Eastern Europe had an increased risk of mortality from AIDS-related causes, in part due to differences in the use of effective cART. In conclusion, the results of this thesis provide evidence that could help improve the long-term durability of cART for HIV-positive patients through a range of healthcare measures capturing the broad spectrum of treatment and outcomes.
Frequency of HIV-1 Viral Load Monitoring of Patients Initially Successfully Treated with Combination Antiretroviral Therapy
BACKGROUND: Although considered an essential tool for monitoring the effect of combination antiretroviral treatment (CART), HIV-1 RNA (viral load, VL) testing is greatly influenced by cost and the availability of resources.
OBJECTIVES: To examine whether HIV-infected patients who were initially treated successfully with CART have less frequent monitoring of VL over time, and whether CART failure and other HIV-disease and sociodemographic characteristics are associated with less frequent VL testing.
METHODS: The study included patients who started CART in the period 1999-2004, were older than 18 years, were CART naive, had two consecutive viral load measurements of <400 copies/ml after 5 months of treatment, and had continuous CART during the first 15 months. The time between two consecutive visits (in days) was the outcome, and associated factors were assessed using linear mixed models.
RESULTS: We analyzed a total of 128 patients with 1683 visits through December 2009. CART failure was observed in 31 (24%) patients. When adjusted for follow-up time, the mean interval between two consecutive VL tests taken in patients before CART failure (155.2 days) was almost identical to the interval in patients who did not fail CART (155.3 days). On multivariable analysis, the adjusted estimated time between visits was 150.9 days before 2003 and 177.6 days in 2008/2009. A longer time between visits was observed in seafarers compared to non-seafarers (mean difference, 30.7 days; 95% CI, 14.0 to 47.4; p<0.001) and in individuals who lived more than 160 kilometers from the HIV treatment center (mean difference, 16 days; p=0.010).
CONCLUSIONS: Less frequent monitoring of VL became common in recent years and was not associated with failure. We identified seafarers as a population with special needs for CART monitoring and delivery.
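The visit-interval analysis in the METHODS section lends itself to a short sketch. The fragment below, written against statsmodels, fits a linear mixed model for the days between consecutive visits with a random intercept per patient; the file name and columns (interval_days, period, seafarer, far_from_center, patient_id) are hypothetical stand-ins, not the study's actual variables.

```python
# A minimal sketch, assuming hypothetical column names: one row per pair
# of consecutive visits, with the interval in days as the outcome and a
# random intercept per patient to account for repeated measurements.
import pandas as pd
import statsmodels.formula.api as smf

visits = pd.read_csv("visits.csv")  # hypothetical input file

model = smf.mixedlm(
    "interval_days ~ C(period) + seafarer + far_from_center",
    data=visits,
    groups=visits["patient_id"],
)
result = model.fit()
print(result.summary())  # fixed effects, e.g. the extra days for seafarers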
Imaging glial activation in patients with post-treatment Lyme disease symptoms: A pilot study using [11C]DPA-713 PET
The pathophysiology of post-treatment Lyme disease syndrome (PTLDS) may be linked to overactive immunity, including aberrant activity of the brain's resident immune cells, microglia. Here we used [11C]DPA-713 and positron emission tomography to quantify the 18 kDa translocator protein (TSPO), a marker of activated microglia or reactive astrocytes, in the brains of patients with post-treatment Lyme disease symptoms of any duration compared to healthy controls. Genotyping for the TSPO rs6971 polymorphism was completed, and individuals with the rare, low-affinity binding genotype were excluded. Data from eight brain regions demonstrated higher [11C]DPA-713 binding in 12 patients relative to 19 controls. [11C]DPA-713 PET is a promising tool to study cerebral glial activation in PTLDS and its link to cognitive symptoms.
Trade-offs between vegetative growth and acorn production in Quercus lobata during a mast year: the relevance of crop size and hierarchical level within the canopy
The concept of trade-offs between reproduction and other fitness traits is a fundamental principle of life history theory. For many plant species, the cost of sexual reproduction affects vegetative growth in years of high seed production through the allocation of resources to reproduction at different hierarchical levels of canopy organization. We examined these trade-offs at the shoot and branch level in an endemic California oak, Quercus lobata, during a mast year. To determine whether acorn production caused a reduction in vegetative growth, we studied both high and low acorn producers. We observed that in both groups, shoots without acorns located adjacent to reproductive shoots showed reduced vegetative growth, but that reduced branch-level growth on acorn-bearing branches occurred only in low acorn producers. The availability of local resources, measured as previous-year growth, was the main factor determining acorn biomass. These findings show that the costs of reproduction varied among hierarchical levels, suggesting some degree of physiological autonomy of shoots in terms of acorn production. Costs also differed among trees with different acorn crops, suggesting that trees with large acorn crops had more resources available to allocate to growth and acorn production and to compensate for the immediate local costs of seed production. These findings provide new insight into the proximate mechanisms underlying mast-seeding as a reproductive strategy.
Relationship between Reproductive Allocation and Relative Abundance among 32 Species of a Tibetan Alpine Meadow: Effects of Fertilization and Grazing
Background: Understanding the relationship between species traits and species abundance is an important goal in ecology and biodiversity science. Although theoretical studies predict that traits related to performance (e.g. reproductive allocation) are most directly linked to species abundance within a community, empirical investigations have rarely been done. It also remains unclear how environmental factors such as grazing or fertilizer application affect the predicted relationship.
Methodology: We conducted a 3-year field experiment in a Tibetan alpine meadow to assess the relationship between plant reproductive allocation (RA) and species relative abundance (SRA) on control, grazed and fertilized plots. Overall, the studied plant community contained 32 common species.
Principal Findings: At the treatment level, (i) RA was negatively correlated with SRA on control plots and during the first year on fertilized plots; (ii) no negative RA–SRA correlations were observed on grazed plots or during the second and third years on fertilized plots; (iii) seed size was positively correlated with SRA on control plots. At the plot level, the correlation between SRA and RA was not affected by treatment, year or species composition.
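For readers who want to see the shape of this analysis, here is a minimal sketch of a species-level rank correlation between RA and SRA; the arrays are randomly generated placeholders (one value per species), and a Spearman test is only one plausible choice of statistic.

```python
# Minimal sketch: rank correlation between reproductive allocation (RA)
# and species relative abundance (SRA); values are random placeholders.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)
ra = rng.uniform(0.05, 0.60, size=32)    # RA, one value per species
sra = rng.uniform(0.001, 0.20, size=32)  # SRA, one value per species

rho, p = spearmanr(ra, sra)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```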
Conclusions/Significance: Our study shows that the performance-related trait RA can negatively affect SRA within communities, possibly due to trade-offs between clonal growth (for space occupancy) and sexual reproduction. We propose that if different species occupy different positions along these trade-offs, this will contribute to biodiversity maintenance in local communities or even at larger scales.
Clinical Predictors of Immune Reconstitution following Combination Antiretroviral Therapy in Patients from the Australian HIV Observational Database
A small but significant number of patients do not achieve CD4 T-cell counts >500 cells/µl despite years of suppressive cART. These patients remain at risk of AIDS and non-AIDS-defining illnesses. The aim of this study was to identify clinical factors associated with CD4 T-cell recovery following long-term cART. Patients meeting the following inclusion criteria were selected from the Australian HIV Observational Database (AHOD): cART as their first regimen, initiated at a CD4 T-cell count <500 cells/µl, with HIV RNA <500 copies/ml after 6 months of cART and sustained for at least 12 months. The Cox proportional hazards model was used to identify determinants of the time to achieve CD4 T-cell counts >500 cells/µl and >200 cells/µl. Of the 2853 patients in AHOD, 501 were eligible for inclusion. The median (IQR) age and baseline CD4 T-cell count were 39 (32-47) years and 236 (130-350) cells/µl, respectively. A major strength of this study is the long follow-up duration: median (IQR) 6.5 (3-10) years. Most patients (80%) achieved CD4 T-cell counts >500 cells/µl, but in 8% this took more than 5 years. Among the patients who failed to reach a CD4 T-cell count >500 cells/µl, 16% received cART for more than 10 years. In a multivariate analysis, a faster time to achieving a CD4 T-cell count >500 cells/µl was associated with a higher baseline CD4 T-cell count (p<0.001), younger age (p = 0.019) and treatment initiation with a protease inhibitor (PI)-based regimen (vs. a non-nucleoside reverse transcriptase inhibitor, NNRTI; p = 0.043). Factors associated with achieving CD4 T-cell counts >200 cells/µl included a higher baseline CD4 T-cell count (p<0.001), not having a prior AIDS-defining illness (p = 0.018) and a higher baseline HIV RNA (p<0.001). The time taken to achieve a CD4 T-cell count >500 cells/µl despite long-term cART is prolonged in a subset of patients in AHOD. Starting cART early with a PI-based regimen (vs. an NNRTI-based regimen) is associated with more rapid recovery of a CD4 T-cell count >500 cells/µl.
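As a hedged illustration of the modelling step, the sketch below fits a Cox proportional hazards model for time to a CD4 T-cell count >500 cells/µl using lifelines; the data file and columns (time_to_cd4_500, reached_cd4_500, baseline_cd4, age, pi_based) are assumptions for the example, not AHOD's actual variables.

```python
# A minimal sketch of the time-to-CD4-recovery model, assuming
# hypothetical column names; hazard ratios >1 indicate faster recovery.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("ahod_cohort.csv")  # hypothetical input file

cph = CoxPHFitter()
cph.fit(
    df[["time_to_cd4_500", "reached_cd4_500", "baseline_cd4", "age", "pi_based"]],
    duration_col="time_to_cd4_500",  # years from cART initiation
    event_col="reached_cd4_500",     # 1 = count crossed 500 cells/µl
)
cph.print_summary()
```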
CD4 Cell Count and the Risk of AIDS or Death in HIV-Infected Adults on Combination Antiretroviral Therapy with a Suppressed Viral Load: A Longitudinal Cohort Study from COHERE
Background
Most adults infected with HIV achieve viral suppression within a year of starting combination antiretroviral therapy (cART). It is important to understand the risk of AIDS events or death for patients with a suppressed viral load.
Methods and Findings
Using data from the Collaboration of Observational HIV Epidemiological Research Europe (2010 merger), we assessed the risk of a new AIDS-defining event or death in successfully treated patients. We accumulated episodes of viral suppression for each patient while on cART, each episode beginning with the second of two consecutive plasma viral load measurements <50 copies/µl and ending with either a measurement >500 copies/µl, the first of two consecutive measurements between 50–500 copies/µl, cART interruption or administrative censoring. We used stratified multivariate Cox models to estimate the association between time-updated CD4 cell count and a new AIDS event or death, or death alone. 75,336 patients contributed 104,265 suppression episodes and were suppressed while on cART for a median of 2.7 years. The mortality rate was 4.8 per 1,000 years of viral suppression. A higher CD4 cell count was always associated with a reduced risk of a new AIDS event or death, with a hazard ratio per 100 cells/µl (95% CI) of 0.35 (0.30–0.40) for counts <200 cells/µl, 0.81 (0.71–0.92) for counts 200 to <350 cells/µl, 0.74 (0.66–0.83) for counts 350 to <500 cells/µl, and 0.96 (0.92–0.99) for counts ≥500 cells/µl. A higher CD4 cell count became even more beneficial over time for patients with CD4 cell counts <200 cells/µl.
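A time-updated covariate such as the CD4 cell count here is typically handled with counting-process (start/stop) data. Below is a simplified sketch using lifelines' CoxTimeVaryingFitter, omitting the stratification the text mentions; the file and column names are hypothetical.

```python
# Simplified sketch: one row per interval of constant CD4 within a
# suppression episode; column names are hypothetical.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

episodes = pd.read_csv("suppression_episodes.csv")
# Assumed columns: episode_id, start, stop,
# cd4_per_100 (time-updated CD4 in units of 100 cells/µl),
# event (1 = new AIDS event or death at 'stop').

ctv = CoxTimeVaryingFitter()
ctv.fit(
    episodes,
    id_col="episode_id",
    event_col="event",
    start_col="start",
    stop_col="stop",
)
ctv.print_summary()  # coefficient on cd4_per_100 -> HR per 100 cells/µl
```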
Conclusions
Despite the low mortality rate, the risk of a new AIDS event or death follows a CD4 cell count gradient in patients with viral suppression. A higher CD4 cell count was associated with the greatest benefit for patients with a CD4 cell count <200 cells/µl, but still provided some slight benefit for those with a CD4 cell count ≥500 cells/µl.
Comparison of recording of hepatitis B infection in the NSW Perinatal Data Collection with linked hepatitis B notifications
Objective: Results of routine maternal antenatal hepatitis B virus (HBV) screening have been recorded in the New South Wales (NSW) Perinatal Data Collection (PDC) since January 2011. We evaluated the accuracy of this reporting in 2012, the first year for which comprehensive data were available, by linking the PDC to HBV notifications. Methods: PDC records of mothers giving birth in 2012 were probabilistically linked to HBV notifications recorded in the NSW Notifiable Conditions Information Management System (NCIMS). The sensitivity and specificity of the PDC record of HBV status were determined using a linked HBV notification from the NCIMS database as the gold standard. Results were also examined according to health service factors (area health service, hospital level, public or private) and individual factors (maternal age, country of birth, Aboriginality, parity, timing of first antenatal visit). Results: Among 99,510 records of women giving birth in NSW in 2012, positive HBV status was recorded for 0.69% of the women according to the PDC record and 0.90% according to linked NCIMS records. The overall sensitivity of the HBV status variable in the PDC data was 65.5% (95% confidence interval [CI] 62.4, 68.7) and the positive predictive value was 85.3% (95% CI 82.6, 87.9). In general, the low prevalence of HBV meant we had limited statistical power to assess differences between health service factors and maternal factors; however, sensitivity was significantly lower in the PDC data for Australian-born non-Aboriginal women (37.0%; 95% CI 27.5, 46.7) than for overseas-born women (69.9%; 95% CI 66.6, 73.1; p < 0.001). Conclusions: PDC records of HBV status for women giving birth in 2012 had high specificity but poor sensitivity. Sensitivity varied across area health services and levels of maternal services, and by various maternal factors. Because the results of maternal HBV screening can be used to monitor HBV prevalence in adults, analysis of PDC records in subsequent years is necessary to track whether sensitivity improves over time.
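As a back-of-envelope check on the figures above, the sketch below recomputes sensitivity and positive predictive value from an approximate 2×2 table. The cell counts are reconstructed from the reported prevalences (0.69% of 99,510 flagged in the PDC, 0.90% notified in NCIMS) and are illustrative approximations, not the study's actual data.

```python
# Rough reconstruction of the PDC-vs-NCIMS 2x2 table from the reported
# percentages; these counts are approximations for illustration only.
total = 99_510
pdc_positive = round(total * 0.0069)    # ~687 flagged in the PDC
ncims_positive = round(total * 0.0090)  # ~896 linked notifications

tp = 586                  # assumed true positives (~0.853*687, ~0.655*896)
fp = pdc_positive - tp    # PDC-positive without a notification
fn = ncims_positive - tp  # notified but missed by the PDC
tn = total - tp - fp - fn

sensitivity = tp / (tp + fn)   # vs. reported 65.5%
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)           # vs. reported 85.3%

print(f"sensitivity = {sensitivity:.1%}")
print(f"specificity = {specificity:.1%}")
print(f"PPV         = {ppv:.1%}")
```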