370 research outputs found
Iridotomy to slow progression of visual field loss in angle-closure glaucoma
BACKGROUND: Primary angle-closure glaucoma is a type of glaucoma associated with a physically obstructed anterior chamber angle. Obstruction of the anterior chamber angle blocks drainage of fluids (aqueous humor) within the eye and may raise intraocular pressure (IOP). Elevated IOP is associated with glaucomatous optic nerve damage and visual field loss. Laser peripheral iridotomy (often just called 'iridotomy') is a procedure to eliminate pupillary block by allowing aqueous humor to pass directly from the posterior to anterior chamber through use of a laser to create a hole in the iris. It is commonly used to treat patients with primary angle-closure glaucoma, patients with primary angle closure (narrow angles and no signs of glaucomatous optic neuropathy), and patients who are primary angle-closure suspects (patients with reversible obstruction). The effectiveness of iridotomy on slowing progression of visual field loss, however, is uncertain. OBJECTIVES: To assess the effects of iridotomy compared with no iridotomy for primary angle-closure glaucoma, primary angle closure, and primary angle-closure suspects. SEARCH METHODS: We searched the Cochrane Central Register of Controlled Trials (CENTRAL; 2017, Issue 9) which contains the Cochrane Eyes and Vision Trials Register; MEDLINE Ovid; Embase Ovid; PubMed; LILACS; ClinicalTrials.gov; and the ICTRP. The date of the search was 18 October 2017. SELECTION CRITERIA: Randomized or quasi-randomized controlled trials that compared iridotomy to no iridotomy in primary angle-closure suspects, patients with primary angle closure, or patients with primary angle-closure glaucoma in one or both eyes were eligible. DATA COLLECTION AND ANALYSIS: Two authors worked independently to extract data on study characteristics, outcomes for the review, and risk of bias in the included studies. We resolved differences through discussion. 
MAIN RESULTS: We identified two trials (2502 eyes of 1251 participants) that compared iridotomy to no iridotomy. Both trials recruited primary angle-closure suspects from Asia and randomized one eye of each participant to iridotomy and the other to no iridotomy. Because the full trial reports are not yet available for both trials, no data are available to assess the effectiveness of iridotomy on slowing progression of visual field loss, change in IOP, need for additional surgeries, number of medications needed to control IOP, mean change in best-corrected visual acuity, and quality of life. Based on currently reported data, one trial showed evidence that iridotomy increases angle width at 18 months (by 12.70°, 95% confidence interval (CI) 12.06° to 13.34°, involving 1550 eyes, moderate-certainty evidence) and may be associated with IOP spikes at one hour after treatment (risk ratio 24.00 (95% CI 7.60 to 75.83), involving 1468 eyes, low-certainty evidence). The risk of bias of the two studies was overall unclear due to lack of availability of a full trial report. AUTHORS' CONCLUSIONS: The available studies that directly compared iridotomy to no iridotomy have not yet published full trial reports. At present, we cannot draw reliable conclusions based on randomized controlled trials as to whether iridotomy slows progression of visual field loss at one year compared to no iridotomy. Full publication of the results from the studies may clarify the benefits of iridotomy
Iridotomy to slow progression of angle-closure glaucoma
The objectives are as follows: The primary objective is to assess the role of iridotomy - compared with observation - in the prevention of visual field loss for individuals who have primary angle closure or primary angle-closure glaucoma in at least one eye. We will also examine the role of iridotomy in the prevention of elevated intraocular pressure (IOP) in individuals with narrow angles (primary angle-closure suspect) in at least one eye
Treatment exhaustion of highly active antiretroviral therapy (HAART) among individuals infected with HIV in the United Kingdom: multicentre cohort study
Objectives:
To investigate whether there is evidence that an increasing proportion of HIV infected patients is starting to experience increases in viral load and decreases in CD4 cell count that are consistent with exhaustion of available treatment options.
Design:
Multicentre cohort study.
Setting:
Six large HIV treatment centres in southeast England.
Participants:
All individuals seen for care between 1 January 1996 and 31 December 2002.
Main outcome measures:
Exposure to individual antiretroviral drugs and drug classes, CD4 count, plasma HIV RNA burden.
Results:
Information is available on 16 593 individuals (13 378 (80.6%) male patients, 10 340 (62.3%) infected via homosexual or bisexual sex, 4426 (26.7%) infected via heterosexual sex, median age 34 years). Overall, 10 207 of the 16 593 patients (61.5%) have been exposed to any antiretroviral therapy. This proportion increased from 41.2% of patients under follow up at the end of 1996 to 71.3% of those under follow up in 2002. The median CD4 count and HIV RNA burden of patients under follow up in each year changed from 270 cells/mm3 and 4.34 log10 copies/ml in 1996 to 408 cells/mm3 and 1.89 log10 copies/ml, respectively, in 2002. By 2002, 3060 (38%) of patients who had ever been treated with antiretroviral therapy had experienced all three main classes. Of these, around one quarter had evidence of "viral load failure" with all three classes. Patients with three-class failure were more likely to have an HIV RNA burden > 2.7 log10 copies/ml and a CD4 count < 200 cells/mm3.
Conclusions:
The proportion of individuals with HIV infection in the United Kingdom who have been treated has increased gradually over time. A substantial proportion of these patients seem to be in danger of exhausting their options for antiretroviral treatment. New drugs with low toxicity, which are not associated with cross resistance to existing drugs, are urgently needed for such patients
Relationship between untimed plasma lopinavir concentrations and virological outcome on second-line antiretroviral therapy
BACKGROUND: Resource constraints in low and middle-income countries necessitate practical approaches to optimizing antiretroviral therapy outcomes. We hypothesised that an untimed plasma lopinavir concentration (UPLC) at week 12 would predict loss of virological response in those taking lopinavir as part of a second-line antiretroviral regimen. METHODS: We measured plasma lopinavir concentration at week 12 on stored samples from the SECOND-LINE study. We characterized UPLC as: detectable and optimal (≥1000 μg/l); detectable but suboptimal (≥25 to < 1000 μg/l); and undetectable (<25 μg/l). We used Cox regression to explore the relationship between UPLC and loss of virological response over 48 weeks and backwards stepwise logistic regression to explore the relationship between UPLC and other predictors of virological failure at week 48. RESULTS: At week 48, we observed virological failure in 15/32 (47%) and 53/485 (11%) of patients with undetectable and detectable UPLC, respectively, P < 0.001. Both suboptimal [adjusted hazard ratio (HR) 2.94; 95% confidence interval (CI) 1.54-5.62; P = 0.001], and undetectable (adjusted HR 3.55; 95% CI 1.89-6.64; P < 0.001) UPLC were associated with higher rates of loss of virological response over 48 weeks. In multivariate analysis, an independent association with virological failure at week 48 and undetectable UPLC was observed after adjustment (odds ratio 5.48; 95% CI 2.23-13.42; P < 0.01). CONCLUSION: In low and middle-income countries implementing a public health approach to antiretroviral therapy treatment, an untimed plasma drug concentration may provide a practical method for early identification of patients with inadequate medication adherence and facilitate timely corrective interventions to prevent virological failure.
Gwamaka E. Mwasakifwa, Cecilia Moore, Dianne Carey, Janaki Amin, Paul Penteado, Marcelo Losso, Poh-Lian Lim, Lerato Mohapi, Jean-Michel Molina, Brian Gazzard, David A. Cooper, Mark Boyd, for the SECOND-LINE Study Group
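The three UPLC bands defined in the methods above can be sketched as a small classifier. This is an illustrative sketch only; the function name `classify_uplc` is hypothetical and not from the study, and the thresholds are the ones stated in the abstract (≥1000 μg/l optimal, ≥25 to <1000 μg/l suboptimal, <25 μg/l undetectable):

```python
def classify_uplc(concentration_ug_per_l: float) -> str:
    """Classify an untimed plasma lopinavir concentration (µg/l)
    into the three bands described in the SECOND-LINE substudy abstract.
    Hypothetical helper for illustration; not the study's code."""
    if concentration_ug_per_l < 25:
        return "undetectable"   # < 25 µg/l: below the assay detection limit
    elif concentration_ug_per_l < 1000:
        return "suboptimal"     # ≥ 25 to < 1000 µg/l: detectable but suboptimal
    else:
        return "optimal"        # ≥ 1000 µg/l: detectable and optimal
```

Note that the boundary values fall into the higher band: 25 µg/l is "suboptimal" and 1000 µg/l is "optimal", matching the ≥ signs in the abstract.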
Low levels of neurocognitive impairment detected in screening HIV-infected men who have sex with men: The MSM Neurocog Study
This study aimed to determine the prevalence of HIV neurocognitive impairment in HIV-infected men who have sex with men aged 18–50 years, using a simple battery of screening tests in routine clinical appointments. Those with suspected abnormalities were referred on for further assessment. The cohort was also followed up over time to look at evolving changes. HIV-infected participants were recruited at three clinical sites in London during routine clinical visits. They could be clinician or self-referred and did not need to be symptomatic. They completed questionnaires on anxiety, depression, and memory. They were then screened using the Brief Neurocognitive Screen (BNCS) and International HIV Dementia Scale (IHDS). Two hundred and five HIV-infected subjects were recruited. Of these, 59 patients were excluded as having a mood disorder and two patients were excluded due to insufficient data, leaving 144 patients for analysis. One hundred and twenty-four (86.1%) had a normal composite z score (within 1 SD of mean) calculated for their scores on the three component tests of the BNCS. Twenty (13.9%) had an abnormal z score, of which seven (35%) were symptomatic and 13 (65%) asymptomatic. Current employment and previous educational level were significantly associated with BNCS scores. Of those referred onwards for diagnostic testing, only one participant was found to have impairment likely related to HIV infection. We were able to easily screen for mood disorders and cognitive impairment in routine clinical practice. We identified a high level of depression and anxiety in our cohort. Using simple screening tests in clinic and an onward referral process for further testing, we were not able to identify neurocognitive impairment in this cohort at levels consistent with published data
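The composite z score logic described above (normal if within 1 SD of the mean across the three BNCS component tests) can be sketched as follows. This is a minimal sketch under stated assumptions: the function names and the normative means/SDs passed in are hypothetical, since the study's normative values are not given in the abstract, and the composite is assumed to be the mean of the per-test z scores:

```python
def composite_z(scores, norm_means, norm_sds):
    """Mean of per-test z scores across the three BNCS component tests.
    Assumption: the composite is the average of each test's z score
    against a normative mean/SD (normative values are hypothetical here)."""
    zs = [(s - m) / sd for s, m, sd in zip(scores, norm_means, norm_sds)]
    return sum(zs) / len(zs)

def is_abnormal(z, threshold=1.0):
    """Abnormal if the composite z falls outside 1 SD of the mean,
    per the 'normal = within 1 SD of mean' definition in the abstract."""
    return abs(z) > threshold
```

For example, a participant scoring at the normative mean on two tests and 1 SD above on the third would have a composite z of about 0.33 and be classed as normal.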
CD4 cell count and the risk of AIDS or death in HIV-Infected adults on combination antiretroviral therapy with a suppressed viral load: a longitudinal cohort study from COHERE.
BACKGROUND: Most adults infected with HIV achieve viral suppression within a year of starting combination antiretroviral therapy (cART). It is important to understand the risk of AIDS events or death for patients with a suppressed viral load.
METHODS AND FINDINGS: Using data from the Collaboration of Observational HIV Epidemiological Research Europe (2010 merger), we assessed the risk of a new AIDS-defining event or death in successfully treated patients. We accumulated episodes of viral suppression for each patient while on cART, each episode beginning with the second of two consecutive plasma viral load measurements <50 copies/ml, and ending with either a measurement >500 copies/ml, the first of two consecutive measurements between 50-500 copies/ml, cART interruption or administrative censoring. We used stratified multivariate Cox models to estimate the association between time-updated CD4 cell count and a new AIDS event or death or death alone. 75,336 patients contributed 104,265 suppression episodes and were suppressed while on cART for a median 2.7 years. The mortality rate was 4.8 per 1,000 years of viral suppression. A higher CD4 cell count was always associated with a reduced risk of a new AIDS event or death; with a hazard ratio per 100 cells/µl (95% CI) of: 0.35 (0.30-0.40) for counts <200 cells/µl, 0.81 (0.71-0.92) for counts 200 to <350 cells/µl, 0.74 (0.66-0.83) for counts 350 to <500 cells/µl, and 0.96 (0.92-0.99) for counts ≥500 cells/µl. A higher CD4 cell count became even more beneficial over time for patients with CD4 cell counts <200 cells/µl.
CONCLUSIONS: Despite the low mortality rate, the risk of a new AIDS event or death follows a CD4 cell count gradient in patients with viral suppression. A higher CD4 cell count was associated with the greatest benefit for patients with a CD4 cell count <200 cells/µl but still some slight benefit for those with a CD4 cell count ≥500 cells/µl
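The episode-accumulation rule in the methods above (an episode starts at the second of two consecutive measurements below 50 copies/ml, and ends at a measurement above 500 copies/ml or at the first of two consecutive measurements in the 50-500 range) can be sketched as a boundary-finding function. This is an illustrative sketch, not the study's code: the function name is hypothetical, and cART interruption and administrative censoring are not modelled, so an episode with no qualifying end simply runs to the last measurement:

```python
def suppression_episode_bounds(viral_loads):
    """Given a chronological list of plasma viral load measurements
    (copies/ml), return (start_idx, end_idx) of the first suppression
    episode, or None if no episode ever starts.

    Start: index of the second of two consecutive measurements < 50.
    End:   index of a measurement > 500, or of the first of two
           consecutive measurements in the 50-500 range; otherwise
           the episode is censored at the last measurement.
    Hypothetical sketch; interruption/censoring dates are not modelled."""
    start = None
    for i in range(1, len(viral_loads)):
        if viral_loads[i - 1] < 50 and viral_loads[i] < 50:
            start = i
            break
    if start is None:
        return None
    for j in range(start + 1, len(viral_loads)):
        vl = viral_loads[j]
        if vl > 500:
            return (start, j)            # single rebound > 500 ends the episode
        if 50 <= vl <= 500:
            nxt = viral_loads[j + 1] if j + 1 < len(viral_loads) else None
            if nxt is not None and 50 <= nxt <= 500:
                return (start, j)        # first of two consecutive 50-500 values
    return (start, len(viral_loads) - 1) # censored at the last measurement
```

Note that a single isolated measurement in the 50-500 range (a "blip") does not end the episode under this rule; only a confirmed pair, or a single value above 500, does.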