
    Primary radiotherapy in progressive optic nerve sheath meningiomas: a long-term follow-up study

    Background/aims: To report the outcome of primary radiotherapy in patients with progressive optic nerve sheath meningioma (ONSM). Methods: The clinical records of all patients were reviewed in a retrospective, observational, multicentre study. Results: Thirty-four consecutive patients were included. Twenty-six women and eight men received conventional or stereotactic fractionated radiotherapy and were followed for a median of 58 (range 51–156) months. Fourteen eyes (41%) showed improved visual acuity of at least two lines on the Snellen chart. In 17 eyes (50%) vision stabilised, while deterioration was noted in three eyes (9%). Visual outcome was not associated with age at the time of radiotherapy (p=0.83), sex (p=0.43), visual acuity at presentation (p=0.22) or type of radiotherapy (p=0.35). Optic disc swelling was associated with improved visual acuity (p<0.01), and 4/11 patients with optic atrophy also showed improvement. Long-term complications were dry eyes in five patients, cataracts in three, and mild radiation retinopathy in four. Conclusion: Primary radiotherapy for patients with ONSM is associated with long-term improvement of visual acuity and few adverse effects.
    Peerooz Saeed, Leo Blank, Dinesh Selva, John G. Wolbers, Peter J.C.M. Nowak, Ronald B. Geskus, Ezekiel Weis, Maarten P. Mourits, Jack Rootman

    Anal Lymphogranuloma Venereum Infection Screening With IgA Anti-Chlamydia trachomatis-Specific Major Outer Membrane Protein Serology


    Abundance of Early Functional HIV-Specific CD8+ T Cells Does Not Predict AIDS-Free Survival Time

    Background: T-cell immunity is thought to play an important role in controlling HIV infection, and is a main target for HIV vaccine development. HIV-specific central memory CD8+ and CD4+ T cells producing IFNγ and IL-2 have been associated with control of viremia and are therefore hypothesized to be truly protective and to determine subsequent clinical outcome. However, the cause-effect relationship between HIV-specific cellular immunity and disease progression is unknown. We investigated, in a large prospective cohort study involving 96 individuals of the Amsterdam Cohort Studies with a known date of seroconversion, whether the presence of cytokine-producing HIV-specific CD8+ T cells early in infection was associated with AIDS-free survival time. Methods and Findings: The number and percentage of IFNγ- and IL-2-producing CD8+ T cells were measured after in vitro stimulation with an overlapping Gag-peptide pool in T cells sampled approximately one year after seroconversion. Kaplan-Meier survival analysis and Cox proportional hazards models showed that frequencies of cytokine-producing Gag-specific CD8+ T cells (IFNγ, IL-2 or both) shortly after seroconversion were associated neither with time to AIDS nor with the rate of CD4+ T-cell decline. Conclusions: These data show that high numbers of functional HIV-specific CD8+ T cells can be found early in HIV infection, irrespective of subsequent clinical outcome. The fact that both progressors and long-term non-progressors have abundant T-cell immunity of the specificity associated with low viral load shortly after seroconversion suggests that the more rapid loss of T-cell immunity observed in progressors may be a consequence rather than a cause of disease progression
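
    A minimal sketch of the kind of analysis described above, using the lifelines library: Kaplan-Meier curves stratified by Gag-specific CD8+ T-cell frequency and a Cox proportional hazards model for time to AIDS. The data are simulated and all column names are illustrative; this is not the Amsterdam Cohort Studies dataset or the authors' code.

```python
# Sketch of the survival analysis described above: Kaplan-Meier curves by
# tertile of Gag-specific CD8+ T-cell frequency and a Cox model for time to
# AIDS. Column names and the simulated data are illustrative only.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

rng = np.random.default_rng(0)
n = 96
df = pd.DataFrame({
    # % of CD8+ T cells producing IFN-gamma and/or IL-2 after Gag stimulation
    "gag_cd8_freq": rng.lognormal(mean=-1.0, sigma=1.0, size=n),
    # years from seroconversion to AIDS or censoring
    "years_to_aids": rng.exponential(scale=8.0, size=n),
    "aids": rng.integers(0, 2, size=n),          # 1 = AIDS, 0 = censored
})

# Kaplan-Meier estimates stratified by tertile of T-cell frequency
df["tertile"] = pd.qcut(df["gag_cd8_freq"], 3, labels=["low", "mid", "high"])
kmf = KaplanMeierFitter()
for level, grp in df.groupby("tertile", observed=True):
    kmf.fit(grp["years_to_aids"], event_observed=grp["aids"], label=str(level))
    print(level, kmf.median_survival_time_)

# Cox proportional hazards model with frequency as a continuous covariate
cph = CoxPHFitter()
cph.fit(df[["gag_cd8_freq", "years_to_aids", "aids"]],
        duration_col="years_to_aids", event_col="aids")
cph.print_summary()
```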

    Does rapid HIV disease progression prior to combination antiretroviral therapy hinder optimal CD4 + T-cell recovery once HIV-1 suppression is achieved?

    Objective: This article compares trends in CD4+ T-cell recovery and the proportions achieving optimal restoration (≥500 cells/µl) after viral suppression following combination antiretroviral therapy (cART) initiation between rapid and nonrapid progressors. Methods: We included HIV-1 seroconverters achieving viral suppression within 6 months of cART. Rapid progressors were individuals experiencing at least one CD4+ T-cell count below 200 cells/µl within 12 months of seroconversion, before cART. We used piecewise linear mixed models for CD4+ T-cell trends and logistic regression for optimal restoration. Results: Of 4024 individuals, 294 (7.3%) were classified as rapid progressors. At the same CD4+ T-cell count at cART start (baseline), rapid progressors experienced faster CD4+ T-cell increases than nonrapid progressors in the first month [difference (95% confidence interval) in mean increase/month (square root scale): 1.82 (1.61; 2.04)], which reversed to slightly slower increases in months 1–18 [-0.05 (-0.06; -0.03)] and no significant difference in months 18–60 [-0.003 (-0.01; 0.01)]. The percentage achieving optimal restoration was significantly lower for rapid progressors than nonrapid progressors at months 12 (29.2 vs. 62.5%) and 36 (47.1 vs. 72.4%) but not at month 60 (70.4 vs. 71.8%). These differences disappeared after adjusting for baseline CD4+ T-cell count: odds ratio (95% confidence interval) 0.86 (0.61; 1.20), 0.90 (0.38; 2.17) and 1.56 (0.55; 4.46) at months 12, 36 and 60, respectively. Conclusion: Among people on suppressive antiretroviral therapy, rapid progressors experience faster initial increases in CD4+ T-cell counts than nonrapid progressors, but are less likely to achieve optimal restoration during the first 36 months after cART, mainly because of lower CD4+ T-cell counts at cART initiation
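
    A rough sketch of the two models named in the Methods, using statsmodels: a piecewise linear mixed model for square-root CD4+ T-cell counts with slope changes at 1 and 18 months after cART, and a logistic model for optimal restoration. The data are simulated, and the variable names, slope coding and restoration window are illustrative assumptions rather than the authors' exact specification.

```python
# Sketch of the piecewise linear mixed model (knots at 1 and 18 months,
# square-root CD4 scale) and the logistic model for optimal restoration
# (>=500 cells/ul). Simulated data; variable names are illustrative.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_pat, n_obs = 200, 8
pid = np.repeat(np.arange(n_pat), n_obs)
months = np.tile(np.linspace(0, 60, n_obs), n_pat)
rapid = np.repeat(rng.integers(0, 2, n_pat), n_obs)
cd4 = (380 + 4 * months - 80 * rapid
       + rng.normal(0, 100, n_pat * n_obs)).clip(10)

df = pd.DataFrame({"pid": pid, "months": months, "rapid": rapid,
                   "sqrt_cd4": np.sqrt(cd4)})
# Piecewise slopes: 0-1, 1-18 and 18-60 months since cART start
df["s1"] = df["months"].clip(upper=1)
df["s2"] = (df["months"] - 1).clip(lower=0, upper=17)
df["s3"] = (df["months"] - 18).clip(lower=0)

# Linear mixed model with a random intercept per patient and
# rapid-progressor interactions with each slope segment
mixed = smf.mixedlm("sqrt_cd4 ~ (s1 + s2 + s3) * rapid", df,
                    groups=df["pid"]).fit()
print(mixed.summary())

# Logistic regression: optimal restoration (CD4 >= 500) around month 36
# (the paper additionally adjusts for CD4 count at cART start; omitted here)
m36 = df[df["months"].between(30, 42)].groupby("pid").last().reset_index()
m36["optimal"] = (m36["sqrt_cd4"] ** 2 >= 500).astype(int)
logit = smf.logit("optimal ~ rapid", m36).fit()
print(logit.summary())
```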

    How to handle mortality when investigating length of hospital stay and time to clinical stability

    Background: Hospital length of stay (LOS) and time for a patient to reach clinical stability (TCS) have increasingly become important outcomes when investigating ways in which to combat Community Acquired Pneumonia (CAP). Difficulties arise when deciding how to handle in-hospital mortality. Ad-hoc approaches that are commonly used to handle time-to-event outcomes with mortality can give disparate results and provide conflicting conclusions based on the same data. To ensure compatibility among studies investigating these outcomes, this type of data should be handled in a consistent and appropriate fashion. Methods: Using both simulated data and data from the international Community Acquired Pneumonia Organization (CAPO) database, we evaluate two ad-hoc approaches for handling mortality when estimating the probability of hospital discharge and clinical stability: 1) restricting analysis to those patients who lived, and 2) assigning individuals who die the "worst" outcome (right-censoring them at the longest recorded LOS or TCS). Estimated probability distributions based on these approaches are compared with right-censoring the individuals who died at time of death (the complement of the Kaplan-Meier (KM) estimator), and with treating death as a competing risk (the cumulative incidence estimator). Tests for differences in probability distributions based on the four methods are also contrasted. Results: The two ad-hoc approaches give different estimates of the probability of discharge and clinical stability. Analysis restricted to patients who survived is conceptually problematic, as estimation is conditioned on events that happen at a future time. Estimation based on assigning those patients who died the worst outcome (longest LOS and TCS) coincides with the complement of the KM estimator based on the subdistribution hazard, which has previously been shown to be equivalent to the cumulative incidence estimator. However, in either case the time to in-hospital mortality is ignored, preventing simultaneous assessment of patient mortality in addition to LOS and/or TCS. The power to detect differences in underlying hazards of discharge between patient populations differs for test statistics based on the four approaches, and depends on the underlying hazard ratio of mortality between the patient groups. Conclusions: Treating death as a competing risk gives estimators which address the clinical questions of interest, and allows for simultaneous modelling of both in-hospital mortality and TCS/LOS. This article advocates treating mortality as a competing risk when investigating other time-related outcomes.
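
    The contrast at the heart of this paper can be sketched with the lifelines library: the complement of the Kaplan-Meier estimator, with deaths right-censored at the time of death, versus the Aalen-Johansen cumulative incidence estimator that treats death as a competing risk. The length-of-stay data below are simulated, not the CAPO database.

```python
# Contrast of two estimators of the probability of hospital discharge:
# 1) one minus Kaplan-Meier with deaths right-censored at time of death, and
# 2) the cumulative incidence function treating death as a competing risk.
# Simulated LOS data; not the CAPO database.
import numpy as np
from lifelines import KaplanMeierFitter, AalenJohansenFitter

rng = np.random.default_rng(2)
n = 500
los = rng.gamma(shape=2.0, scale=4.0, size=n)          # days in hospital
# event: 1 = discharged alive, 2 = died in hospital, 0 = administratively censored
event = rng.choice([1, 2, 0], size=n, p=[0.80, 0.12, 0.08])

# 1) Complement of KM, censoring deaths at their time of death
kmf = KaplanMeierFitter()
kmf.fit(los, event_observed=(event == 1), label="discharge (1 - KM)")
one_minus_km = 1 - kmf.survival_function_

# 2) Cumulative incidence of discharge with death as a competing risk
ajf = AalenJohansenFitter()
ajf.fit(los, event, event_of_interest=1)
cif = ajf.cumulative_density_

# The KM complement overstates the probability of discharge whenever
# in-hospital deaths occur, because censored patients are implicitly
# assumed to remain at risk of discharge.
print(one_minus_km.tail())
print(cif.tail())
```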

    HTLV-1 and HIV-2 Infection Are Associated with Increased Mortality in a Rural West African Community

    BACKGROUND: Survival of people with HIV-2 and HTLV-1 infection is better than that of HIV-1 infected people, but long-term follow-up data are rare. We compared mortality rates of HIV-1, HIV-2, and HTLV-1 infected subjects with those of retrovirus-uninfected people in a rural community in Guinea-Bissau. METHODS: In 1990, 1997 and 2007, adult residents (aged ≥15 years) were interviewed, a blood sample was drawn and retroviral status was determined. An annual census was used to ascertain the vital status of all subjects. Cox regression analysis was used to estimate mortality hazard ratios (HR), comparing retrovirus-infected versus uninfected people. RESULTS: A total of 5376 subjects were included; 197 with HIV-1, 424 with HIV-2 and 325 with HTLV-1 infection. The median follow-up time was 10.9 years (range 0.0-20.3). The crude mortality rates were 9.6 per 100 person-years of observation (95% confidence interval 7.1-12.9) for HIV-1, 4.1 (3.4-5.0) for HIV-2, 3.6 (2.9-4.6) for HTLV-1, and 1.6 (1.5-1.8) for retrovirus-negative subjects. The HR comparing the mortality rate of infected to that of uninfected subjects varied significantly with age. The adjusted HR for HIV-1 infection varied from 4.0 in the oldest age group (≥60 years) to 12.7 in the youngest (15-29 years). The HR for HIV-2 infection varied from 1.2 (oldest) to 9.1 (youngest), and for HTLV-1 infection from 1.2 (oldest) to 3.8 (youngest). CONCLUSIONS: HTLV-1 infection is associated with significantly increased mortality. The mortality rate of HIV-2 infection, although lower than that of HIV-1 infection, is also increased, especially among young people
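
    A hedged sketch of the Cox regression described above, with an infection-status by age-group interaction so that the hazard ratio versus uninfected subjects can differ by age. The data are simulated with illustrative column names; the group sizes are taken loosely from the abstract, and the covariate set is an assumption.

```python
# Sketch of the Cox regression used above: mortality hazard ratios for
# HIV-1, HIV-2 and HTLV-1 infection relative to uninfected subjects, with
# an infection-by-age-group interaction. Simulated data, illustrative names.
# (In this sketch the reference level of each factor is whichever sorts first.)
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 5376
df = pd.DataFrame({
    "status": rng.choice(["neg", "hiv1", "hiv2", "htlv1"], size=n,
                         p=[0.824, 0.037, 0.079, 0.060]),
    "age_group": rng.choice(["15-29", "30-44", "45-59", "60+"], size=n),
    "sex": rng.choice(["f", "m"], size=n),
    "years": rng.exponential(scale=10.9, size=n).clip(0.1, 20.3),
    "died": (rng.random(n) < 0.25).astype(int),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="died",
        formula="status * age_group + sex")
cph.print_summary()

# Hazard ratios are exp(coef); the interaction terms let the
# infected-vs-uninfected HR vary by age group, as reported in the abstract.
print(np.exp(cph.params_))
```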

    ICE COLD ERIC – International collaborative effort on chronic obstructive lung disease: exacerbation risk index cohorts – Study protocol for an international COPD cohort study

    Background: Chronic Obstructive Pulmonary Disease (COPD) is a systemic disease; morbidity and mortality due to COPD are on the increase, and it has a great impact on patients' lives. Most COPD patients are managed by general practitioners (GPs). Too often, GPs base their initial assessment of a patient's disease severity mainly on lung function. However, lung function correlates poorly with COPD-specific health-related quality of life and exacerbation frequency. A validated COPD disease risk index that better represents the clinical manifestations of COPD and is feasible in primary care would therefore be useful. The objective of this study is to develop and validate a practical COPD disease risk index that predicts the clinical course of COPD in primary care patients with GOLD stages 2–4. Methods/Design: We will conduct two linked prospective cohort studies with COPD patients from GPs in Switzerland and the Netherlands. We will perform a baseline assessment including detailed patient history, questionnaires, lung function, history of exacerbations, measurement of exercise capacity and blood sampling. During a follow-up of at least 2 years, we will update the patients' profiles by registering exacerbations, health-related quality of life and any changes in the use of medication. The primary outcome will be health-related quality of life. Secondary outcomes will be exacerbation frequency and mortality. Using multivariable regression analysis, we will identify the best combination of variables predicting these outcomes over one and two years and, depending on funding, even more years. Discussion: Despite the diversity of clinical manifestations and available treatments, assessment and management today do not reflect the multifaceted character of the disease. This is in contrast to preventive cardiology where, nowadays, treatment in primary care is based on a patient-specific and fairly refined cardiovascular risk profile corresponding to differences in prognosis. After completion of this study, we will have a practical COPD disease risk index that predicts the clinical course of COPD in primary care patients with GOLD stages 2–4. In a second step we will incorporate evidence-based treatment effects into this model, such that the instrument may guide physicians in selecting treatment based on the individual patient's prognosis. Trial registration: ClinicalTrials.gov Archive NCT00706602
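
    As a rough illustration of the planned analysis, the sketch below fits a multivariable regression of a simulated one-year health-related quality of life score on candidate baseline predictors and compares it with a lung-function-only model. All variable names, effect sizes and data are invented for illustration; the protocol does not specify this exact model.

```python
# Sketch of the planned analysis: multivariable regression selecting a
# combination of baseline variables that predicts health-related quality of
# life (HRQoL) at follow-up. Variable names and data are illustrative only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 400
df = pd.DataFrame({
    "fev1_pct": rng.normal(55, 15, n),            # lung function, % predicted
    "exacerbations_prev_yr": rng.poisson(1.2, n),
    "mrc_dyspnoea": rng.integers(1, 6, n),
    "six_min_walk_m": rng.normal(380, 90, n),     # exercise capacity
    "crp": rng.lognormal(1.0, 0.8, n),            # blood marker
})
# Simulated outcome: HRQoL score at one year (higher = worse)
df["hrqol_1yr"] = (60 - 0.2 * df["fev1_pct"] + 4 * df["exacerbations_prev_yr"]
                   + 3 * df["mrc_dyspnoea"] - 0.03 * df["six_min_walk_m"]
                   + rng.normal(0, 8, n))

full = smf.ols("hrqol_1yr ~ fev1_pct + exacerbations_prev_yr + mrc_dyspnoea"
               " + six_min_walk_m + crp", df).fit()
print(full.summary())

# A candidate risk index keeps only predictors that add to model fit,
# e.g. compared via AIC; lung function alone predicts poorly, echoing the
# rationale for a multidimensional index.
reduced = smf.ols("hrqol_1yr ~ fev1_pct", df).fit()
print(full.aic, reduced.aic)
```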

    Transmission Selects for HIV-1 Strains of Intermediate Virulence: A Modelling Approach

    Recent data show that HIV-1 is characterised by variation in viral virulence factors that is heritable between infections, which suggests that viral virulence can be naturally selected at the population level. A trade-off between transmissibility and duration of infection appears to favour viruses of intermediate virulence. We developed a mathematical model to simulate the dynamics of putative viral genotypes that differ in their virulence. As a proxy for virulence, we use set-point viral load (SPVL), the steady density of viral particles in blood during asymptomatic infection. Mutation, the dependency of survival and transmissibility on SPVL, and host effects were incorporated into the model. The model was fitted to data to estimate unknown parameters and was found to fit existing data well. The maximum likelihood estimates of the parameters produced a model in which SPVL converged from any initial conditions to observed values within 100–150 years of the first emergence of HIV-1. We estimated 1) the host effect and 2) the extent to which the viral virulence genotype mutates from one infection to the next, and found a trade-off between these two parameters in explaining the variation in SPVL. The model confirms that evolution of virulence towards intermediate levels is sufficiently rapid for it to have happened in the early stages of the HIV epidemic, and confirms that existing viral loads are nearly optimal given the assumed constraints on evolution. The model provides a useful framework under which to examine the future evolution of HIV-1 virulence
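
    A toy illustration of the trade-off the model rests on: per-contact transmissibility increases with set-point viral load while duration of asymptomatic infection decreases, so total transmission potential peaks at an intermediate SPVL. The functional forms and constants below are arbitrary assumptions chosen for illustration, not the fitted model from the paper.

```python
# Toy sketch of the transmissibility-duration trade-off: transmission
# potential = per-year transmission rate x duration of infection, both
# functions of set-point viral load (SPVL). Illustrative shapes only.
import numpy as np

log_spvl = np.linspace(2.0, 7.0, 200)        # log10 copies/ml

def transmission_rate(v):
    # saturating increase in infectiousness with SPVL (assumed shape)
    return 0.3 * v**2.5 / (v**2.5 + 4.5**2.5)

def duration_years(v):
    # faster progression (shorter infection) at higher SPVL (assumed shape)
    return 25.0 * np.exp(-0.6 * (v - 2.0))

potential = transmission_rate(log_spvl) * duration_years(log_spvl)
optimum = log_spvl[np.argmax(potential)]
print(f"SPVL maximising transmission potential: {optimum:.2f} log10 copies/ml")
```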

    Viral Load Levels Measured at Set-Point Have Risen Over the Last Decade of the HIV Epidemic in the Netherlands

    HIV-1 RNA plasma concentration at viral set-point is associated not only with disease outcome but also with the transmission dynamics of HIV-1. We investigated whether plasma HIV-1 RNA concentration and CD4 cell count at viral set-point have changed over time in the HIV epidemic in the Netherlands. We selected 906 therapy-naïve patients with at least one plasma HIV-1 RNA concentration measured 9 to 27 months after estimated seroconversion. Changes in HIV-1 RNA and CD4 cell count at viral set-point over time were analysed using linear regression models. The ATHENA national observational cohort contributed all patients who seroconverted in or after 1996; the Amsterdam Cohort Studies (ACS) contributed seroconverters before 1996. The mean of the first HIV-1 RNA concentration measured 9-27 months after seroconversion was 4.30 log10 copies/ml (95% CI 4.17-4.42) for seroconverters from 1984 through 1995 (n = 163); 4.27 (4.16-4.37) for seroconverters 1996-2002 (n = 232), and 4.59 (4.52-4.66) for seroconverters 2003-2007 (n = 511). Compared to patients seroconverting between 2003-2007, the adjusted mean HIV-1 RNA concentration at set-point was 0.28 log10 copies/ml (95% CI 0.16-0.40; p<0.0001) and 0.26 (0.11-0.41; p = 0.0006) lower for those seroconverting between 1996-2002 and 1984-1995, respectively. Results were robust regardless of type of HIV-1 RNA assay, HIV-1 subtype, and interval between measurement and seroconversion. CD4 cell count at viral set-point declined over calendar time at approximately 5 cells/mm3/year. The HIV-1 RNA plasma concentration at viral set-point has increased over the last decade of the HIV epidemic in the Netherlands. This is accompanied by a decreasing CD4 cell count over the period 1984-2007 and may have implications for both the course of the HIV infection and the epidemic
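
    A sketch of the trend analysis described above, using statsmodels: linear regression of log10 set-point HIV-1 RNA on seroconversion period (reference 2003-2007), adjusted for time since seroconversion, plus a second linear model for the calendar-time decline in CD4 count at set-point. The data are simulated around the reported means; column names and covariates are illustrative, not the ATHENA/ACS data.

```python
# Sketch of the trend analysis above: set-point viral load by seroconversion
# period and the calendar-time decline in CD4 count at set-point.
# Simulated data with illustrative column names.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 906
year = rng.integers(1984, 2008, size=n)
period = pd.cut(year, bins=[1983, 1995, 2002, 2007],
                labels=["1984-1995", "1996-2002", "2003-2007"]).astype(str)
shift = pd.Series(period).map({"1984-1995": -0.26, "1996-2002": -0.28,
                               "2003-2007": 0.0})
df = pd.DataFrame({
    "period": period,
    "serocon_year": year,
    "months_since_sc": rng.uniform(9, 27, size=n),
    "log10_rna": 4.59 + shift + rng.normal(0, 0.8, n),
    "cd4_setpoint": 560 - 5 * (year - 1984) + rng.normal(0, 150, n),
})

# Set-point viral load by seroconversion period (reference: 2003-2007)
vl = smf.ols("log10_rna ~ C(period, Treatment(reference='2003-2007'))"
             " + months_since_sc", df).fit()
print(vl.summary())

# CD4 cell count at set-point declining over calendar time (cells/mm3/year)
cd4 = smf.ols("cd4_setpoint ~ serocon_year", df).fit()
print(cd4.params["serocon_year"])
```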