
    Increasing Clinical Virulence in Two Decades of the Italian HIV Epidemic

    The recent origin and great evolutionary potential of HIV imply that the virulence of the virus might still be changing, which could greatly affect the future of the pandemic. However, previous studies of time trends in HIV virulence have yielded conflicting results. Here we used an established methodology to assess time trends in the severity (virulence) of untreated HIV infections in a large Italian cohort. We characterized clinical virulence by the decline slope of the CD4 count (n = 1423 patients) and the viral setpoint (n = 785 patients) in untreated patients with sufficient data points. We used linear regression models to detect correlations between the date of diagnosis (ranging from 1984 to 2006) and the virulence markers, controlling for gender, exposure category, age, and CD4 count at entry. The decline slope of the CD4 count and the viral setpoint both showed a highly significant correlation with the date of diagnosis, pointing in the direction of increasing virulence. A detailed analysis of risk groups revealed that the epidemic among intravenous drug users started with an apparently less virulent virus but experienced the strongest trend towards a steeper CD4 decline among the major exposure categories. While our study did not allow us to exclude the effect of potential time trends in host factors, our findings are consistent with the hypothesis of increasing HIV virulence. Importantly, the use of an established methodology allowed for a comparison with earlier results, which confirmed that genuine differences exist in the time trends of HIV virulence between different epidemics. We thus conclude that there is no single global trend of HIV virulence, and results obtained in one epidemic cannot be extrapolated to others. Comparison of discordant patterns between risk groups and epidemics hints at a converging trend, which might indicate that an optimal level of virulence exists for the virus.
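
    As a rough illustration of the regression approach described in this abstract, the sketch below fits an ordinary least squares model of the CD4 decline slope on the date of diagnosis while adjusting for the listed covariates. The data file and all column names (cohort.csv, cd4_slope, diagnosis_year, and so on) are hypothetical placeholders, not taken from the study.

```python
# Minimal sketch, assuming a tidy table with one row per untreated patient.
# The coefficient on diagnosis_year estimates the calendar-time trend in the
# virulence marker (here, the CD4 decline slope).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("cohort.csv")  # hypothetical input file

model = smf.ols(
    "cd4_slope ~ diagnosis_year + C(gender) + C(exposure_category) "
    "+ age_at_diagnosis + cd4_at_entry",
    data=df,
).fit()

print(model.summary())  # inspect the diagnosis_year coefficient and its p-value
```

    The same formula can be refit with the viral setpoint as the response, or restricted to a single exposure category, to mimic the risk-group comparisons mentioned above.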

    Monitoring the early signs of cognitive decline in elderly by computer games: an MRI study

    BACKGROUND: It is anticipated that current and future preventive therapies will likely be more effective in the early stages of dementia, when everyday functioning is not yet affected. Accordingly, the early identification of people at risk is particularly important. In most cases, by the time subjects visit an expert and are examined with neuropsychological tests, the disease has already developed. In contrast, cognitive games are played by healthy, well-functioning elderly people, precisely the subjects who should be monitored for early signs. Further advantages of cognitive games are their accessibility and their cost-effectiveness. PURPOSE: The aim of the investigation was to show that computer games can help to identify those who are at risk. To validate the games, an analysis was completed that measured the correlations between results of the 'Find the Pairs' memory game and the volumes of the temporal brain regions previously found to be good predictors of later cognitive decline. PARTICIPANTS AND METHODS: 34 healthy elderly subjects were enrolled in the study. The volumes of the cerebral structures were measured by MRI. Cortical reconstruction and volumetric segmentation were performed with FreeSurfer. RESULTS: There was a correlation between the number of attempts and the time required to complete the memory game and the volumes of the entorhinal cortex, the temporal pole, and the hippocampus. There was also a correlation between the results of the Paired Associates Learning (PAL) test and the memory game. CONCLUSIONS: The results support the initial hypothesis that healthy elderly subjects achieving lower scores in the memory game have an increased level of atrophy in the temporal brain structures and show decreased performance in the PAL test. Based on these results, it can be concluded that memory games may be useful in early screening for cognitive decline.
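
    A hedged sketch of the kind of correlation analysis this abstract describes: relating memory-game performance (number of attempts, completion time) to MRI-derived volumes of temporal structures. The file name and column names (subjects.csv, game_attempts, hippocampus_volume, etc.) are hypothetical; the study's actual pipeline and variable definitions are not reproduced here.

```python
# Minimal sketch, assuming one row per participant with game metrics and
# FreeSurfer-derived regional volumes already merged into a single table.
import pandas as pd
from scipy.stats import pearsonr

df = pd.read_csv("subjects.csv")  # hypothetical input file

game_metrics = ["game_attempts", "game_time_s"]
volumes = ["entorhinal_volume", "temporal_pole_volume", "hippocampus_volume"]

# Pairwise Pearson correlations between game performance and regional volumes.
for metric in game_metrics:
    for vol in volumes:
        r, p = pearsonr(df[metric], df[vol])
        print(f"{metric} vs {vol}: r = {r:.2f}, p = {p:.3f}")
```

    With only 34 subjects, any such correlations would typically also be checked with a rank-based measure (e.g. Spearman) and corrected for multiple comparisons.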