
    Cognitive function and drivers of cognitive impairment in a European and a Korean cohort of people living with HIV

    Although cognitive impairments are still prevalent in the current antiretroviral therapy era, limited investigations have compared the prevalence of cognitive disorder in people living with HIV (PLWH) and its determinants across regions and ethnicities. We compared cognitive performance across six domains, using comparable batteries, in 134 PLWH aged ≥45 years from the COBRA study (Netherlands, UK) and 194 PLWH aged ≥18 years from the NeuroAIDS Project (South Korea). Cognitive scores were standardized and averaged to obtain domain and global T-scores. Associations with global T-scores were evaluated using multivariable regression, and the ability of individual tests to detect cognitive impairment (global T-score ≤45) was assessed using the area under the receiver operating characteristic curve (AUROC). The median (interquartile range) age of participants was 56 (51, 62) years in COBRA (88% white ethnicity, 93% male) and 45 (37, 52) years in NeuroAIDS (100% Korean ethnicity, 94% male). The rate of cognitive impairment was 18.8% and 18.0%, respectively (p = 0.86). In COBRA, Black-African ethnicity was the factor most strongly associated with cognitive function (11.1 [7.7, 14.5] lower scores vs. white ethnicity, p < 0.01), whereas in NeuroAIDS, age (0.6 [0.1, 1.3] per 10 years, p < 0.01) and education (0.7 [0.5, 0.9] per year, p < 0.01) were significantly associated with cognitive function, with anemia showing only a weak association (−1.2 [−2.6, 0.3], p = 0.12). The cognitive domains most strongly associated with cognitive impairment were attention (AUROC = 0.86) and executive function (AUROC = 0.87) in COBRA, and processing speed (AUROC = 0.80), motor function (AUROC = 0.78) and language (AUROC = 0.78) in NeuroAIDS. Two cohorts of PLWH from different geographical regions showed similar rates of cognitive impairment but different risk factors and cognitive profiles of impairment.
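    The scoring pipeline described above (standardize test scores, average into domain and global T-scores, define impairment as a global T-score ≤45, and rate each domain with an AUROC) can be sketched as follows. This is an illustrative outline only: the column names, the simple z-score standardization, and the synthetic data are assumptions, not the study's actual battery or normative adjustments.

```python
# Minimal sketch of the T-score / AUROC pipeline described in the abstract.
# Assumptions (not from the paper): column names, equal weighting of tests
# within a domain, and z-scored inputs in place of demographic norms.
import numpy as np
import pandas as pd
from sklearn.metrics import roc_auc_score

def to_t_scores(z):
    """Convert standardized z-scores to T-scores (mean 50, SD 10)."""
    return 50 + 10 * z

# Hypothetical data: one row per participant, one z-scored result per test.
rng = np.random.default_rng(0)
df = pd.DataFrame(rng.normal(size=(200, 4)),
                  columns=["attention_1", "attention_2", "speed_1", "speed_2"])

# Assumed domain structure (the real battery covers six domains).
domains = {"attention": ["attention_1", "attention_2"],
           "processing_speed": ["speed_1", "speed_2"]}

# Domain T-score = mean T-score of that domain's tests;
# global T-score = mean of the domain T-scores.
for name, tests in domains.items():
    df[f"T_{name}"] = to_t_scores(df[tests]).mean(axis=1)
df["T_global"] = df[[f"T_{d}" for d in domains]].mean(axis=1)

# Impairment defined as a global T-score of 45 or below, as in the abstract.
impaired = (df["T_global"] <= 45).astype(int)

# AUROC of one domain score for detecting global impairment
# (lower T-scores indicate worse performance, hence the sign flip).
auroc = roc_auc_score(impaired, -df["T_attention"])
print(f"Attention domain AUROC: {auroc:.2f}")
```

    In the study itself, raw scores would be standardized against appropriate norms before averaging; scikit-learn's roc_auc_score is used here only to show how a single domain score is evaluated against the binary impairment label.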

    Second Language Processing Shows Increased Native-Like Neural Responses after Months of No Exposure

    Although learning a second language (L2) as an adult is notoriously difficult, research has shown that adults can indeed attain native language-like brain processing and high proficiency levels. However, it is important to then retain what has been attained, even in the absence of continued exposure to the L2—particularly since periods of minimal or no L2 exposure are common. This event-related potential (ERP) study of an artificial language tested performance and neural processing following a substantial period of no exposure. Adults learned to speak and comprehend the artificial language to high proficiency with either explicit (classroom-like) or implicit (immersion-like) training, and then underwent several months of no exposure to the language. Surprisingly, proficiency did not decrease during this delay. Instead, it remained unchanged, and there was an increase in native-like neural processing of syntax, as evidenced by several ERP changes—including earlier, more reliable, and more left-lateralized anterior negativities, and more robust P600s, in response to word-order violations. Moreover, both the explicitly and implicitly trained groups showed increased native-like ERP patterns over the delay, indicating that such changes can hold independently of L2 training type. The results demonstrate that substantial periods with no L2 exposure are not necessarily detrimental; rather, benefits may ensue from such periods even in the absence of exposure. Interestingly, both before and after the delay the implicitly trained group showed more native-like processing than the explicitly trained group, indicating that type of training also affects the attainment of native-like processing in the brain. Overall, the findings may be largely explained by a combination of forgetting and consolidation in declarative and procedural memory, on which L2 grammar learning appears to depend. The study has a range of implications and suggests a research program with potentially important consequences for second language acquisition and related fields.

    Validation of a Novel Multivariate Method of Defining HIV-Associated Cognitive Impairment

    Background. The optimum method of defining cognitive impairment in virally suppressed people living with HIV is unknown. We evaluated the relationships between cognitive impairment, including using a novel multivariate method (NMM), patient-reported outcome measures (PROMs), and neuroimaging markers of brain structure across 3 cohorts. Methods. Differences in the prevalence of cognitive impairment, PROMs, and neuroimaging data from the COBRA, CHARTER, and POPPY cohorts (total n = 908) were determined between HIV-positive participants with and without cognitive impairment defined using the HIV-associated neurocognitive disorders (HAND), global deficit score (GDS), and NMM criteria. Results. The prevalence of cognitive impairment varied by up to 27% between methods used to define impairment (eg, 48% for HAND vs 21% for NMM in the CHARTER study). Associations between objective cognitive impairment and subjective cognitive complaints generally were weak. Physical and mental health summary scores (SF-36) were lowest for NMM-defined impairment (P < .05). There were no differences in brain volumes or cortical thickness between participants with and without cognitive impairment defined using the HAND and GDS measures. In contrast, those identified with cognitive impairment by the NMM had reduced mean cortical thickness in both hemispheres (P < .05), as well as smaller brain volumes (P < .01). The associations with measures of white matter microstructure and brain-predicted age generally were weaker. Conclusion. Different methods of defining cognitive impairment identify different people with varying symptomatology and measures of brain injury. Overall, NMM-defined impairment was associated with most neuroimaging abnormalities and poorer self-reported health status. This may be due to the statistical advantage of using a multivariate approach.
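    The NMM itself is not specified in the abstract, but one of the comparison criteria, the global deficit score (GDS), follows a well-known recipe: map each test T-score to a 0–5 deficit score, average across tests, and classify the participant as impaired when the mean is at least 0.5. A minimal sketch is below; the function names and example T-scores are illustrative, and the bin boundaries follow the conventional GDS mapping rather than anything stated in this abstract.

```python
# Illustrative sketch of the global deficit score (GDS) criterion compared
# in the paper; data and function names are hypothetical.
import numpy as np

def deficit_score(t_score):
    """Map a test T-score to a 0-5 deficit score (higher = more impaired),
    using the conventional GDS bins (T >= 40 counts as no deficit)."""
    if t_score >= 40: return 0
    if t_score >= 35: return 1
    if t_score >= 30: return 2
    if t_score >= 25: return 3
    if t_score >= 20: return 4
    return 5

def gds_impairment(t_scores, cutoff=0.5):
    """Return (impaired, gds): GDS is the mean deficit score across tests,
    and impairment is declared when GDS >= cutoff."""
    gds = float(np.mean([deficit_score(t) for t in t_scores]))
    return gds >= cutoff, gds

# Example: one hypothetical participant's T-scores across a battery.
impaired, gds = gds_impairment([52, 38, 44, 29, 47, 41])
print(f"GDS = {gds:.2f}, impaired = {impaired}")
```

    Because HAND, GDS, and the NMM each draw the impairment boundary differently, the same cohort can yield prevalence estimates as far apart as the 48% vs 21% reported for CHARTER above.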

    Hyperaggregating effect of hydroxyethyl starch components and University of Wisconsin solution on human red blood cells: a risk of impaired graft perfusion in organ procurement?

    BACKGROUND: The standard solution used for the procurement and preservation of most organs is the University of Wisconsin (UW) solution. Despite its superiority over other cold storage solutions, the inclusion of hydroxyethyl starch (HES) as one of its components has been both advocated and disputed. This study determined whether HES had any effect on red blood cell (RBC) aggregability and correlated aggregation parameters with HES molecular weight. METHODS: Human RBC aggregability and deformability were investigated in vitro, at 4 °C, with a laser-assisted optical rotation cell analyzer. The study of RBC aggregation in a binary HES-HES system gave an indication of the nature of HES-RBC interactions. Bright-field microscopy and atomic force microscopy were used to characterize the size and form of the aggregates. RESULTS: High molecular weight HES and UW solution had a potent hyperaggregating effect, whereas low molecular weight HES had a hypoaggregating effect on RBCs. RBC aggregates were large, and their resistance to dissociation by flow-induced shear stress was high. CONCLUSION: The authors' in vitro experiments conclusively showed that the physiologic function of RBCs to form aggregates is significantly affected in the presence of HES. The use of high molecular weight HES in UW solution accounts for extended and accelerated aggregation of erythrocytes, which may result in stasis of blood and incomplete washout of donor organs before transplantation.