Mechanisms underlying selecting objects for action
We assessed the factors that affect the selection of objects for action, focusing on the role of action knowledge and its modulation by distracters. Fourteen neuropsychological patients and 10 healthy age-matched controls selected pairs of objects commonly used together from among distracters in two contexts: with real objects and with pictures of the same objects presented sequentially on a computer screen. Across both tasks, semantically related distracters led to slower responses and more errors than unrelated distracters, and the object actively used for the action was selected before the object that would be passively held during the action. We identified a sub-group of patients (N = 6) whose accuracy was 2 SD below the controls' performance in the real-object task. Interestingly, these impaired patients were more affected by the presence of unrelated distracters during both tasks than intact patients and healthy controls. Notably, the impaired patients had lesions to left parietal, right anterior temporal and bilateral pre-motor regions. We conclude that: (1) motor procedures guide object selection for action; (2) semantic knowledge affects action-based selection; (3) impaired action decisions are associated with an inability to ignore distracting information; and (4) lesions to either the dorsal or ventral visual stream can lead to deficits in making action decisions. Overall, the data indicate that impairments in everyday tasks can be evaluated using a simulated computer task. The implications for rehabilitation are discussed.
The efficacy of a task model approach to ADL rehabilitation in stroke apraxia and action disorganisation syndrome: A randomised controlled trial
BACKGROUND: Apraxia and action disorganisation syndrome (AADS) after stroke can disrupt activities of daily living (ADL). Occupational therapy has been effective in improving ADL performance; however, the inclusion of multiple tasks means it is unclear which therapy elements contribute to the improvement. We evaluated the efficacy of a task model approach to ADL rehabilitation, comparing training in making a cup of tea with a stepping-training control condition. METHODS: Of the 29 stroke survivors with AADS who participated in this cross-over randomised controlled feasibility trial, 25 were included in the analysis [44% female; mean (SD) age = 71.1 (7.8) years; years post-stroke = 4.6 (3.3)]. Participants attended five 1-hour weekly tea-making training sessions in which progress was monitored and feedback given using a computer-based system that implemented a Markov Decision Process (MDP) task model. In a control condition, participants received five 1-hour weekly stepping sessions. RESULTS: Compared to stepping training, tea-making training reduced errors across 4 different tea types. The time taken to make a cup of tea was also reduced, so the improvement in accuracy was not due to a speed-accuracy trade-off. No improvement linked to tea-making training was evident in a complex tea preparation task (making two different cups of tea simultaneously), indicating a lack of generalisation of the training. CONCLUSIONS: The clearly specified but flexible training protocol, together with information on the distribution of errors, provides pointers for further refinement of task model approaches to ADL rehabilitation. It is recommended that the approach be tested under errorless learning conditions with more impaired patients in future research. TRIAL REGISTRATION: Retrospectively registered at ClinicalTrials.gov on 5th August 2019 [NCT04044911] https://clinicaltrials.gov/ct2/show/NCT04044911?term=Cogwatch&rank=
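The MDP task model used for monitoring belongs to the CogWatch system described above; as a rough sketch of the underlying idea, the task can be treated as a set of states, each with a set of valid actions, and any observed action outside the valid set flagged as an error. The sketch below is a deterministic simplification (no transition probabilities or rewards), and all state and action names are illustrative, not taken from the actual system.

```python
# Hypothetical, simplified tea-making task model: states are milestones,
# each listing the actions valid from it. This is a deterministic
# simplification of an MDP-based monitor; names are illustrative.
VALID_ACTIONS = {
    "start":         {"fill_kettle"},
    "kettle_filled": {"boil_kettle"},
    "water_boiled":  {"add_teabag", "pour_water"},
    "teabag_in_cup": {"pour_water"},
    "water_poured":  {"add_milk", "done"},
}

TRANSITIONS = {
    ("start", "fill_kettle"): "kettle_filled",
    ("kettle_filled", "boil_kettle"): "water_boiled",
    ("water_boiled", "add_teabag"): "teabag_in_cup",
    ("water_boiled", "pour_water"): "water_poured",
    ("teabag_in_cup", "pour_water"): "water_poured",
    ("water_poured", "add_milk"): "water_poured",
    ("water_poured", "done"): "finished",
}

def monitor(actions):
    """Replay observed actions; return (state, action) pairs for errors."""
    state, errors = "start", []
    for act in actions:
        if act in VALID_ACTIONS.get(state, set()):
            state = TRANSITIONS[(state, act)]
        else:
            errors.append((state, act))  # invalid step: record, stay put
    return errors
```

A correct sequence produces an empty error list, while, for example, trying to boil the kettle before filling it is recorded as an error, which is the kind of step-level feedback signal a task model can provide.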
Altered hippocampal functional connectivity patterns in patients with cognitive impairments following ischaemic stroke: a resting-state fMRI study
Background: Ischaemic stroke with cognitive impairment is a considerable risk factor for developing dementia. Identifying imaging markers of cognitive impairment following ischaemic stroke will help to develop prevention strategies against post-stroke dementia. Methods: Here, we investigated the hippocampal functional connectivity (FC) pattern following ischaemic stroke using resting-state fMRI (rs-fMRI). Thirty-three cognitively impaired patients after ischaemic stroke and sixteen age-matched controls with no known history of neurological disorder were recruited for the study. Importantly, no patient had a direct ischaemic insult to the hippocampus on examination of brain imaging. Seven subfields of the hippocampus were used as seed regions for the FC analyses. Results: Across all hippocampal subfields, FC with the inferior parietal lobe (IPL) was reduced in patients compared with healthy controls. This decreased FC included both the supramarginal gyrus and the angular gyrus. The FC of the hippocampal subfields with the cerebellum was increased. Importantly, the degree of altered FC between the hippocampal subfields and the IPL was associated with the patients' impaired memory function. Conclusion: Our results demonstrate that decreased hippocampal-IPL connectivity was associated with cognitive impairment in patients with ischaemic stroke. These findings provide novel insights into the role of the hippocampus in cognitive impairment following ischaemic stroke.
Multicenter European Prevalence Study of Neurocognitive Impairment and Associated Factors in HIV Positive Patients
We conducted a cross-sectional study in 448 HIV-positive patients attending five European outpatient clinics to determine the prevalence of, and factors associated with, neurocognitive impairment (NCI) using computerized and pen-and-paper neuropsychological tests. NCI was defined as a normalized Z score ≤ -1 in at least 2 out of 5 cognitive domains. Participants' mean age was 45.8 years; 84% were male; 87% white; 56% university educated; median CD4 count was 550 cells/mm³; 89% were on antiretroviral therapy. 156 (35%) participants had NCI, among whom 26 (17%; 5.8% overall) reported a decline in activities of daily living. The prevalence of NCI was lower in those always able to afford basic needs (adjusted prevalence ratio [aPR] 0.71, 95% confidence interval [CI] 0.54-0.94) or with a university education (aPR 0.72, 95% CI 0.54-0.97), and higher in those with severe depressive symptoms (aPR 1.53, 95% CI 1.09-2.14) or a significant comorbid condition (aPR 1.40, 95% CI 1.03-1.90).
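The NCI criterion above (normalized Z score ≤ -1 in at least 2 of 5 cognitive domains) is mechanical enough to state in a few lines. The domain names and score values in the sketch below are invented for illustration; only the thresholds come from the definition.

```python
def has_nci(domain_z_scores, threshold=-1.0, min_domains=2):
    """Flag neurocognitive impairment per the study's definition:
    a normalized Z score <= -1 in at least 2 cognitive domains.
    `domain_z_scores` maps domain name -> Z score (names illustrative)."""
    impaired = [z for z in domain_z_scores.values() if z <= threshold]
    return len(impaired) >= min_domains

# Illustrative profiles (domain names are assumptions, not the study's):
# two domains at or below -1 -> NCI; only one -> no NCI.
has_nci({"memory": -1.2, "attention": -1.5, "speed": 0.3,
         "executive": 0.1, "motor": -0.4})
```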
Level of agreement between frequently used cardiovascular risk calculators in people living with HIV
Objectives
The aim of the study was to describe agreement between the QRISK2, Framingham and Data Collection on Adverse Events of Anti-HIV Drugs (D:A:D) cardiovascular disease (CVD) risk calculators in a large UK study of people living with HIV (PLWH).
Methods
PLWH enrolled in the Pharmacokinetic and Clinical Observations in People over Fifty (POPPY) study without a prior CVD event were included in this study. QRISK2, Framingham CVD and the full and reduced D:A:D CVD scores were calculated; participants were stratified into 'low' (< 10%), 'intermediate' (10–20%) and 'high' (> 20%) categories for each. Agreement between scores was assessed using weighted kappas and Bland–Altman plots.
Results
The 730 included participants were predominantly male (636; 87.1%) and of white ethnicity (645; 88.5%), with a median age of 53 [interquartile range (IQR) 49–59] years. The median calculated 10-year CVD risk was 11.9% (IQR 6.8–18.4%), 8.9% (IQR 4.6–15.0%), 8.5% (IQR 4.8–14.6%) and 6.9% (IQR 4.1–11.1%) when using the Framingham, QRISK2, and full and reduced D:A:D scores, respectively. Agreement between the different scores was generally moderate, with the highest level of agreement being between the Framingham and QRISK2 scores (weighted kappa = 0.65), but with most other kappa coefficients in the 0.50–0.60 range.
Conclusions
Estimates of predicted 10-year CVD risk obtained with commonly used CVD risk prediction tools demonstrate, in general, only moderate agreement among PLWH in the UK. While further validation with clinical endpoints is required, our findings suggest that care should be taken when interpreting any score alone.
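Weighted kappa, the agreement statistic reported above, credits partial agreement by penalizing disagreements according to how far apart the rating categories are. A minimal self-contained sketch follows; the coding of categories as integers 0..k-1 and the linear weighting scheme are assumptions of the sketch, not details from the study.

```python
from collections import Counter

def weighted_kappa(a, b, k, linear=True):
    """Weighted kappa for two raters assigning n items to k ordered
    categories coded 0..k-1. Linear weights by default (quadratic if
    linear=False). Returns 1 for perfect agreement, ~0 for chance."""
    n = len(a)
    obs = Counter(zip(a, b))          # observed joint counts
    pa, pb = Counter(a), Counter(b)   # marginal counts per rater

    def w(i, j):                      # agreement weight for a (i, j) pair
        d = abs(i - j) / (k - 1)
        return 1 - d if linear else 1 - d * d

    po = sum(w(i, j) * c / n for (i, j), c in obs.items())
    pe = sum(w(i, j) * (pa[i] / n) * (pb[j] / n)
             for i in range(k) for j in range(k))
    return (po - pe) / (1 - pe)
```

Identical ratings give a kappa of exactly 1, and ratings that agree only as often as the marginals predict give a kappa near 0, which is why values in the 0.50–0.60 range are read as moderate agreement.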
Depression, lifestyle factors and cognitive function in people living with HIV and comparable HIV-negative controls
We investigated whether differences in cognitive performance between people living with HIV (PLWH) and comparable HIV-negative people were mediated or moderated by depressive symptoms and lifestyle factors.
METHODS:
A cross-sectional study of 637 'older' PLWH aged ≥ 50 years, 340 'younger' PLWH aged < 50 years and 276 demographically matched HIV-negative controls aged ≥ 50 years enrolled in the Pharmacokinetic and Clinical Observations in People over Fifty (POPPY) study was performed. Cognitive function was assessed using a computerized battery (CogState). Scores were standardized into Z-scores [mean = 0; standard deviation (SD) = 1] and averaged to obtain a global Z-score. Depressive symptoms were evaluated via the Patient Health Questionnaire (PHQ-9). Differences between the three groups and the effects of depression, sociodemographic factors and lifestyle factors on cognitive performance were evaluated using median regression. All analyses accounted for age, gender, ethnicity and level of education.
RESULTS:
After adjustment for sociodemographic factors, older and younger PLWH had poorer overall cognitive scores than older HIV-negative controls (P < 0.001 and P = 0.006, respectively). Moderate or severe depressive symptoms were more prevalent in both older (27%; P < 0.001) and younger (21%; P < 0.001) PLWH compared with controls (8%). Depressive symptoms (P < 0.001) and use of hashish (P = 0.01) were associated with lower cognitive function; alcohol consumption (P = 0.02) was associated with better cognitive scores. After further adjustment for these factors, the difference between older PLWH and HIV-negative controls was no longer significant (P = 0.08), while that between younger PLWH and older HIV-negative controls remained significant (P = 0.01).
CONCLUSIONS:
Poorer cognitive performance in PLWH compared with HIV-negative individuals was, in part, mediated by the greater prevalence of depressive symptoms and recreational drug use reported by PLWH.
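The global Z-score described in the Methods above is a straightforward computation: each test score is standardized against a reference mean and SD, and the resulting per-test Z scores are averaged. The norms and test names in this sketch are invented for illustration and are not the CogState norms used in the study.

```python
from statistics import mean

def global_z(raw_scores_by_test, norms):
    """Standardize each test score against a reference mean/SD, then
    average into a single global Z score. `norms` maps test name ->
    (reference_mean, reference_sd); all values here are illustrative."""
    zs = [(raw - norms[t][0]) / norms[t][1]
          for t, raw in raw_scores_by_test.items()]
    return mean(zs)

# Illustrative use: one test 1 SD above its norm, one 1 SD below,
# so the global Z score is 0.
norms = {"test_a": (10.0, 2.0), "test_b": (20.0, 4.0)}
global_z({"test_a": 12.0, "test_b": 16.0}, norms)
```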
Validation of a novel multivariate method of defining HIV-associated cognitive impairment
Background. The optimum method of defining cognitive impairment in virally suppressed people living with HIV is unknown. We evaluated the relationships between cognitive impairment, including using a novel multivariate method (NMM), patient-reported outcome measures (PROMs), and neuroimaging markers of brain structure across 3 cohorts.
Methods. Differences in the prevalence of cognitive impairment, PROMs, and neuroimaging data from the COBRA, CHARTER, and POPPY cohorts (total n = 908) were determined between HIV-positive participants with and without cognitive impairment defined using the HIV-associated neurocognitive disorders (HAND), global deficit score (GDS), and NMM criteria.
Results. The prevalence of cognitive impairment varied by up to 27% between methods used to define impairment (eg, 48% for HAND vs 21% for NMM in the CHARTER study). Associations between objective cognitive impairment and subjective cognitive complaints were generally weak. Physical and mental health summary scores (SF-36) were lowest for NMM-defined impairment (P < .05). There were no differences in brain volumes or cortical thickness between participants with and without cognitive impairment defined using the HAND and GDS measures. In contrast, those identified with cognitive impairment by the NMM had reduced mean cortical thickness in both hemispheres (P < .05), as well as smaller brain volumes (P < .01). The associations with measures of white matter microstructure and brain-predicted age were generally weaker.
Conclusion. Different methods of defining cognitive impairment identify different people with varying symptomatology and measures of brain injury. Overall, NMM-defined impairment was associated with most neuroimaging abnormalities and poorer self-reported health status. This may be due to the statistical advantage of using a multivariate approach.
Selecting object pairs for action: is the active object always first?
Perception is linked to action via two routes: a direct route based on affordance information in the environment, and an indirect route based on semantic knowledge about objects. The present study explored the factors modulating the recruitment of the two routes, in particular the factors affecting the selection of paired objects. In Experiment 1, we presented real objects among semantically related or unrelated distracters. Participants had to select two objects that can interact. Selection times were affected by the presence of distracters, but not by the semantic relation of the objects to the distracters. Furthermore, participants first selected the active object (e.g. teaspoon) with their right hand, followed by the passive object (e.g. mug), often with their left hand. In Experiment 2, we presented pictures of the same objects with no hand grip, a congruent hand grip, or an incongruent hand grip. Participants had to decide whether the two objects can interact. Action decisions were faster when the presentation of the active object preceded that of the passive object, and when the grip was congruent. Interestingly, participants were slower when the objects were semantically but not functionally related; this effect increased with congruently gripped objects. Our data show that action decisions in the presence of strong affordance cues (real objects, pictures of congruently gripped objects) rely on sensory-motor representations, supporting the direct route from perception to action that bypasses semantic knowledge. However, in the case of weak affordance cues (pictures), semantic information interfered with action decisions, indicating that semantic knowledge impacts action decisions. The data support the dual-route account of perception-to-action.