Promising developments in neuropsychological approaches for the detection of preclinical Alzheimer’s disease: a selective review
Recently published guidelines suggest that the most opportune time to treat individuals with Alzheimer’s disease is during the preclinical phase of the disease. This is a phase when individuals are defined as clinically normal but exhibit evidence of amyloidosis, neurodegeneration, and subtle cognitive/behavioral decline. While our standard cognitive tests are useful for detecting cognitive decline at the stage of mild cognitive impairment, they were not designed to detect the subtle cognitive variations associated with this biomarker stage of preclinical Alzheimer’s disease. However, neuropsychologists are attempting to meet this challenge by designing newer cognitive measures and questionnaires derived from translational efforts in neuroimaging, cognitive neuroscience, and clinical/experimental neuropsychology. This review is a selective summary of several novel, potentially promising approaches that are being explored for detecting early cognitive evidence of preclinical Alzheimer’s disease in presymptomatic individuals.
Conceptual and Measurement Challenges in Research on Cognitive Reserve
Cognitive reserve, broadly conceived, encompasses aspects of brain structure and function that optimize individual performance in the presence of injury or pathology. Reserve is defined as a feature of brain structure and/or function that modifies the relationship between injury or pathology and performance on neuropsychological tasks or clinical outcomes. Reserve is challenging to study for two reasons. First, reserve is a hypothetical construct, and direct measures of it are not available; proxy variables and latent variable models are used to operationalize it. Second, in vivo measures of neuronal pathology are not widely available. It is challenging to develop and test models involving a risk factor (injury or pathology), a moderator (reserve), and an outcome (performance or clinical status) when neither the risk factor nor the moderator is measured directly. We discuss approaches for quantifying reserve with latent variable models, with emphasis on their application in the analysis of data from observational studies. Increasingly, latent variable models are used to generate composites of cognitive reserve based on multiple proxies. We review the theoretical and ontological status of latent variable modeling approaches to cognitive reserve and suggest research strategies for advancing the field.
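The risk factor/moderator/outcome structure described here corresponds to an interaction term in a regression model. The sketch below illustrates that moderation logic on simulated data; the variable names, effect sizes, and the use of a single reserve proxy (rather than a latent composite) are illustrative assumptions, not the authors' analysis.

```python
# A minimal sketch of a moderation model, assuming simulated data:
# reserve is hypothesized to modify (moderate) the effect of pathology
# on cognitive performance, i.e., a pathology x reserve interaction.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
pathology = rng.normal(size=n)   # pathology burden (standardized, hypothetical)
reserve = rng.normal(size=n)     # reserve proxy or latent composite score
# Higher reserve weakens the pathology -> performance relationship.
performance = (-0.5 * pathology + 0.3 * reserve
               + 0.2 * pathology * reserve
               + rng.normal(scale=1.0, size=n))

df = pd.DataFrame({"pathology": pathology, "reserve": reserve,
                   "performance": performance})
# "pathology * reserve" expands to both main effects plus the interaction;
# the interaction coefficient is the formal test of moderation (reserve).
model = smf.ols("performance ~ pathology * reserve", data=df).fit()
print(model.summary())
```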
The Dissociation between Early and Late Selection in Older Adults
Older adults exhibit a reduced ability to ignore task-irrelevant stimuli; however, it remains to be determined where along the information processing stream the most salient age-associated changes occur. In the current study, ERPs provided an opportunity to determine whether age-related differences in processing task-irrelevant stimuli were uniform across information processing stages or disproportionately affected either early or late selection. ERPs were measured in young and old adults during a color-selective attention task in which participants responded to target letters in a specified color (attend condition) while ignoring letters in a different color (ignore condition). Old participants were matched to two groups of young participants on the basis of neuropsychological test performance: one using age-appropriate norms and the other using test scores not adjusted for age. There were no age-associated differences in the magnitude of early selection (attend–ignore), as indexed by the size of the anterior selection positivity and posterior selection negativity. During late selection, as indexed by P3b amplitude, both groups of young participants generated neural responses to target letters under the attend versus ignore conditions that were highly differentiated. In striking contrast, old participants generated a P3b to target letters with no reliable differences between conditions. Individuals who were slow to initiate early selection appeared to be less successful at executing late selection. Despite relative preservation of the operations of early selection, processing delays may lead older participants to allocate excessive resources to task-irrelevant stimuli during late selection.
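The attend–ignore contrast used throughout this study is a difference wave between condition-average ERPs, with late selection indexed by amplitude in a P3b window. The following numpy sketch shows that computation on placeholder arrays; the sampling rate, epoch boundaries, trial counts, and 300–600 ms window are assumptions for illustration only.

```python
# A hedged numpy sketch of an attend - ignore ERP contrast: the selection
# effect is the difference wave between condition averages, and P3b
# amplitude is taken as the mean in a late (300-600 ms) window.
import numpy as np

fs = 250                                   # sampling rate in Hz (assumed)
times = np.arange(-0.2, 1.0, 1 / fs)       # epoch from -200 to 1000 ms

rng = np.random.default_rng(0)
# Hypothetical single-trial epochs (trials x samples) at one electrode.
attend_trials = rng.standard_normal((120, times.size))
ignore_trials = rng.standard_normal((120, times.size))

attend_erp = attend_trials.mean(axis=0)    # condition-average ERPs
ignore_erp = ignore_trials.mean(axis=0)
difference_wave = attend_erp - ignore_erp  # attend - ignore selection effect

window = (times >= 0.3) & (times <= 0.6)   # assumed P3b latency window
p3b_amplitude = difference_wave[window].mean()
print(f"P3b (attend - ignore) amplitude: {p3b_amplitude:.3f} (a.u.)")
```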
Clinical and Economic Characteristics of Milestones Along the Continuum of Alzheimer's Disease: Transforming Functional Scores into Levels of Dependence
Background: Because Alzheimer’s disease (AD) is characterized by a gradual decline, it can be difficult to identify distinct clinical milestones that signal disease advancement. Adapting a functional scale may be a useful way of staging disease progression that is more informative for healthcare systems. Objectives: To adapt functional scale scores into discrete levels of dependence as a way of staging disease progression that is more informative to care providers and stakeholders who rely on the functional impact of diseases to determine access to supportive services and interventions. Design: Analysis of data from the GERAS study. Setting: GERAS is an 18-month prospective, multicenter, naturalistic, observational cohort study reflecting the routine care of patients with AD in France, Germany, and the United Kingdom. Participants: Data were from baseline results of 1497 community-living patients, aged ≥55 years, diagnosed with probable AD and their caregivers. Measurements: We used data from the Alzheimer’s Disease Cooperative Study Activities of Daily Living Inventory (ADCS-ADL) and mapped items onto established categories of functional dependence, validated using clinical and economic measures. Cognitive function, behavioral symptoms, caregiver burden, and cost were assessed. Based on stages of functional dependence described by the Dependence Scale, individual ADCS-ADL items were used to approximate 6 dependence levels. Results: There was a significant relationship between assigned level of dependence derived from the ADCS-ADL score and cognitive severity category. As the assigned level of dependence increased, the associated clinical and economic indicators demonstrated a pattern of greater disease severity. Conclusions: This mapping provides initial support for dependence levels as appropriate interim clinical milestones that characterize the functional deficits associated with AD.
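The core idea of the paper is discretizing a continuous functional score into ordered dependence levels. The sketch below shows the general shape of such a mapping; the cut-points and the use of the ADCS-ADL total (rather than individual items, as in the GERAS analysis) are hypothetical placeholders, not the published mapping.

```python
# A purely illustrative mapping from a functional score to discrete
# dependence levels. Cut-points are hypothetical, not the GERAS mapping.
def dependence_level(adcs_adl_total: int) -> int:
    """Map an ADCS-ADL total (0-78; higher = more independent) to a
    coarse dependence level (0 = independent ... 5 = fully dependent)."""
    cut_points = [66, 54, 40, 26, 12]  # hypothetical thresholds, descending
    for level, cut in enumerate(cut_points):
        if adcs_adl_total >= cut:
            return level
    return 5

print(dependence_level(70))  # -> 0 (largely independent)
print(dependence_level(30))  # -> 3 (moderate dependence)
```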
Cognitive status impacts age-related changes in attention to novel and target events in normal adults.
In this study, the authors investigated the relationship between the cognitive status of normal adults and age-related changes in attention to novel and target events. Old, middle-aged, and young subjects, divided into cognitively high and cognitively average performing groups, viewed repetitive standard stimuli, infrequent target stimuli, and unique novel visual stimuli. Subjects controlled viewing duration by a button press that led to the onset of the next stimulus. They also responded to targets by pressing a foot pedal. The amount of time spent looking at different kinds of stimuli served as a measure of visual attention and exploratory activity. Cognitively high performers spent more time viewing novel stimuli than cognitively average performers. The magnitude of the difference between cognitively high and cognitively average performing groups was largest among old subjects. Cognitively average performers had slower and less accurate responses to targets than cognitively high performers. The results provide strong evidence that the link between engagement by novelty and higher cognitive performance increases with age. Moreover, the results support the notion that there are different patterns of normal cognitive aging and the need to identify the factors that influence them.
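The key behavioral contrast is a group difference (high minus average) in novel-stimulus viewing time, computed within each age group. The pandas sketch below illustrates that tabulation; all numbers are fabricated placeholders, not study data.

```python
# An illustrative pandas sketch: mean viewing duration of novel stimuli
# by age group and cognitive status, and the high-minus-average gap
# within each age group. Values are hypothetical placeholders.
import pandas as pd

df = pd.DataFrame({
    "age_group": ["young", "young", "middle", "middle", "old", "old"],
    "cog_status": ["high", "average"] * 3,
    "novel_view_ms": [1450, 1380, 1500, 1360, 1620, 1300],  # hypothetical
})

means = df.pivot_table(index="age_group", columns="cog_status",
                       values="novel_view_ms")
means["high_minus_average"] = means["high"] - means["average"]
print(means)  # the gap is expected to be largest in the old group
```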
The influence of executive capacity on selective attention and subsequent processing
Recent investigations suggesting that selective attention (SA) depends on top-down control mechanisms lead to the expectation that individuals with high executive capacity (EC) would exhibit more robust neural indices of SA. This prediction was tested by using event-related potentials (ERPs) to examine differences in markers of information processing across 25 subjects divided into two groups based on high vs. average EC, as defined by neuropsychological test scores. Subjects performed an experimental task requiring SA to a specified color. Contrary to expectation, individuals with high and average EC did not differ in the size of ERP indices of SA: the anterior Selection Positivity (SP) and posterior Selection Negativity (SN). However, there were substantial differences between groups in markers of subsequent processing, including the anterior N2 (a measure of attentional control) and the P3a (an index of the orienting of attention). EC predicted speed of processing at both early and late attentional stages. Individuals with lower EC exhibited prolonged SN, P3a, and P3b latencies. However, the delays in carrying out SA operations did not account for subsequent delays in decision making or explain excessive orienting and reduced attentional control mechanisms in response to stimuli that should have been ignored. SN latency, P3 latency, and the size of the anterior N2 made independent contributions to the variance of EC. In summary, our findings suggest that current views regarding the relationship between top-down control mechanisms and SA may need refinement.
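The claim that SN latency, P3 latency, and anterior N2 size make "independent contributions to the variance of EC" is a multiple-regression statement. The sketch below illustrates that analysis on simulated data; the variable scales, coefficients, and noise levels are assumptions, not the study's values.

```python
# A minimal sketch of a multiple regression testing whether several ERP
# markers each contribute independently to executive capacity (EC).
# All data are simulated placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 25                                     # matches the study's sample size
df = pd.DataFrame({
    "sn_latency": rng.normal(200, 20, n),  # ms (assumed scale)
    "p3_latency": rng.normal(450, 40, n),  # ms (assumed scale)
    "n2_amplitude": rng.normal(-3, 1, n),  # microvolts (assumed scale)
})
df["ec"] = (-0.01 * df.sn_latency - 0.005 * df.p3_latency
            - 0.3 * df.n2_amplitude + rng.normal(0, 0.5, n))

# Each coefficient tests a contribution with the other markers held constant.
fit = smf.ols("ec ~ sn_latency + p3_latency + n2_amplitude", data=df).fit()
print(fit.summary())
```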
Frontal and Parietal Components of a Cerebral Network Mediating Voluntary Attention to Novel Events
Despite the important role that attending to novel events plays in human behavior, there is limited information about the neuroanatomical underpinnings of this vital activity. This study investigated the relative contributions of the frontal and posterior parietal lobes to the differential processing of novel and target stimuli under an experimental condition in which subjects actively directed attention to novel events. Event-related potentials were recorded from well-matched frontal patients, parietal patients, and non-brain-injured subjects who controlled their viewing duration (by button press) of line drawings that included a frequent, repetitive background stimulus, an infrequent target stimulus, and infrequent, novel visual stimuli. Subjects also responded to target stimuli by pressing a foot pedal. Damage to the frontal cortex resulted in a much greater disruption of response to novel stimuli than to designated targets. Frontal patients exhibited a widely distributed, profound reduction of the novelty P3 response and a marked diminution of the viewing duration of novel events. In contrast, damage to posterior parietal lobes was associated with a substantial reduction of both target P3 and novelty P3 amplitude; however, there was less disruption of the processing of novel than of target stimuli. We conclude that two nodes of the neuroanatomical network for responding to and processing novelty are the prefrontal and posterior parietal regions, which participate in the voluntary allocation of attention to novel events. Injury to this network is indexed by reduced novelty P3 amplitude, which is tightly associated with diminished attention to novel stimuli. The prefrontal cortex may serve as the central node in determining the allocation of attentional resources to novel events, whereas the posterior parietal lobe may provide the neural substrate for the dynamic process of updating one's internal model of the environment to take into account a novel event.
Age-related differences in enhancement and suppression of neural activity underlying selective attention in matched young and old adults
Selective attention reflects the top-down control of sensory processing that is mediated by enhancement or inhibition of neural activity. ERPs were used to investigate age-related differences in neural activity in an experiment examining selective attention to color under Attend and Ignore conditions, as well as under a Neutral condition in which color was task-irrelevant. We sought to determine whether differences in neural activity between old and young adult subjects were due to differences in age rather than executive capacity. Old subjects were matched to two groups of young subjects on the basis of neuropsychological test performance: one using age-appropriate norms and the other using test scores not adjusted for age. We found that old and young subject groups did not differ in the overall modulation of selective attention between Attend and Ignore conditions, as indexed by the size of the anterior Selection Positivity. However, in contrast to either young adult group, old subjects did not exhibit reduced neural activity under the Ignore relative to Neutral condition, but showed enhanced activity under the Attend condition. The onset and peak of the Selection Positivity occurred later for old than young subjects. In summary, older adults execute selective attention less efficiently than matched younger subjects, with slowed processing and failed suppression under the Ignore condition. Increased enhancement under the Attend condition may serve as a compensatory mechanism.
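With a Neutral baseline, the design separates two signed contrasts: enhancement (Attend minus Neutral) and suppression (Neutral minus Ignore). The short sketch below makes that arithmetic explicit; the amplitude values are placeholders, not reported results.

```python
# A short sketch of the enhancement/suppression logic: relative to a
# Neutral baseline, enhancement = Attend - Neutral and
# suppression = Neutral - Ignore. Amplitudes are hypothetical.
amp = {"attend": 2.4, "neutral": 1.1, "ignore": 1.0}  # mean SP amplitude, a.u.

enhancement = amp["attend"] - amp["neutral"]   # boost for the attended color
suppression = amp["neutral"] - amp["ignore"]   # inhibition of the ignored color
print(f"enhancement: {enhancement:+.2f}, suppression: {suppression:+.2f}")
# The reported pattern in older adults: intact or enlarged enhancement,
# but near-zero suppression (Ignore not reduced relative to Neutral).
```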
Increased Responsiveness to Novelty Is Associated with Successful Cognitive Aging
The animal literature suggests that exposure to more complex, novel environments promotes neurogenesis and cognitive performance in older animals. Studies in humans indicate that participation in intellectually stimulating activities may serve as a buffer against mental decline and help to sustain cognitive abilities. Here, we show that across old adults, increased responsiveness to novel events (as measured by viewing duration and the size of the P3 event-related potential) is strongly linked to better performance on neuropsychological tests, especially those involving attention/executive functions. Cognitively high performing old adults generate a larger P3 response to visual stimuli than cognitively average performing adults. These results suggest that cognitively high performing adults successfully manage the task by allocating more resources and that the increased size of their P3 component represents a beneficial compensatory mechanism rather than less efficient processing.
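The central result is a brain-behavior correlation: P3 responsiveness to novelty against neuropsychological performance across older adults. The sketch below illustrates that analysis; the sample size, effect size, and composite-score construction are simulated assumptions.

```python
# A small illustrative sketch correlating P3 amplitude to novel stimuli
# with a neuropsychological (attention/executive) composite score.
# All values are simulated placeholders.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
n = 40                                # hypothetical number of older adults
p3_novelty = rng.normal(5.0, 1.5, n)  # P3 amplitude to novel stimuli (a.u.)
composite = 0.6 * p3_novelty + rng.normal(0, 1.0, n)  # simulated test score

r, p = pearsonr(p3_novelty, composite)
print(f"r = {r:.2f}, p = {p:.3g}")    # a positive r mirrors the reported link
```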
Intelligence quotient–adjusted memory impairment is associated with abnormal single photon emission computed tomography perfusion
Cognitive reserve among highly intelligent older individuals makes detection of early Alzheimer's disease (AD) difficult. We tested the hypothesis that mild memory impairment determined by IQ-adjusted norms is associated with single photon emission computed tomography (SPECT) perfusion abnormality at baseline and predictive of future decline. Twenty-three subjects with a Clinical Dementia Rating (CDR) score of 0 were reclassified, after scores were adjusted for IQ, into two groups: 10 as having mild memory impairment for ability (IQ-MI) and 13 as memory-normal (IQ-MN). Subjects underwent cognitive and functional assessments at baseline and annual follow-up for 3 years. Perfusion SPECT was acquired at baseline. At follow-up, the IQ-MI subjects demonstrated decline in memory, visuospatial processing, and phonemic fluency, and 6 of 10 had progressed to a CDR of 0.5, while the IQ-MN subjects did not show decline. The IQ-MI group had significantly lower perfusion than the IQ-MN group in parietal/precuneus, temporal, and opercular frontal regions. In contrast, higher perfusion was observed in IQ-MI compared with IQ-MN in the left medial frontal and rostral anterior cingulate regions. IQ-adjusted memory impairment in individuals with high cognitive reserve is associated with baseline SPECT abnormality in a pattern consistent with prodromal AD and predicts subsequent cognitive and functional decline.
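IQ-adjusted classification amounts to comparing an observed memory score against the score predicted from IQ and flagging impairment when the residual falls below a cut-off. The sketch below shows that logic; the normative regression, residual spread, and cut-off are hypothetical, not the study's norms.

```python
# A hedged sketch of IQ-adjusted memory classification: impairment is
# flagged when memory falls well below the level predicted from IQ.
# The regression line, residual SD, and cut-off are hypothetical.
def iq_adjusted_z(memory_score: float, iq: float) -> float:
    expected = 50 + 10 * 0.5 * (iq - 100) / 15  # hypothetical norm (T-score-like)
    residual_sd = 8.0                            # assumed residual spread
    return (memory_score - expected) / residual_sd

def classify(memory_score: float, iq: float, cutoff: float = -1.0) -> str:
    z = iq_adjusted_z(memory_score, iq)
    return "IQ-MI (memory-impaired for ability)" if z < cutoff else "IQ-MN"

# The same raw score can be impaired for a high-IQ individual but normal
# for an average-IQ individual, which is the point of IQ adjustment.
print(classify(memory_score=48, iq=125))  # -> IQ-MI
print(classify(memory_score=48, iq=100))  # -> IQ-MN
```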