26 research outputs found

    Non-verbal sound processing in the primary progressive aphasias

    Little is known about the processing of non-verbal sounds in the primary progressive aphasias. Here, we investigated the processing of complex non-verbal sounds in detail, in a consecutive series of 20 patients with primary progressive aphasia (12 with progressive non-fluent aphasia; 8 with semantic dementia). We designed a novel experimental neuropsychological battery to probe complex sound processing at early perceptual, apperceptive and semantic levels, using within-modality response procedures that minimized other cognitive demands, together with matched tests in the visual modality. Patients with primary progressive aphasia had deficits of non-verbal sound analysis compared with healthy age-matched individuals. Deficits of auditory early perceptual analysis were more common in progressive non-fluent aphasia; deficits of apperceptive processing occurred in both progressive non-fluent aphasia and semantic dementia; and deficits of semantic processing also occurred in both syndromes, but were relatively modality specific in progressive non-fluent aphasia and part of a more severe generic semantic deficit in semantic dementia. Patients with progressive non-fluent aphasia were more likely than patients with semantic dementia to show auditory deficits that were more severe than their visual deficits. These findings argue for the existence of core disorders of complex non-verbal sound perception and recognition in primary progressive aphasia, and for specific disorders at the perceptual and semantic levels of cortical auditory processing in progressive non-fluent aphasia and semantic dementia, respectively.

    Task and spatial frequency modulations of object processing: an EEG study.

    Visual object processing may follow a coarse-to-fine sequence imposed by fast processing of low spatial frequencies (LSF) and slow processing of high spatial frequencies (HSF). Objects can be categorized at varying levels of specificity: the superordinate (e.g. animal), the basic (e.g. dog), or the subordinate (e.g. Border Collie). We tested whether superordinate and more specific categorization depend on different spatial frequency ranges, and whether any such dependencies might be revealed by, or influence, signals recorded using EEG. We used event-related potentials (ERPs) and time-frequency (TF) analysis to examine the time course of object processing while participants performed either a grammatical gender-classification task (which generally forces basic-level categorization) or a living/non-living judgement (superordinate categorization) on everyday, real-life objects. Objects were filtered to contain only HSF or LSF. We found a greater positivity and a greater negativity for HSF than for LSF pictures in the P1 and N1, respectively, but no effects of task on either component. A later, fronto-central negativity (N350) was more negative in the gender-classification task than in the superordinate categorization task, which may indicate that this component relates to semantic or syntactic processing. We found no significant effects of task or spatial frequency on evoked or total gamma-band responses. Our results demonstrate early differences in the processing of HSF and LSF content that were not modulated by categorization task, with later responses reflecting such higher-level cognitive factors.
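
    The HSF/LSF manipulation described above relies on spatial frequency filtering of the stimulus images. As an illustration only, the following sketch splits a grayscale image into low- and high-spatial-frequency versions with a Gaussian filter in the Fourier domain; the cutoff frequency, the pixels-per-degree value and the function name are assumptions for the example, not parameters taken from the study.

```python
import numpy as np

def split_spatial_frequencies(image, cutoff_cpd, pixels_per_degree):
    """Split a grayscale image into LSF and HSF versions via a Gaussian
    low-pass filter in the Fourier domain (illustrative parameters only)."""
    h, w = image.shape
    # Frequency axes in cycles per pixel, rescaled to cycles per degree of visual angle
    fy = np.fft.fftfreq(h)[:, None] * pixels_per_degree
    fx = np.fft.fftfreq(w)[None, :] * pixels_per_degree
    radius = np.sqrt(fx ** 2 + fy ** 2)

    # Gaussian low-pass transfer function; the cutoff acts as its standard deviation
    lowpass = np.exp(-(radius ** 2) / (2 * cutoff_cpd ** 2))

    spectrum = np.fft.fft2(image)
    lsf = np.real(np.fft.ifft2(spectrum * lowpass))        # low spatial frequencies
    hsf = np.real(np.fft.ifft2(spectrum * (1 - lowpass)))  # high spatial frequencies
    return lsf, hsf

# Usage with a random test image and an assumed 2 cycles/degree cutoff
image = np.random.rand(256, 256)
lsf_img, hsf_img = split_spatial_frequencies(image, cutoff_cpd=2.0, pixels_per_degree=32)
```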

    The Role of Gamma-Band Activity in the Representation of Faces: Reduced Activity in the Fusiform Face Area in Congenital Prosopagnosia

    Congenital prosopagnosia (CP) describes an impairment in face processing that is presumably present from birth. The neuronal correlates of this dysfunction are still under debate. In the current paper, we investigate high-frequency oscillatory activity in response to faces in persons with CP. Such neuronal activity is thought to reflect higher-level representations for faces. Source localization of induced Gamma-Band Responses (iGBR) measured by magnetoencephalography (MEG) was used to establish the origin of oscillatory activity in response to famous and unknown faces, which were presented in upright and inverted orientation. Persons with CP were compared to matched controls. Corroborating earlier research, both groups revealed amplified iGBR in response to upright compared to inverted faces, predominantly in a time interval between 170 and 330 ms and in a frequency range of 50-100 Hz. Oscillatory activity in response to known faces was smaller than for unknown faces, suggesting a "sharpening" effect reflecting more efficient processing of familiar stimuli. These effects were seen in a wide cortical network encompassing temporal and parietal areas involved in the disambiguation of homogeneous stimuli such as faces, and in the retrieval of semantic information. Importantly, participants with CP displayed a strongly reduced iGBR in the left fusiform area compared to control participants. In sum, these data stress the crucial role of oscillatory activity for face representation and demonstrate the involvement of a distributed occipito-temporo-parietal network in generating iGBR. This study also provides the first evidence that persons suffering from an agnosia actually display reduced gamma-band activity. Finally, the results argue strongly against the view that oscillatory activity is a mere epiphenomenon brought forth by rapid eye movements (microsaccades).
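
    Induced (non-phase-locked) gamma-band responses of the kind analysed above are typically obtained by removing the evoked response from each trial before time-frequency decomposition. The sketch below illustrates that idea with a hand-rolled complex Morlet wavelet transform in NumPy; the sampling rate, trial dimensions, wavelet width and function names are assumptions for illustration, not the parameters of the MEG pipeline used in the study.

```python
import numpy as np

def morlet_power(signal, sfreq, freqs, n_cycles=7):
    """Time-frequency power of a 1-D signal via complex Morlet wavelet convolution."""
    power = np.empty((len(freqs), len(signal)))
    for i, f in enumerate(freqs):
        sigma_t = n_cycles / (2 * np.pi * f)                 # temporal width of the wavelet (s)
        t = np.arange(-3 * sigma_t, 3 * sigma_t, 1 / sfreq)
        wavelet = np.exp(2j * np.pi * f * t) * np.exp(-t ** 2 / (2 * sigma_t ** 2))
        wavelet /= np.sqrt(np.sum(np.abs(wavelet) ** 2))     # unit-energy normalisation
        power[i] = np.abs(np.convolve(signal, wavelet, mode="same")) ** 2
    return power

def induced_gamma(trials, sfreq, freqs):
    """Induced power: subtract the average (evoked, phase-locked) response from each
    trial, compute single-trial power, then average the power across trials."""
    evoked = trials.mean(axis=0)
    return np.mean([morlet_power(tr - evoked, sfreq, freqs) for tr in trials], axis=0)

# Usage with simulated single-trial data (trials x samples) at an assumed 1000 Hz
rng = np.random.default_rng(0)
trials = rng.standard_normal((40, 600))
gamma_freqs = np.arange(50, 101, 5)          # 50-100 Hz, the range reported above
igbr = induced_gamma(trials, sfreq=1000, freqs=gamma_freqs)
```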

    Common cortical responses evoked by appearance, disappearance and change of the human face

    Background: To segregate luminance-related, face-related and non-specific components involved in the spatio-temporal dynamics of cortical activations to a face stimulus, we recorded cortical responses to face appearance (Onset), disappearance (Offset), and change (Change) using magnetoencephalography.
    Results: Activity in and around the primary visual cortex (V1/V2) showed luminance-dependent behavior. Any of the three events evoked activity in the middle occipital gyrus (MOG) at 150 ms and the temporo-parietal junction (TPJ) at 250 ms after the onset of each event. Onset and Change activated the fusiform gyrus (FG), while Offset did not. This FG activation showed a triphasic waveform, consistent with results of intracranial recordings in humans.
    Conclusion: The analysis employed in this study successfully segregated four different elements involved in the spatio-temporal dynamics of cortical activations in response to a face stimulus. The results show the responses of MOG and TPJ to be associated with non-specific processes, such as the detection of abrupt changes or exogenous attention. Activity in FG corresponds to a face-specific response recorded by intracranial studies, and that in V1/V2 is related to a change in luminance.
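
    The segregation above rests on which of the three events (Onset, Offset, Change) drive a given cortical source. Purely as an illustration of that logic, the sketch below labels a region from its peak source amplitudes per event; the threshold, amplitudes and function name are hypothetical and not taken from the study.

```python
def classify_region(peaks, threshold):
    """Label a cortical source from which events it responds to (illustrative logic).

    `peaks` maps the event names 'onset', 'offset' and 'change' to peak source
    amplitudes; `threshold` is an assumed significance cutoff, not a value from
    the study.
    """
    active = {event for event, amp in peaks.items() if amp >= threshold}
    if active == {"onset", "offset", "change"}:
        return "non-specific (pattern reported for MOG and TPJ)"
    if active == {"onset", "change"}:
        return "face-related (pattern reported for the fusiform gyrus)"
    return "other pattern (e.g. luminance-dependent)"

# Hypothetical peak amplitudes in arbitrary units, for illustration only
print(classify_region({"onset": 12.0, "offset": 11.5, "change": 12.8}, threshold=5.0))
print(classify_region({"onset": 15.2, "offset": 2.1, "change": 14.7}, threshold=5.0))
```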

    Association and not semantic relationships elicit the N400 effect: Electrophysiological evidence from an explicit language comprehension task

    Language comprehension studies have identified the N400, an event-related potential (ERP) correlate of the processing of meaning, modulation of which is typically assumed to reflect the activation of semantic information. However, N400 studies of conscious language processing have not clearly distinguished between meaning derived from a semantic relationship and meaning extracted through association. We independently manipulated the presence of associative and semantic relationships while examining the N400 effect. Participants were asked to read and remember visually presented word pairs that shared an association (traffic–jam), an association+semantic relationship (lemon–orange), a semantic relationship alone (cereal–bread), or were unrelated (beard–tower). Modulation of the N400 (relative to unrelated word pairs) was observed for association and association+semantic word pairs, but not for those that only shared a semantic relationship.
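
    The N400 effect reported above is conventionally quantified as a mean-amplitude difference between each related condition and the unrelated baseline in a window around 400 ms. The sketch below shows that computation on simulated averaged waveforms; the 300-500 ms window, the condition labels and the data are assumptions for illustration, not the study's actual analysis parameters.

```python
import numpy as np

def n400_effect(erps, times, window=(0.3, 0.5)):
    """Mean-amplitude difference between each related condition and the unrelated
    baseline in an assumed 300-500 ms window (illustrative parameters only).

    `erps` maps condition names to averaged waveforms (in microvolts) at a single
    electrode; the window and the electrode choice are assumptions.
    """
    mask = (times >= window[0]) & (times <= window[1])
    means = {cond: wave[mask].mean() for cond, wave in erps.items()}
    baseline = means["unrelated"]
    return {cond: m - baseline for cond, m in means.items() if cond != "unrelated"}

# Simulated averaged ERPs for the four word-pair conditions (values are illustrative)
times = np.linspace(-0.1, 0.8, 451)
rng = np.random.default_rng(1)
erps = {
    "associated": rng.standard_normal(451),
    "associated+semantic": rng.standard_normal(451),
    "semantic_only": rng.standard_normal(451),
    "unrelated": rng.standard_normal(451),
}
print(n400_effect(erps, times))  # more negative values indicate a larger N400 effect
```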

    Large-scale network representations of semantics in the mental lexicon

    No full text