Sensory modality dependence of neural networks supporting simple working memory

Abstract

We used functional magnetic resonance imaging (fMRI) and event-related potentials (ERPs) to examine how the brain subserves cognitive processes when the same tasks and stimuli are presented through different modalities. The stimuli were auditory or visual bandpass-filtered white noise. On a given trial, three stimuli, each with a different center frequency, were presented in succession. For temporal sequencing tasks, participants indicated when the stimulus with the highest frequency content appeared. For comparison tasks, participants indicated whether the frequency content of the last stimulus was lower than, intermediate to, or higher than that of the first two stimuli. Task difficulty was equated by establishing 80% accuracy thresholds across subjects and tasks.

For the fMRI data, we used task partial least squares (PLS) to identify activation patterns that independently map onto stimulus modality and task demands. Next, we examined functional connections of the prefrontal cortex (PFC) using seed PLS. For each modality, the correlations between the right PFC and the rest of the brain differed both in the number of dimensions needed to capture task-dependent PFC-brain correlations and in the voxels that were functionally connected with the PFC, suggesting that brain activation patterns associated with modality and task demands interact. We then used behavioural PLS to identify neural patterns capturing the optimal association between brain images and reaction time. This analysis also identified a significant interaction between modality and task demands, indicating that task-dependent brain-behaviour correlations changed with stimulus modality.

For the ERP data, we used task PLS to identify the temporal interaction of neural networks that subserve modality and task demands. Source models for the task demand effect in each modality suggest that the spatial distributions of current sources are very similar outside the sensory cortices but differ in their time courses. Specifically, information for visual processing appears to be updated and held online in a manner different from auditory processing, which occurs mostly after the offset of the final stimulus. Overall, these results suggest that the neural substrates supporting our auditory and visual working memory tasks differ outside the sensory cortices both in activity and in interactivity.
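For readers unfamiliar with the PLS framework, the following is a minimal sketch of the mean-centred task PLS decomposition referred to above: condition-averaged images are centred across conditions and decomposed by singular value decomposition into paired design and brain saliences. The array shapes and variable names are hypothetical and this is not the analysis code used in the thesis; in practice, latent variables are typically assessed with permutation tests and the reliability of voxel saliences with bootstrap resampling.

```python
import numpy as np

# Illustrative sketch of mean-centred task PLS (not the thesis code).
# Assumed input: condition-averaged activation images with hypothetical
# shape (subjects x conditions x voxels); the condition ordering is arbitrary here.
rng = np.random.default_rng(0)
n_subj, n_cond, n_vox = 12, 4, 5000
data = rng.normal(size=(n_subj, n_cond, n_vox))   # placeholder for real images

# Average over subjects within each condition, then centre across conditions
# so the decomposition reflects condition (modality x task demand) differences.
cond_means = data.mean(axis=0)                    # (conditions x voxels)
centered = cond_means - cond_means.mean(axis=0)

# SVD yields paired latent variables: design saliences (U), brain saliences (Vt),
# and singular values (s) ordering them by the covariance each LV accounts for.
U, s, Vt = np.linalg.svd(centered, full_matrices=False)

# Brain scores: each subject/condition image projected onto a brain salience,
# showing how strongly that activation pattern is expressed in each condition.
brain_scores = data.reshape(-1, n_vox) @ Vt[0]
print(s**2 / np.sum(s**2))                        # proportion of covariance per LV
```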
