22 research outputs found

    Broca's Area, Sentence Comprehension, and Working Memory: An fMRI Study

    The role of Broca's area in sentence processing remains controversial. According to one view, Broca's area is involved in a subcomponent of syntactic processing. Another view holds that it contributes to sentence processing via verbal working memory. Sub-regions of Broca's area have been identified that are more active during the processing of complex (object-relative clause) sentences compared to simple (subject-relative clause) sentences. The present study aimed to determine whether this complexity effect can be accounted for in terms of the articulatory rehearsal component of verbal working memory. In a behavioral experiment, subjects were asked to comprehend sentences during concurrent speech articulation, which minimizes articulatory rehearsal as a resource for sentence comprehension. A finger-tapping task was used as a control concurrent task. Only the object-relative clause sentences were more difficult to comprehend during speech articulation than during the manual task, showing that articulatory rehearsal does contribute to sentence processing. A second experiment used fMRI to document the brain regions underlying this effect. Subjects judged the plausibility of sentences during speech articulation, during a finger-tapping task, or without a concurrent task. In the absence of a secondary task, Broca's area (pars triangularis and pars opercularis) demonstrated an increase in activity as a function of syntactic complexity. However, during concurrent speech articulation (but not finger-tapping) this complexity effect was eliminated in the pars opercularis, suggesting that this region supports sentence comprehension via its role in articulatory rehearsal. Activity in the pars triangularis was modulated by the finger-tapping task, but not the speech articulation task.

    An fMRI Study of Audiovisual Speech Perception Reveals Multisensory Interactions in Auditory Cortex.

    Research on the neural basis of speech-reading implicates a network of auditory language regions involving inferior frontal cortex, premotor cortex, and sites along superior temporal cortex. In audiovisual speech studies, neural activity is consistently reported in posterior superior temporal sulcus (pSTS), and this site has been implicated in multimodal integration. Traditionally, multisensory interactions are considered high-level processing that engages heteromodal association cortices (such as STS). Recent work, however, challenges this notion and suggests that multisensory interactions may occur in low-level unimodal sensory cortices. While previous audiovisual speech studies demonstrate that high-level multisensory interactions occur in pSTS, what remains unclear is how early in the processing hierarchy these multisensory interactions may occur. The goal of the present fMRI experiment was to investigate how visual speech can influence activity in auditory cortex above and beyond its response to auditory speech. In an audiovisual speech experiment, subjects were presented with auditory speech with and without congruent visual input. Holding the auditory stimulus constant across the experiment, we investigated how the addition of visual speech influences activity in auditory cortex. We demonstrate that congruent visual speech increases activity in auditory cortex.


    Group map illustrating regions significantly activated in the Audiovisual > Auditory-Speech Only contrast.

    Group activation map (N=18, false discovery rate q < 0.05) overlaid on a surface-rendered template brain.
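    The group map above is thresholded at a false discovery rate of q < 0.05. As a generic illustration (not the authors' actual analysis pipeline), the Benjamini-Hochberg procedure commonly used to apply such an FDR threshold to a set of voxelwise p-values can be sketched as follows:

    ```python
    import numpy as np

    def fdr_threshold(pvals, q=0.05):
        """Benjamini-Hochberg step-up procedure.

        Returns a boolean mask marking which p-values survive FDR
        control at level q. Generic sketch, not tied to any fMRI package.
        """
        p = np.asarray(pvals, dtype=float)
        m = p.size
        order = np.argsort(p)                      # indices that sort p ascending
        sorted_p = p[order]
        # Compare each sorted p-value to its rank-scaled criterion (k/m) * q
        below = sorted_p <= (np.arange(1, m + 1) / m) * q
        mask = np.zeros(m, dtype=bool)
        if below.any():
            k = np.max(np.nonzero(below)[0])       # largest rank meeting the criterion
            mask[order[:k + 1]] = True             # reject all hypotheses up to rank k
        return mask

    # Hypothetical p-values for five voxels
    surviving = fdr_threshold([0.001, 0.008, 0.039, 0.041, 0.3], q=0.05)
    ```

    Note the step-up logic: once the largest rank k satisfying p(k) <= (k/m)q is found, all smaller-ranked p-values are declared significant even if some individually exceed their own criterion.
    
    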

    Displays the ROI selected in each subject (N=14) overlaid on a surface-rendered template brain. Voxels were selected using a functional localizer.


    A representative subject illustrating voxels selected using an anatomically defined ROI.
