4,538 research outputs found

    Flexible recruitment of cortical networks in visual and auditory attention

    Our senses, while limited, shape our perception of the world and contribute to the functional architecture of the brain. This dissertation investigates the role of sensory modality and task demands in the cortical organization of healthy human adults using functional magnetic resonance imaging (fMRI). This research provides evidence for sensory modality bias in frontal cortical regions by directly contrasting auditory and visual sustained attention. This contrast revealed two distinct visual-biased regions in lateral frontal cortex - superior and inferior precentral sulcus (sPCS, iPCS) - anatomically interleaved with two auditory-biased regions - transverse gyrus intersecting precentral sulcus (tgPCS) and caudal inferior frontal sulcus (cIFS). Intrinsic (resting-state) functional connectivity analysis demonstrated that sPCS and iPCS fall within a broad visual-attention network, while tgPCS and cIFS fall within a broad auditory-attention network. Unisensory (auditory or visual) short-term memory (STM) tasks assessed the flexible recruitment of these sensory-biased cortical regions by varying information domain demands (e.g., spatial, temporal). While both modalities provide spatial and temporal information, vision has greater spatial resolution than audition, and audition has greater temporal precision than vision. A visual temporal, but not a spatial, STM task flexibly recruited frontal auditory-biased regions; conversely, an auditory spatial task recruited frontal visual-biased regions more strongly than an auditory temporal task did. This flexible recruitment extended to an auditory-biased superior temporal lobe region and to a subset of visual-biased parietal regions. A demanding auditory spatial STM task recruited anterior/superior visuotopic maps (IPS2-4, SPL1) along the intraparietal sulcus, but neither spatial nor temporal auditory tasks recruited posterior/inferior maps.
Finally, a comparison of visual spatial attention and STM under varied cognitive load demands attempted to further elucidate the organization of posterior parietal cortex. Parietal visuotopic maps were recruited for both visual spatial attention and working memory but demonstrated a graded response to task demands. Posterior/inferior maps (IPS0-1) demonstrated a linear relationship with the number of items attended to or remembered in the visual spatial tasks. Anterior/superior maps (IPS2-4, SPL1) demonstrated general recruitment across visual spatial cognitive tasks, with a stronger response for visual spatial attention than for STM.
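The graded load response described above can be illustrated with a toy analysis: fit a line to each map group's mean BOLD response as a function of the number of attended or remembered items and compare slopes. The numbers and ROI labels below are invented for illustration; they are not the dissertation's estimates.

```python
import numpy as np

# Hypothetical mean BOLD responses at set sizes 1-4 for two groups of maps
# (invented numbers for illustration; not the dissertation's data).
set_sizes = np.array([1, 2, 3, 4])
ips0_1 = np.array([0.20, 0.35, 0.52, 0.68])  # posterior/inferior: load-dependent
ips2_4 = np.array([0.50, 0.55, 0.54, 0.57])  # anterior/superior: near-flat

def load_slope(response, load=set_sizes):
    """Least-squares slope of mean BOLD response against load."""
    return np.polyfit(load, response, 1)[0]

print(load_slope(ips0_1))  # steep positive slope: linear load effect
print(load_slope(ips2_4))  # near-zero slope: general recruitment
```

A linear load effect then shows up as a reliably positive slope, while general recruitment shows a high intercept with a slope near zero.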

    Investigating the Neural Basis of Audiovisual Speech Perception with Intracranial Recordings in Humans

    Speech is inherently multisensory, containing auditory information from the voice and visual information from the mouth movements of the talker. Hearing the voice is usually sufficient to understand speech; however, in noisy environments or when audition is impaired due to aging or disabilities, seeing mouth movements greatly improves speech perception. Although behavioral studies have firmly established this perceptual benefit, it is still not clear how the brain processes visual information from mouth movements to improve speech perception. To clarify this issue, I studied the neural activity recorded from the brain surfaces of human subjects using intracranial electrodes, a technique known as electrocorticography (ECoG). First, I studied responses to noisy speech in the auditory cortex, specifically in the superior temporal gyrus (STG). Previous studies identified the anterior parts of the STG as unisensory, responding only to auditory stimuli. On the other hand, posterior parts of the STG are known to be multisensory, responding to both auditory and visual stimuli, making the posterior STG a key region for audiovisual speech perception. I examined how these different parts of the STG respond to clear versus noisy speech. I found that noisy speech decreased the amplitude and increased the across-trial variability of the response in the anterior STG. However, possibly due to its multisensory composition, posterior STG was not as sensitive to auditory noise as the anterior STG and responded similarly to clear and noisy speech. I also found that these two response patterns in the STG were separated by a sharp boundary demarcated by the posterior-most portion of Heschl’s gyrus. Second, I studied responses to silent speech in the visual cortex.
Previous studies demonstrated that visual cortex shows response enhancement when the auditory component of speech is noisy or absent; however, it was not clear which regions of the visual cortex specifically show this response enhancement and whether this response enhancement is a result of top-down modulation from a higher region. To test this, I first mapped the receptive fields of different regions in the visual cortex and then measured their responses to visual (silent) and audiovisual speech stimuli. I found that visual regions that have central receptive fields show greater response enhancement to visual speech, possibly because these regions receive more visual information from mouth movements. I found similar response enhancement to visual speech in frontal cortex, specifically in the inferior frontal gyrus, premotor and dorsolateral prefrontal cortices, which have been implicated in speech reading in previous studies. I showed that these frontal regions display strong functional connectivity with visual regions that have central receptive fields during speech perception.
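The two anterior-STG response measures contrasted above (mean amplitude and across-trial variability) can be sketched as follows, with synthetic traces standing in for the actual high-gamma recordings; electrode, trial counts, and noise levels are all invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_times = 60, 200

# Synthetic high-gamma traces for one hypothetical anterior-STG electrode:
# noisy speech lowers the response and makes it less consistent across trials.
clear = 1.0 + 0.2 * rng.standard_normal((n_trials, n_times))
noisy = 0.6 + 0.5 * rng.standard_normal((n_trials, n_times))

def amplitude(responses):
    """Mean response amplitude across trials and time points."""
    return responses.mean()

def trial_variability(responses):
    """Across-trial standard deviation of each trial's mean response."""
    return responses.mean(axis=1).std()

print(amplitude(clear) > amplitude(noisy))                  # True
print(trial_variability(noisy) > trial_variability(clear))  # True
```

On these synthetic data the noisy condition shows both the lower amplitude and the higher across-trial variability reported for anterior STG.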

    Large-Scale Dynamics in the Mouse Neocortex underlying Sensory Discrimination and Short-Term Memory

    Sensorimotor integration (SMI) is a fundamental process that allows for an advantageous interaction with the environment, in which key external stimuli are transformed into apt action. In mammals, SMI requires quick and synchronized activity across sensory, association and motor brain areas of the neocortex. In some situations, the key stimulus and its corresponding action are separated by a delay. In such a scenario, behaviour-relevant information must be held in short-term memory (STM) until a cue signals the adequate context to transform it into action. This thesis aims to uncover key determinants of brain activity that underlie SMI with a STM component. The first chapter offers a general introduction to the work presented in this thesis. To understand the principles of SMI, I will follow an evolutionary approach. I will explain how during SMI information flows across sensory and association areas. I will also introduce STM and the neocortical areas involved in delay activity. Next, I will emphasize the different sensing and behavioural strategies that animals use to extract action-guiding information from the world. Finally, I will propose different behavioural paradigms to study SMI and STM. Once this foundation is laid, I will introduce the methodological approach of this thesis, in particular genetically encoded calcium indicators and wide-field imaging. I will end this chapter by stating the specific aims of this thesis. The second chapter is a published manuscript to which I contributed during the first two years of my doctoral thesis work. We studied large-scale dynamics in mice trained to solve a tactile discrimination task with a STM component. We found that mice follow an active and/or passive strategy to solve this task, defined by the presence or absence of whole body movements during tactile stimulation. The movement strategy influenced ongoing brain activity, with higher and more widespread activity in active versus passive trials.
Surprisingly, this influence continued into the STM period even in the absence of movements. Active trials elicited activity during the delay period in frontomedial secondary motor cortex. In contrast, passive trials were linked with activity in posterior lateral association areas (PLA). We found these areas to be necessary for task completion in a strategy-dependent manner.

    Functional MRI investigations of cortical mechanisms of auditory spatial attention

    In everyday settings, spatial attention helps listeners isolate and understand individual sound sources. However, the neural mechanisms of auditory spatial attention (ASpA) are only partially understood. This thesis uses within-subject analysis of functional magnetic resonance imaging (fMRI) data to address fundamental questions regarding cortical mechanisms supporting ASpA by applying novel multi-voxel pattern analysis (MVPA) and resting-state functional connectivity (rsFC) approaches. A series of fMRI studies of ASpA was conducted in which subjects performed a one-back task, attending to one of two spatially separated streams. Attention modulated blood oxygenation level-dependent (BOLD) activity in multiple areas in the prefrontal, temporal, and parietal cortex, including non-visuotopic intraparietal sulcus (IPS), but not the visuotopic maps in IPS. No spatial bias was detected in any cortical area using standard univariate analysis; however, MVPA revealed that activation patterns in a number of areas, including the auditory cortex, predicted the attended direction. Furthermore, we explored how cognitive task demands and the sensory modality of the inputs influenced activity, using a visual one-back task and a visual multiple object tracking (MOT) task. Activity from the visual and auditory one-back tasks overlapped along the fundus of IPS and lateral prefrontal cortex (lPFC). However, there was minimal overlap of activity in the lPFC between the visual MOT task and the two one-back tasks. Finally, we endeavored to identify visual and auditory networks using rsFC. We identified a dorsal visual attention network reliably within individual subjects using visuotopic seeds. Using auditory seeds, we found a prefrontal area nested between segments of the dorsal visual attention network. These findings mark fundamental progress towards elucidating the cortical network controlling ASpA.
Our results suggest that similar lPFC structures support both ASpA and its visual counterpart during a spatial one-back task, but that ASpA does not drive visuotopic IPS in the parietal cortex. Furthermore, rsFC reveals that visual and auditory seed regions are functionally connected with non-overlapping lPFC regions, possibly reflecting spatial and temporal cognitive processing biases, respectively. While we find no evidence for a spatiotopic map, the auditory cortex is sensitive to the direction of attention in its patterns of activation.
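The contrast between univariate null results and successful MVPA decoding can be sketched with a simple split-half correlation classifier (Haxby-style): build per-condition voxel templates from half the trials and assign each held-out trial to the template it correlates with more strongly. The data below are synthetic and the voxel and trial counts arbitrary; this is an illustration of the technique, not the thesis's pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials, n_voxels = 40, 50

# Synthetic activation patterns: attend-left and attend-right trials share the
# same expected mean activation but differ in a distributed voxel pattern.
pattern = 0.5 * rng.standard_normal(n_voxels)
left = rng.standard_normal((n_trials, n_voxels)) + pattern
right = rng.standard_normal((n_trials, n_voxels)) - pattern

def classify(trial, template_a, template_b):
    """Assign a trial to the template it correlates with more strongly."""
    r_a = np.corrcoef(trial, template_a)[0, 1]
    r_b = np.corrcoef(trial, template_b)[0, 1]
    return "a" if r_a > r_b else "b"

def accuracy(class_a, class_b):
    """Split-half MVPA: templates from the first half, test on the second."""
    half = n_trials // 2
    ta, tb = class_a[:half].mean(axis=0), class_b[:half].mean(axis=0)
    hits = sum(classify(t, ta, tb) == "a" for t in class_a[half:])
    hits += sum(classify(t, ta, tb) == "b" for t in class_b[half:])
    return hits / (2 * (n_trials - half))

print(accuracy(left, right))  # well above the 0.5 chance level
```

Because the class difference lives in the pattern rather than the overall mean, this kind of classifier can succeed where a univariate contrast finds nothing.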

    Neural oscillatory signatures of auditory and audiovisual illusions

    Questions of the relationship between human perception and brain activity can be approached from different perspectives: in the first, the brain is mainly regarded as a recipient and processor of sensory data. The corresponding research objective is to establish mappings between neural activity patterns and external stimuli. Alternatively, the brain can be regarded as a self-organized dynamical system, whose constantly changing state affects how incoming sensory signals are processed and perceived. The research reported in this thesis can chiefly be located in the second framework, and investigates the relationship between oscillatory brain activity and the perception of ambiguous stimuli. Oscillations are here considered as a mechanism for the formation of transient neural assemblies, which allows efficient information transfer. While the relevance of activity in distinct frequency bands for auditory and audiovisual perception is well established, different functional architectures of sensory integration can be derived from the literature. This dissertation therefore aims to further clarify the role of oscillatory activity in the integration of sensory signals towards unified perceptual objects, using illusion paradigms as tools of study. In study 1, we investigate the role of low frequency power modulations and phase alignment in auditory object formation. We provide evidence that auditory restoration is associated with a power reduction, while the registration of an additional object is reflected by an increase in phase locking. In study 2, we analyze oscillatory power as a predictor of auditory influence on visual perception in the sound-induced flash illusion. We find that increased beta-/gamma-band power over occipitotemporal electrodes shortly before stimulus onset predicts the illusion, suggesting a facilitation of processing in polymodal circuits.
In study 3, we address the question of whether visual influence on auditory perception in the ventriloquist illusion is reflected in primary sensory or higher-order areas. We establish an association between reduced theta-band power in mediofrontal areas and the occurrence of the illusion, which indicates a top-down influence on sensory decision-making. These findings broaden our understanding of the functional relevance of neural oscillations by showing that different processing modes, which are reflected in specific spatiotemporal activity patterns, operate in different instances of sensory integration.
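The inter-trial phase-locking measure invoked in study 1 is conventionally computed as the length of the mean unit phase vector across trials; a minimal sketch with synthetic phase angles (the frequency band, latency, and distributions are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
n_trials = 100

# Synthetic phase angles (radians) at a fixed post-stimulus latency.
locked_phases = 0.3 * rng.standard_normal(n_trials)   # clustered near 0
random_phases = rng.uniform(-np.pi, np.pi, n_trials)  # no preferred phase

def plv(phases):
    """Inter-trial phase locking: length of the mean unit phase vector (0-1)."""
    return np.abs(np.exp(1j * phases).mean())

print(plv(locked_phases))  # near 1: strong phase alignment across trials
print(plv(random_phases))  # near 0: no alignment
```

An increase in this quantity without a change in power is what distinguishes phase alignment from the power effects also discussed above.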

    Neural Substrates of Reliability-Weighted Visual-Tactile Multisensory Integration

    As sensory systems deteriorate in aging or disease, the brain must relearn the appropriate weights to assign each modality during multisensory integration. Using blood-oxygen-level-dependent functional magnetic resonance imaging of human subjects, we tested a model for the neural mechanisms of sensory weighting, termed “weighted connections.” This model holds that the connection weights between early and late areas vary depending on the reliability of the modality, independent of the level of early sensory cortex activity. When subjects detected viewed and felt touches to the hand, a network of brain areas was active, including visual areas in lateral occipital cortex, somatosensory areas in the inferior parietal lobe, and multisensory areas in the intraparietal sulcus (IPS). In agreement with the weighted connection model, the connection weight measured with structural equation modeling between somatosensory cortex and IPS increased for somatosensory-reliable stimuli, and the connection weight between visual cortex and IPS increased for visual-reliable stimuli. This double dissociation of connection strengths was similar to the pattern of behavioral responses during incongruent multisensory stimulation, suggesting that weighted connections may be a neural mechanism for behavioral reliability weighting.
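The behavioral reliability weighting referenced above follows the standard inverse-variance (maximum-likelihood) cue-combination rule: each cue is weighted by its reliability, the inverse of its noise variance. A sketch of that rule with arbitrary noise levels follows; this illustrates the behavioral weighting, not the structural equation model itself.

```python
def reliability_weights(sigma_vis, sigma_tac):
    """Inverse-variance weights for combining visual and tactile cues."""
    r_v, r_t = 1.0 / sigma_vis**2, 1.0 / sigma_tac**2
    return r_v / (r_v + r_t), r_t / (r_v + r_t)

def combine(est_vis, est_tac, sigma_vis, sigma_tac):
    """Reliability-weighted combined estimate of the touch location."""
    w_v, w_t = reliability_weights(sigma_vis, sigma_tac)
    return w_v * est_vis + w_t * est_tac

# Visual-reliable condition: low visual noise, so vision dominates.
w_v, w_t = reliability_weights(sigma_vis=1.0, sigma_tac=3.0)
print(w_v, w_t)                      # ≈0.9, ≈0.1
print(combine(0.0, 10.0, 1.0, 3.0))  # ≈1.0: pulled only slightly toward touch
```

Swapping the noise levels reverses the weights, mirroring the double dissociation of connection strengths described in the abstract.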

    Fractionating the anterior temporal lobe: MVPA reveals differential responses to input and conceptual modality

    Words activate cortical regions in accordance with their modality of presentation (i.e., written vs. spoken), yet there is a long-standing debate about whether patterns of activity in any specific brain region capture modality-invariant conceptual information. Deficits in patients with semantic dementia highlight the anterior temporal lobe (ATL) as an amodal store of semantic knowledge, but these studies do not permit precise localisation of this function. The current investigation used multiple imaging methods in healthy participants to examine functional dissociations within ATL. Multi-voxel pattern analysis identified spatially segregated regions: a response to input modality in anterior superior temporal gyrus (aSTG) and a response to meaning in more ventral anterior temporal lobe (vATL). This functional dissociation was supported by resting-state connectivity that found greater coupling for aSTG with primary auditory cortex and vATL with the default mode network. A meta-analytic decoding of these connectivity patterns implicated aSTG in processes closely tied to auditory processing (such as phonology and language) and vATL in meaning-based tasks (such as comprehension or social cognition). Thus, we provide converging evidence for the segregation of meaning and input modality in the ATL.
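Seed-based resting-state connectivity of the kind reported above reduces to correlating ROI time courses and comparing coupling across targets. The sketch below uses synthetic time courses standing in for the aSTG/vATL seeds and their target regions; the shared-signal construction and noise levels are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
n_tr = 300  # resting-state timepoints (TRs)

# Hypothetical ROI time courses: aSTG shares a source with auditory cortex,
# vATL shares a source with a default-mode node (synthetic data).
shared_aud, shared_dmn = rng.standard_normal((2, n_tr))
astg = shared_aud + 0.8 * rng.standard_normal(n_tr)
auditory = shared_aud + 0.8 * rng.standard_normal(n_tr)
vatl = shared_dmn + 0.8 * rng.standard_normal(n_tr)
dmn = shared_dmn + 0.8 * rng.standard_normal(n_tr)

def rsfc(seed, target):
    """Seed-based functional connectivity: Pearson correlation of time courses."""
    return np.corrcoef(seed, target)[0, 1]

print(rsfc(astg, auditory), rsfc(astg, dmn))  # strong vs. weak coupling
print(rsfc(vatl, dmn), rsfc(vatl, auditory))  # strong vs. weak coupling
```

The double dissociation in these correlations is the signature the abstract describes: each seed couples preferentially with its own network.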

    Functional imaging of response selection

    The functions of the prefrontal cortex remain controversial. Electrophysiological and lesion studies in monkeys have emphasised a role in working memory. In contrast, human functional neuroimaging studies and neuropsychology have emphasised a role in executive processes and volition. An alternative interpretation of the role of the prefrontal cortex is proposed in this thesis: that the prefrontal cortex mediates the attentional selection of sensory, mnemonic and motor representations in non-prefrontal cortex. This hypothesis is tested in a series of functional imaging experiments. In the first two experiments (chapters 4 and 5), event-related functional magnetic resonance imaging (fMRI) was used to re-examine the role of the prefrontal cortex in spatial and spatio-temporal working memory. Maintenance of information in memory was associated with activation of posterior prefrontal cortex (area 8). In contrast, the selection of an item from several remembered items was associated with activation of the middle and anterior parts of the prefrontal cortex (including area 46). To test the generalisation of 'selection' as a function of prefrontal cortex, experiment three (chapter 6) required subjects to select either a finger to move, or a colour from a multicolour display. Free selection was associated with activation of the prefrontal cortex (area 46) bilaterally, regardless of sensory or motor modality. The selection of voluntary actions has been proposed to depend on top-down modulation of motor regions by prefrontal cortex. The fourth and fifth experiments used structural equation modelling of fMRI time-series to measure the effective connectivity among prefrontal, premotor and parietal cortex. In young (chapter 7) and old (chapter 8) normal subjects, attention to action specifically enhanced coupling between prefrontal and premotor regions. This effect was not seen in patients with Parkinson's disease (chapter 8).
Lastly, positron emission tomography was used to study planning in the Tower of London task, a common clinical measure of prefrontal function. Several variants of the task were developed to distinguish the neural basis of the task's multiple cognitive components (chapter 9). The prefrontal cortex was activated in association with generation, selection or memory for moves, rather than planning towards a specified goal. The results support a generalised role in attentional selection of neuronal representations, whether stimuli, actions, or remembered items. The hypothesised attentional selection of responses is consistent with the activation of prefrontal cortex in working memory tasks and during attention to voluntary action. This role is compatible with the neurophysiological properties of individual neurons in the prefrontal cortex and the results of neuroimaging and lesion studies.
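The effective-connectivity result above (enhanced prefrontal-premotor coupling under attention to action) was obtained with structural equation modelling over an anatomical model. As a loose stand-in for one path of such a model, the condition-dependent coupling can be illustrated by comparing standardized regression weights between conditions; all time courses below are synthetic and the coupling strengths invented.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 400

# Hypothetical fMRI time courses (synthetic): the prefrontal signal drives the
# premotor signal more strongly when subjects attend to their actions.
prefrontal = rng.standard_normal(n)
attend = 0.8 * prefrontal + 0.6 * rng.standard_normal(n)    # strong coupling
baseline = 0.2 * prefrontal + 0.6 * rng.standard_normal(n)  # weak coupling

def path_coefficient(source, target):
    """Standardized regression weight of target on source: a minimal,
    single-path stand-in for a structural equation model coefficient."""
    slope = np.polyfit(source, target, 1)[0]
    return slope * source.std() / target.std()

print(path_coefficient(prefrontal, attend))    # larger under attention
print(path_coefficient(prefrontal, baseline))  # smaller at baseline
```

A full SEM would estimate all paths simultaneously under the anatomical model and compare fits between conditions; the point here is only that "enhanced coupling" means a larger condition-specific path weight.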