Neural Mechanisms of Selective Auditory Attention in Rats (Dissertation)
How does attention modulate sensory representations? To probe the underlying neural mechanisms, we established a simple rodent model of modality-specific attention. Rats were trained to perform distinct auditory two-tone discrimination and olfactory odor discrimination tasks in a two-alternative choice (2AC) paradigm.
To determine the auditory cortex's role in this frequency discrimination task, we used the GABA-A receptor agonist muscimol to transiently and reversibly inactivate auditory cortex bilaterally in rats performing interleaved auditory and olfactory discriminations. With olfactory discrimination performance serving as an internal control for motivation and decision-making capability, we found that only auditory two-tone discrimination was impaired in these rats, showing that the auditory cortex is involved in the two-tone discrimination task.
To investigate the neural correlates of modality-specific attention in the auditory cortex, we trained rats to perform interleaved auditory and olfactory blocks (50–70 trials each) in a single session. In auditory blocks, pure tones were presented either with or without a neutral odor (caproic acid; n = 2 and 3, respectively), and subjects were rewarded for discriminating the auditory stimuli. In olfactory blocks, task odors and pure tones were presented simultaneously, and subjects were rewarded for discriminating the olfactory stimuli. We recorded neural responses in primary auditory cortex (area A1) of freely moving rats while they performed this behavior. Single-unit responses to tones were heterogeneous, including transient, sustained, and suppressed response types. Of 802 recorded units, 205 were responsive to the stimuli used. Of these 205 units, 18.5% showed modality-specific attentional modulation of anticipatory activity before tone onset. A smaller proportion of units (11.2%) showed modality-specific attentional modulation of tone-evoked responses; in most cases, the response to a particular auditory stimulus was enhanced in the auditory block (or, equivalently, suppressed in the olfactory block). Attention increased the choice probability of the population in the auditory block, and a small proportion of units showed significant behavioral choice probability.
Our results suggest that shifting attention between auditory and olfactory tasks can modulate the activity of single neurons in primary auditory cortex.
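The choice probability mentioned above is conventionally an ROC-based measure: the area under the ROC curve comparing a unit's spike-count distributions on trials grouped by the animal's choice. As a minimal sketch (not the authors' actual analysis pipeline; the function name and spike counts below are hypothetical), it can be computed directly with the Mann-Whitney formulation of the ROC area:

```python
import numpy as np

def choice_probability(counts_a, counts_b):
    """ROC-based choice probability: the probability that an ideal
    observer could predict the animal's choice from this unit's spike
    count on a single trial. 0.5 means firing is unrelated to choice."""
    counts_a = np.asarray(counts_a, dtype=float)
    counts_b = np.asarray(counts_b, dtype=float)
    # The Mann-Whitney U statistic gives the ROC area directly:
    # P(count on a choice-A trial > count on a choice-B trial),
    # with ties counted as 1/2.
    diff = counts_a[:, None] - counts_b[None, :]
    return (np.sum(diff > 0) + 0.5 * np.sum(diff == 0)) / diff.size

# Hypothetical spike counts, trials grouped by the rat's 2AC choice
left_choice_counts = [12, 15, 9, 14, 11]
right_choice_counts = [7, 10, 8, 6, 9]
cp = choice_probability(left_choice_counts, right_choice_counts)
```

Values near 0.5 indicate no relation between spike count and choice, while values approaching 1 (or 0) indicate that spike counts predict the upcoming choice; significance is typically assessed by permuting the choice labels.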
Relationships between human auditory cortical structure and function
The human auditory cortex comprises multiple areas, largely distributed across the supratemporal plane, but the precise number and configuration of auditory areas and their functional significance have not yet been clearly established. In this paper, we discuss recent research concerning architectonic and functional organisation within the human auditory cortex, as well as architectonic and neurophysiological studies in non-human species, which can provide a broad conceptual framework for interpreting functional specialisation in humans. We review the pattern of functional responses in human auditory cortex to various acoustic cues, such as frequency, pitch, sound level, temporal variation, motion and spatial location, and we discuss their correspondence to what is known about the organisation of the auditory cortex in other primates. There is some neuroimaging evidence of multiple tonotopically organised fields in humans and of functional specialisations of these fields in the processing of different sound features. It is thought that the primary area, on Heschl's gyrus, may have a larger involvement in processing basic sound features, such as frequency and level, and that posterior non-primary areas on the planum temporale may play a larger role in processing more spectrotemporally complex sounds. We also discuss ways in which current knowledge of auditory cortical organisation and different data-analysis approaches may benefit future functional neuroimaging studies that seek to link auditory cortical structure and function.
Audiovisual temporal correspondence modulates human multisensory superior temporal sulcus plus primary sensory cortices
The brain should integrate related but not unrelated information from different senses. The temporal patterning of inputs to different modalities may provide critical information about whether those inputs are related. We studied the effects of temporal correspondence between auditory and visual streams on human brain activity with functional magnetic resonance imaging (fMRI). Streams of visual flashes with irregularly jittered, arrhythmic timing could appear on the right or left, with or without a stream of auditory tones that either coincided perfectly with the flashes (highly unlikely by chance) or was noncoincident with vision (a different erratic, arrhythmic pattern with the same temporal statistics); an auditory stream could also appear alone. fMRI revealed blood-oxygenation-level-dependent (BOLD) increases in multisensory superior temporal sulcus (mSTS), contralateral to a visual stream when coincident with an auditory stream, and BOLD decreases for noncoincidence relative to unisensory baselines. Contralateral primary visual cortex and auditory cortex were also affected by audiovisual temporal correspondence or noncorrespondence, as confirmed in individuals. Connectivity analyses indicated enhanced influence from mSTS on primary sensory areas, rather than vice versa, during audiovisual correspondence. Thus, temporal correspondence between auditory and visual streams affects a network of both multisensory (mSTS) and sensory-specific areas in humans, including even primary visual and auditory cortex, with stronger responses for corresponding, and thus related, audiovisual inputs.
Contextual modulation of primary visual cortex by auditory signals
Early visual cortex receives non-feedforward input from lateral and top-down connections (Muckli & Petro 2013 Curr. Opin. Neurobiol. 23, 195–201. (doi:10.1016/j.conb.2013.01.020)), including long-range projections from auditory areas. Early visual cortex can code for high-level auditory information, with neural patterns representing natural sound stimulation (Vetter et al. 2014 Curr. Biol. 24, 1256–1262. (doi:10.1016/j.cub.2014.04.020)). We discuss a number of questions arising from these findings. What is the adaptive function of bimodal representations in visual cortex? What type of information projects from auditory to visual cortex? What are the anatomical constraints of auditory information in V1, for example, periphery versus fovea, superficial versus deep cortical layers? Is there a putative neural mechanism we can infer from human neuroimaging data and recent theoretical accounts of cortex? We also present data showing we can read out high-level auditory information from the activation patterns of early visual cortex even when visual cortex receives simple visual stimulation, suggesting independent channels for visual and auditory signals in V1. We speculate which cellular mechanisms allow V1 to be contextually modulated by auditory input to facilitate perception, cognition and behaviour. Beyond cortical feedback that facilitates perception, we argue that there is also feedback serving counterfactual processing during imagery, dreaming and mind wandering, which is not relevant for immediate perception but for behaviour and cognition over a longer time frame.
This article is part of the themed issue ‘Auditory and visual scene analysis’.
Auditory perception modulated by word reading
Theories of embodied cognition, which posit that sensorimotor areas are indispensable during language comprehension, are supported by neuroimaging and behavioural studies. Among other findings, the auditory system has been suggested to be important for understanding (visually presented) sound-related words, and the motor system for action-related words. In this behavioural study, using a sound-detection task embedded in a lexical decision task, we show that in participants with high lexical decision performance, sound verbs improve auditory perception. The amount of modulation was correlated with lexical decision performance. Our study provides convergent behavioural evidence of auditory cortex involvement in word processing, supporting embodied accounts of language comprehension in the auditory domain.
Heschl's gyrus is more sensitive to tone level than non-primary auditory cortex
Previous neuroimaging studies generally demonstrate a growth in the cortical response with an increase in sound level. However, the details of the shape and topographic location of such growth remain largely unknown. One limiting methodological factor has been the relatively sparse sampling of sound intensities. Additionally, most studies have either analysed the entire auditory cortex without differentiating primary and non-primary regions, or have limited their analyses to Heschl's gyrus (HG). Here, we characterise the pattern of responses to a 300-Hz tone, presented in 6-dB steps from 42 to 96 dB sound pressure level, as a function of its sound level within three anatomically defined auditory areas: the primary area, on HG, and two non-primary areas, a small area lateral to the axis of HG (the anterior lateral area, ALA) and the posterior part of auditory cortex (the planum temporale, PT). The extent and magnitude of auditory activation increased non-linearly with sound level, and both were more sensitive to increasing level in HG than in ALA and PT. Thus, HG appears to have a larger involvement in sound-level processing than does ALA or PT.
The state of tranquility: Subjective perception is shaped by contextual modulation of auditory connectivity
In this study, we investigated brain mechanisms for the generation of subjective experience from objective sensory inputs. Our experimental construct was subjective tranquility, a mental state more likely to occur in the presence of objective sensory inputs that arise from natural features in the environment. We used functional magnetic resonance imaging to examine the neural response to scenes that were visually distinct (beach images vs. freeway images) and experienced as tranquil (beach) or non-tranquil (freeway). Both sets of scenes had the same auditory component, because waves breaking on a beach and vehicles moving on a freeway can produce similar auditory spectral and temporal characteristics, perceived as a constant roar. Compared with scenes experienced as non-tranquil, subjectively tranquil scenes were associated with significantly greater effective connectivity between the auditory cortex and medial prefrontal cortex, a region implicated in the evaluation of mental states. Similarly enhanced connectivity was also observed between the auditory cortex and posterior cingulate gyrus, temporoparietal cortex and thalamus. These findings demonstrate that visual context can modulate connectivity of the auditory cortex with regions implicated in the generation of subjective states. Importantly, this effect arises under conditions of identical auditory input; hence, the same sound may be associated with different percepts, reflecting varying connectivity between the auditory cortex and other brain regions. This suggests that subjective experience is more closely linked to the connectivity state of the auditory cortex than to its basic sensory inputs.
Anatomical pathways for auditory memory II: information from rostral superior temporal gyrus to dorsolateral temporal pole and medial temporal cortex
Auditory recognition memory in non-human primates differs from recognition memory in other sensory systems. Monkeys learn the rule for visual and tactile delayed matching-to-sample within a few sessions, and then show one-trial recognition memory lasting 10–20 min. In contrast, monkeys require hundreds of sessions to master the rule for auditory recognition, and then show retention lasting no longer than 30–40 s. Moreover, unlike the severe effects of rhinal lesions on visual memory, such lesions have no effect on the monkeys' auditory memory performance. The anatomical pathways for auditory memory may therefore differ from those for vision. Long-term visual recognition memory requires anatomical connections between visual association area TE and areas 35 and 36 of the perirhinal cortex (PRC). We examined whether there is a similar anatomical route for auditory processing, or whether poor auditory recognition memory may instead reflect the lack of such a pathway. Our hypothesis is that an auditory pathway for recognition memory originates in the higher-order processing areas of the rostral superior temporal gyrus (rSTG), and then connects via the dorsolateral temporal pole to access the rhinal cortex of the medial temporal lobe. To test this, we placed retrograde (3% FB and 2% DY) and anterograde (10% BDA, 10,000 MW) tracer injections in the rSTG and the dorsolateral area 38DL of the temporal pole. Results showed that area 38DL receives dense projections from auditory association areas Ts1, TAa and TPO of the rSTG, from the rostral parabelt and, to a lesser extent, from areas Ts2-3 and PGa. In turn, area 38DL projects densely to area 35 of PRC, to entorhinal cortex (EC), and to areas TH/TF of the posterior parahippocampal cortex. Significantly, this projection avoids most of area 36r/c of PRC. This anatomical arrangement may contribute to our understanding of the poor auditory memory of rhesus monkeys.
