    Integration of auditory and visual communication information in the primate ventrolateral prefrontal cortex

    The integration of auditory and visual stimuli is crucial for recognizing objects, communicating effectively, and navigating through our complex world. Although the frontal lobes are involved in memory, communication, and language, there has been no evidence that the integration of communication information occurs at the single-cell level in the frontal lobes. Here, we show that neurons in the macaque ventrolateral prefrontal cortex (VLPFC) integrate audiovisual communication stimuli. The multisensory interactions included both enhancement and suppression of a predominantly auditory or a predominantly visual response, although multisensory suppression was the more common mode of response. The multisensory neurons were distributed across the VLPFC and within previously identified unimodal auditory and visual regions (O’Scalaidhe et al., 1997; Romanski and Goldman-Rakic, 2002). Thus, our study demonstrates for the first time that single prefrontal neurons integrate communication information from the auditory and visual domains, suggesting that these neurons are an important node in the cortical network responsible for communication.
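    Enhancement and suppression of this kind are conventionally quantified with a multisensory interaction index that compares the bimodal response against the strongest unimodal response. Below is a minimal sketch of that standard calculation; the firing rates and variable names are illustrative, not values from the study.

```python
def multisensory_index(audiovisual, best_unimodal):
    """Percent change of the bimodal response relative to the strongest
    unimodal response: positive = enhancement, negative = suppression."""
    return 100.0 * (audiovisual - best_unimodal) / best_unimodal

# Illustrative firing rates (spikes/s) for one hypothetical VLPFC neuron.
auditory, visual = 12.0, 7.0   # responses to each unimodal stimulus
audiovisual = 9.5              # response to the combined stimulus

idx = multisensory_index(audiovisual, max(auditory, visual))
print(f"interaction index: {idx:+.1f}%")   # negative here, i.e. suppression
```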

    Neural Correlates of Auditory Perceptual Awareness and Release from Informational Masking Recorded Directly from Human Cortex: A Case Study.

    In complex acoustic environments, even salient supra-threshold sounds sometimes go unperceived, a phenomenon known as informational masking. The neural basis of informational masking (and release from it) has not been well characterized, particularly outside auditory cortex. We combined electrocorticography in a neurosurgical patient undergoing invasive epilepsy monitoring with trial-by-trial perceptual reports of isochronous target-tone streams embedded in random multi-tone maskers. Awareness of such masker-embedded target streams was associated with a focal negativity between 100 and 200 ms and high-gamma activity (HGA) between 50 and 250 ms (both in auditory cortex on the posterolateral superior temporal gyrus), as well as a broad P3b-like potential (between ~300 and 600 ms) with generators in ventrolateral frontal and lateral temporal cortex. Unperceived target tones elicited drastically reduced versions of these responses, if any at all. While it remains unclear whether these responses reflect conscious perception itself, as opposed to pre- or post-perceptual processing, the results suggest that conscious perception of target sounds in complex listening environments may engage diverse neural mechanisms in distributed brain areas.
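    High-gamma activity of the kind reported here is typically extracted by band-pass filtering the ECoG signal and taking its analytic amplitude. A minimal sketch, assuming a 70-150 Hz high-gamma band, a 1 kHz sampling rate, and simulated data (none of these specifics come from the study):

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 1000                            # sampling rate in Hz (assumed)
t = np.arange(-0.2, 0.6, 1 / fs)     # peri-stimulus time in seconds
ecog = np.random.randn(t.size)       # stand-in for one electrode, one trial

# Band-pass in a common high-gamma range, then take the Hilbert envelope.
b, a = butter(4, [70, 150], btype="bandpass", fs=fs)
envelope = np.abs(hilbert(filtfilt(b, a, ecog)))

# Mean HGA in the reported 50-250 ms window, relative to a pre-stimulus baseline.
win = (t >= 0.05) & (t <= 0.25)
base = (t >= -0.2) & (t < 0.0)
print(envelope[win].mean() / envelope[base].mean())
```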

    Neural Mechanisms of Selective Auditory Attention in Rats (Dissertation)

    How does attention modulate sensory representations? To probe the underlying neural mechanisms, we established a simple rodent model of modality-specific attention. Rats were trained to perform an auditory two-tone discrimination and an olfactory odor discrimination in a two-alternative choice (2AC) paradigm.
To determine the auditory cortex's role in the frequency discrimination task, we used the GABA-A receptor agonist muscimol to transiently and reversibly inactivate auditory cortex bilaterally while rats performed interleaved auditory and olfactory discriminations. With olfactory discrimination performance serving as an internal control for motivation and decision-making capability, we found that only the auditory two-tone discrimination was impaired in these rats, showing that the auditory cortex is involved in this task.
To investigate the neural correlates of modality-specific attention in the auditory cortex, we trained rats to perform interleaved auditory and olfactory blocks (50-70 trials each) in a single session. In auditory blocks, pure tones were presented either with or without a neutral odor (caproic acid; n = 2 and 3 rats, respectively), and subjects were rewarded for discriminating the auditory stimuli. In olfactory blocks, both task odors and pure tones were presented simultaneously, and subjects were rewarded for discriminating the olfactory stimuli. We recorded neural responses in primary auditory cortex (area A1) while freely moving rats performed this behavior. Single-unit responses to tones were heterogeneous, including transient, sustained, and suppressed patterns. Of the 802 units recorded, 205 were responsive to the stimuli we used. Of these 205 units, 18.5% showed modality-specific attentional modulation of anticipatory activity before tone onset. A smaller proportion of units (11.2%) showed modality-specific attentional modulation of tone-evoked responses; in most cases, the response to a particular auditory stimulus was enhanced in the auditory block (or, equivalently, suppressed in the olfactory block). Attention increased the choice probability of the population in the auditory block, and we also observed significant behavioral choice probability in a small proportion of units.
Our results suggest that shifting attention between auditory and olfactory tasks can modulate the activity of single neurons in primary auditory cortex.
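Choice probability is conventionally computed as the area under the ROC curve separating a unit's spike-count distributions on trials grouped by the animal's choice. A minimal sketch with simulated spike counts (the trial numbers and rates are illustrative, not taken from the dissertation):

```python
import numpy as np

def choice_probability(counts_a, counts_b):
    """ROC area between two spike-count distributions; 0.5 means the
    unit's firing carries no information about the upcoming choice."""
    a = np.asarray(counts_a)[:, None]
    b = np.asarray(counts_b)[None, :]
    return np.mean(a > b) + 0.5 * np.mean(a == b)  # all pairwise comparisons

# Illustrative spike counts from one hypothetical A1 unit, split by choice.
rng = np.random.default_rng(0)
left = rng.poisson(8, size=40)     # trials where the rat chose left
right = rng.poisson(10, size=35)   # trials where the rat chose right
print(f"CP = {choice_probability(right, left):.2f}")
```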

    Investigating the Neural Basis of Audiovisual Speech Perception with Intracranial Recordings in Humans

    Speech is inherently multisensory, containing auditory information from the voice and visual information from the mouth movements of the talker. Hearing the voice is usually sufficient to understand speech; however, in noisy environments, or when audition is impaired due to aging or disability, seeing the talker's mouth movements greatly improves speech perception. Although behavioral studies have well established this perceptual benefit, it is still not clear how the brain processes visual information from mouth movements to improve speech perception. To clarify this issue, I studied neural activity recorded from the brain surfaces of human subjects using intracranial electrodes, a technique known as electrocorticography (ECoG). First, I studied responses to noisy speech in the auditory cortex, specifically in the superior temporal gyrus (STG). Previous studies identified the anterior parts of the STG as unisensory, responding only to auditory stimuli. On the other hand, posterior parts of the STG are known to be multisensory, responding to both auditory and visual stimuli, which makes them a key region for audiovisual speech perception. I examined how these different parts of the STG respond to clear versus noisy speech. I found that noisy speech decreased the amplitude and increased the across-trial variability of the response in the anterior STG. However, possibly because of its multisensory composition, the posterior STG was not as sensitive to auditory noise as the anterior STG and responded similarly to clear and noisy speech. I also found that these two response patterns in the STG were separated by a sharp boundary demarcated by the posterior-most portion of Heschl’s gyrus. Second, I studied responses to silent speech in the visual cortex. Previous studies demonstrated that the visual cortex shows response enhancement when the auditory component of speech is noisy or absent; however, it was not clear which regions of the visual cortex specifically show this enhancement, or whether it results from top-down modulation by a higher-order region. To test this, I first mapped the receptive fields of different regions in the visual cortex and then measured their responses to visual (silent) and audiovisual speech stimuli. I found that visual regions with central receptive fields show greater response enhancement to visual speech, possibly because these regions receive more visual information from mouth movements. I found similar response enhancement to visual speech in the frontal cortex, specifically in the inferior frontal gyrus and the premotor and dorsolateral prefrontal cortices, which have been implicated in speechreading by previous studies. I showed that these frontal regions display strong functional connectivity with visual regions that have central receptive fields during speech perception.
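    A simple way to express the functional connectivity described in the last result is the correlation of single-trial response amplitudes between two electrodes. The sketch below uses simulated amplitudes for two hypothetical electrodes; it illustrates the general approach rather than the author's actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(1)
# Simulated single-trial response amplitudes for two hypothetical
# electrodes: one frontal (IFG) and one over foveal visual cortex.
shared = rng.normal(size=100)             # shared trial-to-trial variability
ifg = shared + 0.5 * rng.normal(size=100)
visual = shared + 0.5 * rng.normal(size=100)

# Trial-by-trial amplitude correlation as a connectivity estimate.
r = np.corrcoef(ifg, visual)[0, 1]
print(f"trial-by-trial correlation: r = {r:.2f}")
```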

    Auditory Discrimination and Auditory Sensory Behaviours in Autism Spectrum Disorders

    It has been hypothesised that auditory processing may be enhanced in autism spectrum disorders (ASD). We tested auditory discrimination ability in 72 adolescents with ASD (39 childhood autism; 33 other ASD) and 57 IQ- and age-matched controls, assessing their capacity to discriminate the frequency, intensity, and duration differences in pairs of sounds. At the group level, auditory discrimination ability did not differ between the adolescents with and without ASD. However, we found a subgroup of 20% of individuals in the ASD group who showed ‘exceptional’ frequency discrimination skills (defined as 1.65 SDs above the control mean) and who were characterised by average intellectual ability and delayed language onset. Auditory sensory behaviours (i.e. behaviours in response to auditory sensory input) are common in ASD, and we hypothesised that these would relate to auditory discrimination ability. In the ASD group, poor performers on the intensity discrimination task reported more auditory sensory behaviours associated with coping with loudness levels. Conversely, those who performed well on the duration discrimination task reported more auditory sensory behaviours across the full range measured. Frequency discrimination ability was not associated with auditory sensory behaviours. We therefore conclude that (i) enhanced frequency discrimination is present in around 1 in 5 individuals with ASD and may represent a specific phenotype; and (ii) individual differences in auditory discrimination ability in ASD may influence the expression of auditory sensory behaviours by modulating the degree to which sounds are detected or missed in the environment.
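    The ‘exceptional’ criterion is a simple threshold on the control distribution. A minimal sketch, with made-up scores (higher = better discrimination) standing in for the real data:

```python
import numpy as np

def exceptional_mask(asd_scores, control_scores, sd_cutoff=1.65):
    """Flag participants whose score exceeds the control mean by more
    than `sd_cutoff` control standard deviations."""
    control_scores = np.asarray(control_scores)
    threshold = control_scores.mean() + sd_cutoff * control_scores.std(ddof=1)
    return np.asarray(asd_scores) > threshold

rng = np.random.default_rng(2)
controls = rng.normal(50, 10, size=57)   # made-up control scores
asd = rng.normal(52, 14, size=72)        # made-up ASD scores
print(f"{exceptional_mask(asd, controls).mean():.0%} flagged as exceptional")
```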

    The effect of listening tasks and motor responding on activation in the auditory cortex

    Previous human functional magnetic resonance imaging (fMRI) research has shown that activation in the auditory cortex (AC) is strongly modulated by motor influences. Other fMRI studies have indicated that the AC is also modulated by attention-engaging listening tasks. How these motor- and task-related activation modulations relate to each other has, however, not been studied previously. The current understanding of the functional organization of the human AC is strongly based on primate models. However, some authors have recently questioned the correspondence between monkey and human cognitive systems, and whether the monkey AC can serve as a model for the human AC. Further, it is unknown whether active listening modulates activations similarly in the human and non-human primate AC. Non-human primate fMRI studies are therefore important, yet they have previously been impeded by the difficulty of teaching tasks to non-human primates. The present thesis consists of three studies in which fMRI was used both to investigate the relationship between the effects of active listening and motor responding in the human AC and to investigate task-related activation modulations in the monkey AC. Study I investigated the effect of manual responding on activation in the human AC during auditory and visual tasks, whereas Study II focused on whether auditory-motor effects interact with those related to active listening tasks in the AC and adjacent regions. In Study III, a novel paradigm was developed and used during fMRI to investigate auditory task-dependent modulations in the monkey AC. The results of Study I showed that activation in the human AC is strongly suppressed when subjects respond to targets using precision or power grips, during both visual and auditory tasks. AC activation was also modulated by grip type during the auditory task but not during the visual task (with identical stimuli and motor responses). These manual-motor effects were distinct from the general attention-related modulations revealed by comparing activation during auditory and visual tasks. Study II showed that activation in widespread regions of the AC and inferior parietal lobule (IPL) depends on whether subjects respond to target vowel pairs with vocal or manual responses. Furthermore, activation in the posterior AC and the IPL depends on whether subjects respond by overtly repeating the last vowel of a target pair or by producing a given response vowel. Discrimination tasks activated superior temporal gyrus (STG) regions more strongly than 2-back tasks, while the IPL was activated more strongly by 2-back tasks. These task-related (discrimination vs. 2-back) modulations were distinct from the response-type effects in the AC; however, task and motor-response-type effects interacted in the IPL. Together, the results of Studies I and II support the view that operations in the AC are shaped by its connections with motor cortical regions and that regions in the posterior AC are important for auditory-motor integration. These studies further suggest that the task, motor-response-type, and vocal-response-type effects arise from independent mechanisms in the AC. In Study III, a novel reward-cue paradigm was developed to teach macaque monkeys to perform an auditory task. Using this paradigm, monkeys learned the task in a few weeks, whereas in previous studies auditory task training has required months or years.
This new paradigm was then used during fMRI to measure activation in the monkey AC during active auditory task performance. The results showed that activation in the monkey AC is modulated during this task in a manner similar to that previously seen in human auditory attention studies. The findings of Study III are an important step toward bridging the gap between human and animal studies of the AC.

    Noise-induced cochlear neuronal degeneration and its role in hyperacusis- and tinnitus-like behavior

    Thesis (Ph.D. in Speech and Hearing Bioscience and Technology), Harvard-MIT Program in Health Sciences and Technology, 2013. Includes bibliographical references (p. 46-57). Perceptual abnormalities such as hyperacusis and tinnitus often occur following acoustic overexposure. Although such exposure can also result in permanent threshold elevation, some individuals with noise-induced hyperacusis or tinnitus show clinically normal thresholds. Recent work in animals has shown that noise exposure can cause permanent degeneration of the cochlear nerve despite complete threshold recovery and a lack of hair cell damage (Kujawa and Liberman, J Neurosci 29:14077-14085, 2009). Here we ask whether this noise-induced primary neuronal degeneration results in abnormal auditory behavior, indexed by the acoustic startle response and prepulse inhibition (PPI) of startle. Responses to tones and to broadband noise were measured in mice exposed either to a neuropathic noise causing primary neuronal degeneration or to a lower-intensity, non-neuropathic noise, and in unexposed controls. Mice with cochlear neuronal loss displayed hyper-responsivity to sound, as evidenced by lower startle thresholds and enhanced PPI, while exposed mice without neuronal loss showed control-like responses. Gap-PPI tests, often used to assess tinnitus, revealed spectrally restricted as well as broadband gap-detection deficits in mice with primary neuronal degeneration, but not in exposed mice without neuropathy. Crossmodal PPI tests and behavioral assays of anxiety revealed no significant differences among groups, suggesting that the changes in startle-based auditory behavior reflect a neuropathy-related alteration specifically of auditory neural pathways. Despite a significantly reduced cochlear nerve response, seen as a reduced wave 1 of the auditory brainstem response, later peaks were unchanged or enhanced, suggesting neural hyperactivity in the auditory brainstem that could underlie the abnormal behavior on the startle tests. Taken together, the results suggest a role for cochlear primary neuronal degeneration in central neural excitability and, by extension, in the generation of tinnitus and hyperacusis. By Ann E. Hickox.
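    Prepulse inhibition in such startle paradigms is conventionally expressed as the percent reduction of the startle response when a prepulse (or, in gap-PPI, a silent gap in the background noise) precedes the startle pulse. A minimal sketch of that standard formula, with illustrative magnitudes:

```python
def percent_ppi(startle_alone, startle_with_prepulse):
    """Standard PPI measure: percent reduction of the startle response
    when the startle pulse is preceded by a prepulse (or gap)."""
    return 100.0 * (1.0 - startle_with_prepulse / startle_alone)

# Illustrative startle magnitudes (arbitrary units) for one mouse.
print(f"{percent_ppi(startle_alone=1.00, startle_with_prepulse=0.55):.0f}% PPI")
```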

    Cortical mechanisms for tinnitus in humans

    This work sought to characterise the neurochemical and neurophysiological processes underlying tinnitus in humans. The first study involved invasive brain recordings from a neurosurgical patient, along with experimental manipulation of his tinnitus, to map the cortical system underlying it. Widespread tinnitus-linked changes in low- and high-frequency oscillations were observed, along with inter-regional and cross-frequency patterns of communication. The second and third studies compared tinnitus patients to controls matched for age, sex, and hearing loss, measuring auditory cortex spontaneous oscillations (with magnetoencephalography) and neurochemical concentrations (with magnetic resonance spectroscopy), respectively. Unlike in previous studies not controlled for hearing loss, there were no group differences in oscillatory activity attributable to tinnitus. However, there was a significant correlation between gamma oscillations (>30 Hz) and hearing loss in the tinnitus group, and between delta oscillations (1-4 Hz) and perceived tinnitus loudness. In the neurochemical study, tinnitus patients had significantly reduced GABA concentrations compared to matched controls, and within this group choline concentration (potentially linked to acetylcholine and/or neuronal plasticity) correlated positively with hearing loss and with subjective tinnitus intensity and distress. In light of the present and previous findings, tinnitus may be best explained by a predictive coding model of perception, which was tested in the final experiment. This experiment directly controlled the three main quantities comprising predictive coding models and found that delta/theta/alpha oscillations (1-12 Hz) encoded the precision of predictions, beta oscillations (12-30 Hz) encoded changes to predictions, and gamma oscillations represented surprise (the unexpectedness of stimuli given predictions). The work concludes with a predictive coding model of tinnitus that builds on the present findings and settles unresolved paradoxes in the literature. In this model, precursor processes (in varying combinations) synergise to increase the precision associated with spontaneous activity in the auditory pathway to the point where it overrides higher-level predictions of ‘silence’. Funding: Medical Research Council, Wellcome Trust, and the National Institutes of Health.
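    The oscillatory bands delimited in this abstract (delta/theta/alpha 1-12 Hz, beta 12-30 Hz, gamma >30 Hz) can be summarized as band-limited power taken from a sensor's power spectral density. A minimal sketch using Welch's method on simulated data; the sampling rate and the upper gamma edge are assumptions, not values from the thesis:

```python
import numpy as np
from scipy.signal import welch

fs = 600                             # MEG sampling rate in Hz (assumed)
signal = np.random.randn(60 * fs)    # stand-in for one sensor's recording

bands = {"delta/theta/alpha": (1, 12), "beta": (12, 30), "gamma": (30, 100)}

freqs, psd = welch(signal, fs=fs, nperseg=2 * fs)
for name, (lo, hi) in bands.items():
    sel = (freqs >= lo) & (freqs < hi)
    power = np.trapz(psd[sel], freqs[sel])   # integrate PSD over the band
    print(f"{name:>18}: {power:.4f}")
```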

    Exploring the Structural and Functional Organization of the Dorsal Zone of Auditory Cortex in Hearing and Deafness

    Recent neuroscientific research has focused on cortical plasticity, the ability of the cerebral cortex to adapt as a consequence of experience. Over the past decade, an increasing number of studies have convincingly shown that the brain can adapt to the loss or impairment of a sensory system, resulting in expanded or heightened abilities of the remaining senses. A particular region of cat auditory cortex, the dorsal zone (DZ), has been shown to mediate enhanced visual motion detection in deaf animals. The purpose of this thesis is to further our understanding of the structure and function of DZ in both hearing and deaf animals, in order to better understand how the brain compensates following insult or injury to a sensory system, with the ultimate goal of improving the utility of sensory prostheses. First, I demonstrate that the brain connectivity profile of animals with early- and late-onset deafness is similar to that of hearing animals, but that the projection strength to visual brain regions involved in motion processing increases as a consequence of deafness. Second, I evaluate the functional impact of the strongest auditory connections to area DZ using reversible deactivation and electrophysiological recordings, showing that projections ultimately originating in primary auditory cortex (A1) form much of the basis of the response of DZ neurons to auditory stimulation. Third, I show that almost half of the neurons in DZ are influenced by visual or somatosensory information, and I further demonstrate that this modulation by other sensory systems can have effects that are opposite in direction during different portions of the auditory response. I also show that techniques that pool the responses of multiple neurons, such as multi-unit and local field potential recordings, may vastly overestimate the degree to which multisensory processing occurs in a given brain region. Finally, I confirm that individual neurons in DZ become responsive mainly to visual stimulation following deafness. Together, these results shed light on the function and structural organization of area DZ in both hearing and deaf animals, and will contribute to the development of a comprehensive model of cross-modal plasticity.