5 research outputs found

    Opposite patterns of hemisphere dominance for early auditory processing of lexical tones and consonants

    In tonal languages such as Mandarin Chinese, a lexical tone carries semantic information and is preferentially processed in the left brain hemisphere of native speakers, as revealed by functional MRI and positron emission tomography studies, which likely measure temporally aggregated neural events, including those at an attentive stage of auditory processing. Here, we demonstrate that early auditory processing of a lexical tone at a preattentive stage is actually lateralized to the right hemisphere. We frequently presented to native Mandarin Chinese speakers a meaningful auditory word with a consonant-vowel structure and infrequently varied either its lexical tone or initial consonant in an oddball paradigm, creating a contrast that changed the word's meaning. The lexical tone contrast evoked a stronger preattentive response, as revealed by whole-head electric recordings of the mismatch negativity, in the right hemisphere than in the left hemisphere, whereas the consonant contrast produced the opposite pattern. Given the distinct acoustic features of lexical tones and consonants, this opposite lateralization pattern suggests that hemisphere dominance depends mainly on acoustic cues before speech input is mapped onto a semantic representation in the processing stream.
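    The oddball design described above, frequent standard words with rare tone or consonant deviants, can be sketched as a trial-sequence generator. This is a minimal illustration: the trial count, deviant probability, and the no-consecutive-deviants constraint are common mismatch-negativity conventions assumed here, not parameters reported in the abstract.

```python
import random

def oddball_sequence(n_trials=400, deviant_prob=0.15, seed=0):
    """Generate an oddball trial sequence: frequent standards, rare deviants.

    Deviants never occur back-to-back, a common constraint in
    mismatch-negativity (MMN) designs. All numbers are illustrative.
    """
    rng = random.Random(seed)
    seq = []
    for _ in range(n_trials):
        if seq and seq[-1] != "standard":
            # force a standard after each deviant
            seq.append("standard")
        elif rng.random() < deviant_prob:
            # a deviant differs from the standard in lexical tone or
            # initial consonant, changing the word's meaning
            seq.append(rng.choice(["tone_deviant", "consonant_deviant"]))
        else:
            seq.append("standard")
    return seq
```

    The forced standard after each deviant means the realized deviant rate ends up slightly below `deviant_prob`.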

    Optical Brain Imaging Reveals General Auditory and Language-Specific Processing in Early Infant Development

    This study uses near-infrared spectroscopy in young infants to elucidate the nature of functional cerebral processing for speech. Previous imaging studies of infants' speech perception revealed left-lateralized responses to the native language. However, it is unclear whether these activations were due to language per se rather than to some low-level acoustic correlate of spoken language. Here we compare native (L1) and non-native (L2) languages with three different nonspeech conditions, including emotional voices, monkey calls, and phase-scrambled sounds, which provide more stringent controls. Hemodynamic responses to these stimuli were measured in the temporal areas of Japanese 4-month-olds. The results show clear left-lateralized responses to speech, most prominently to L1, as opposed to varied activation patterns in the nonspeech conditions. Furthermore, implementing a new analysis method designed for infants, we discovered a slower hemodynamic time course in awake infants. Our results are largely explained by signal-driven auditory processing. However, stronger activations to L1 than to L2 indicate a language-specific neural factor that modulates these responses. This study is the first to discover a significantly higher sensitivity to L1 in 4-month-olds and reveals a neural precursor of the functional specialization for the higher cognitive network.

    Spectro-temporal modulation transfer function of single voxels in the human auditory cortex measured with high-resolution fMRI

    Are visual and auditory stimuli processed by similar mechanisms in the human cerebral cortex? Images can be thought of as light energy modulations over two spatial dimensions, and low-level visual areas analyze images by decomposition into spatial frequencies. Similarly, sounds are energy modulations over time and frequency, and they can be identified and discriminated by the content of such modulations. An obvious question is therefore whether human auditory areas, in direct analogy to visual areas, represent the spectro-temporal modulation content of acoustic stimuli. To answer this question, we measured spectro-temporal modulation transfer functions of single voxels in the human auditory cortex with functional magnetic resonance imaging. We presented dynamic ripples, complex broadband stimuli with a drifting sinusoidal spectral envelope. Dynamic ripples are the auditory equivalent of the gratings often used in studies of the visual system. We demonstrate selective tuning to combined spectro-temporal modulations in the primary and secondary auditory cortex. We describe several types of modulation transfer functions, extracting different spectro-temporal features, with a high degree of interaction between spectral and temporal parameters. The overall low-pass modulation rate preference of the cortex matches the modulation content of natural sounds. These results demonstrate that combined spectro-temporal modulations are represented in the human auditory cortex, and suggest that complex signals are decomposed and processed according to their modulation content, the same transformation used by the visual system.
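    A dynamic ripple of the kind described above, a broadband sound whose sinusoidal spectral envelope drifts over time, can be synthesized by summing many tones spaced on a log-frequency axis and modulating their amplitudes with the moving envelope. A minimal sketch; the ripple density (cycles/octave), velocity (Hz), and component counts are illustrative assumptions, not the values used in the study.

```python
import numpy as np

def dynamic_ripple(dur=1.0, fs=16000, f_lo=250.0, n_oct=5.0, n_comp=100,
                   density=1.0, velocity=4.0, depth=0.9, seed=0):
    """Synthesize a dynamic ripple stimulus.

    density  : ripple density in cycles/octave (spectral modulation)
    velocity : ripple drift rate in Hz (temporal modulation)
    depth    : modulation depth, 0..1
    Parameter values are illustrative only.
    """
    rng = np.random.default_rng(seed)
    t = np.arange(int(dur * fs)) / fs
    # random component positions, in octaves above f_lo (log-frequency axis)
    x = rng.uniform(0.0, n_oct, n_comp)
    freqs = f_lo * 2.0 ** x
    phases = rng.uniform(0.0, 2 * np.pi, n_comp)
    sig = np.zeros_like(t)
    for xi, f, ph in zip(x, freqs, phases):
        # sinusoidal envelope drifting across log-frequency over time
        env = 1.0 + depth * np.sin(2 * np.pi * (velocity * t + density * xi))
        sig += env * np.sin(2 * np.pi * f * t + ph)
    return sig / np.max(np.abs(sig))  # normalize to unit peak
```

    Sweeping `density` and `velocity` over a grid while recording responses is what yields a spectro-temporal modulation transfer function for a voxel.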

    Global versus local processing of frequency-modulated tones in gerbils: An animal model of lateralized auditory cortex functions

    Hemispheric asymmetries of speech and music processing might arise from more basic specializations of left and right auditory cortex (AC). It is not clear, however, whether such asymmetries are unique to humans, i.e., consequences of speech and music, or whether comparable lateralized AC functions exist in nonhuman animals as evolutionary precursors. Here, we investigated the cortical lateralization of perception of linearly frequency-modulated (FM) tones in gerbils, a rodent species with human-like low-frequency hearing. Using a footshock-reinforced shuttle-box avoidance go/no-go procedure in a total of 178 gerbils, we found that (i) the discrimination of direction of continuous FM (rising versus falling sweeps, 250-ms duration) was impaired by right but not left AC lesions; (ii) the discrimination of direction of segmented FM (50-ms segments, 50-ms silent gaps, total duration 250 ms) was impaired by bilateral but not unilateral AC lesions; (iii) the discrimination of gap durations (10–30 ms) in segmented FM was impaired by left but not right AC lesions. AC lesions before and after training resulted in similar effects. Together, these experiments suggest that right and left AC, even in rodents, use different strategies in analyzing FM stimuli. Thus, the right AC, by using global cues, determines the direction of continuous and segmented FM but cannot discriminate gap durations. The left AC, by using local cues, discriminates gap durations and determines FM direction only when additional segmental information is available.
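    The continuous and segmented FM stimuli described above can be sketched as follows. The durations follow the abstract (250-ms sweeps, 50-ms segments and gaps), while the sample rate and sweep frequencies are illustrative assumptions.

```python
import numpy as np

def fm_sweep(f_start, f_end, dur=0.25, fs=44100):
    """Continuous linear FM sweep; rising if f_end > f_start, else falling."""
    t = np.arange(int(dur * fs)) / fs
    k = (f_end - f_start) / dur
    # instantaneous phase of a linear chirp: 2*pi*(f_start*t + k*t^2/2)
    return np.sin(2 * np.pi * (f_start * t + 0.5 * k * t * t))

def segmented_fm(f_start, f_end, dur=0.25, seg=0.05, gap=0.05, fs=44100):
    """Chop a continuous sweep into segments separated by silent gaps.

    With 50-ms segments and 50-ms gaps over 250 ms this yields three
    audible segments (0-50, 100-150, 200-250 ms), preserving the global
    sweep direction while adding local gap cues.
    """
    sweep = fm_sweep(f_start, f_end, dur, fs)
    mask = np.zeros_like(sweep)
    t0 = 0.0
    while t0 < dur:
        i0, i1 = int(t0 * fs), int(min(t0 + seg, dur) * fs)
        mask[i0:i1] = 1.0
        t0 += seg + gap
    return sweep * mask
```

    Varying the gap duration in `segmented_fm` (e.g., 10 to 30 ms) would produce the local-cue discrimination stimuli, while comparing `fm_sweep(f1, f2)` against `fm_sweep(f2, f1)` gives the global direction contrast.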