
    Reading in two writing systems: Accommodation and assimilation of the brain's reading network

    Bilingual reading can require more than knowing two languages. Learners must also acquire the writing conventions of their second language, which can differ in its deep mapping principles (writing system) and its visual configurations (script). We review ERP (event-related potential) and fMRI studies of both Chinese-English bilingualism and Chinese second language learning that bear on the system accommodation hypothesis: the neural networks acquired for one system must be modified to accommodate the demands of a new system. Bilingual ERP studies demonstrate temporal indicators of the brain's experience with L1 and L2 and with the frequency of encounters with words in L2. ERP learning studies show that early visual processing differences between L1 and L2 diminish during a second term of study. fMRI studies of learning converge in finding that learners recruit bilateral occipital-temporal and middle frontal areas when reading Chinese, similar to the pattern of native speakers and different from alphabetic reading. The evidence suggests an asymmetry: alphabetic readers have a neural network that accommodates the demands of Chinese by recruiting neural structures less needed for alphabetic reading, whereas Chinese readers have a neural network that partly assimilates English into the Chinese system, especially in the visual stages of word identification. © Cambridge University Press 2007.
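
    As a rough illustration of how an early visual ERP difference between L1 and L2 words might be quantified, the sketch below averages amplitude in an early post-stimulus window across trials. The window, sampling rate, and data are hypothetical and are not taken from the reviewed studies.

```python
import numpy as np

# Hypothetical ERP epochs: (n_trials, n_samples) at 500 Hz with a 200 ms pre-stimulus baseline.
fs, pre_ms = 500, 200

def window_mean(epochs, start_ms, end_ms):
    """Mean amplitude in a post-stimulus window, e.g. an early visual (N170-like) window."""
    i0 = int((pre_ms + start_ms) / 1000 * fs)
    i1 = int((pre_ms + end_ms) / 1000 * fs)
    return epochs[:, i0:i1].mean(axis=1)

rng = np.random.default_rng(3)
l1_epochs = rng.normal(size=(40, 500))   # placeholders for baseline-corrected L1 word trials
l2_epochs = rng.normal(size=(40, 500))   # placeholders for baseline-corrected L2 word trials
print(window_mean(l1_epochs, 150, 200).mean(), window_mean(l2_epochs, 150, 200).mean())
```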

    Enhanced activation of the left inferior frontal gyrus in deaf and dyslexic adults during rhyming

    Hearing developmental dyslexics and profoundly deaf individuals both have difficulties processing the internal structure of words (phonological processing) and learning to read. In hearing non-impaired readers, the development of phonological representations depends on audition. In hearing dyslexics, many argue that auditory processes may be impaired. In congenitally profoundly deaf individuals, auditory speech processing is essentially absent. Two separate literatures have previously reported enhanced activation in the left inferior frontal gyrus in both deaf and dyslexic adults, relative to hearing non-dyslexics, during reading or phonological tasks. Here, we used a rhyme judgement task to compare adults from these two special populations with a hearing non-dyslexic control group. All groups were matched on non-verbal intelligence quotient, reading age and rhyme performance. Picture stimuli were used because they require participants to generate their own phonological representations, rather than have them partially provided via text. By testing well-matched groups of participants on the same task, we aimed to establish whether the previous literatures reporting differences between individuals with and without phonological processing difficulties have identified the same regions of differential activation in these two distinct populations. The data indicate greater activation in the deaf and dyslexic groups than in the hearing non-dyslexic group across a large portion of the left inferior frontal gyrus. This includes the pars triangularis, extending superiorly into the middle frontal gyrus and posteriorly to include the pars opercularis and the junction with the ventral precentral gyrus. Within the left inferior frontal gyrus, there was variability between the two groups with phonological processing difficulties. The superior posterior tip of the left pars opercularis, extending into the precentral gyrus, was activated to a greater extent by deaf than by dyslexic participants, whereas the superior posterior portion of the pars triangularis, extending into the ventral pars opercularis, was activated to a greater extent by dyslexic than by deaf participants. Whether these regions play differing roles in compensating for poor phonological processing is not clear. However, we argue that our main finding of greater inferior frontal gyrus activation in both groups with phonological processing difficulties, relative to controls, suggests greater reliance on the articulatory component of speech during phonological processing when auditory processes are absent (deaf group) or impaired (dyslexic group). Thus, the brain appears to develop a similar solution to a processing problem that has different antecedents in these two populations.
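
    As a rough illustration of the reported group effect, the sketch below compares hypothetical left inferior frontal gyrus contrast estimates between each group with phonological processing difficulties and the hearing control group. The values and group sizes are invented; this is not the study's analysis.

```python
import numpy as np
from scipy import stats

# Hypothetical ROI estimates: mean left-IFG contrast values (rhyme > baseline) per participant.
rng = np.random.default_rng(1)
deaf = rng.normal(1.0, 0.5, 20)
dyslexic = rng.normal(0.9, 0.5, 20)
controls = rng.normal(0.5, 0.5, 20)

# Greater activation in each phonologically impaired group than in hearing controls.
for name, group in [("deaf", deaf), ("dyslexic", dyslexic)]:
    t, p = stats.ttest_ind(group, controls)
    print(f"{name} vs controls: t={t:.2f}, p={p:.3f}")
```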

    Structural correlates of semantic and phonemic fluency ability in first and second languages

    Category and letter fluency tasks are commonly used clinically to investigate the semantic and phonological processes central to speech production, but the neural correlates of these processes are difficult to establish with functional neuroimaging because of the relatively unconstrained nature of the tasks. This study investigated whether differential performance on semantic (category) and phonemic (letter) fluency in neurologically normal participants was reflected in regional gray matter density. The participants were 59 highly proficient speakers of two languages. Our findings corroborate the importance of the left inferior temporal cortex in semantic relative to phonemic fluency and show this effect to be the same in a first language (L1) and a second language (L2). Additionally, we show that the pre-supplementary motor area (pre-SMA) and the head of the caudate bilaterally are associated with phonemic more than semantic fluency, and that this effect is stronger for L2 than for L1 in the caudate nuclei. To further validate these structural results, we reanalyzed previously reported functional data and found that pre-SMA and left caudate activation was higher for phonemic than for semantic fluency. On the basis of our findings, we also predict that lesions to the pre-SMA and caudate nuclei may have a greater impact on phonemic than on semantic fluency, particularly in L2 speakers.
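
    For illustration, a minimal sketch of the kind of voxelwise regression that relates gray matter density to a semantic-minus-phonemic fluency difference score across participants. The data and variable names are hypothetical, and this is a generic VBM-style analysis, not the study's pipeline.

```python
import numpy as np

# Hypothetical data: gm is (n_subjects, n_voxels) smoothed gray matter density,
# fluency_diff is each subject's semantic-minus-phonemic fluency score.
rng = np.random.default_rng(0)
n_subjects, n_voxels = 59, 5000
gm = rng.normal(size=(n_subjects, n_voxels))
fluency_diff = rng.normal(size=n_subjects)

# Design matrix: intercept + fluency difference (covariates such as age could be added).
X = np.column_stack([np.ones(n_subjects), fluency_diff])
beta, _, _, _ = np.linalg.lstsq(X, gm, rcond=None)

# t-statistic for the fluency regressor at every voxel.
resid = gm - X @ beta
dof = n_subjects - X.shape[1]
sigma2 = (resid ** 2).sum(axis=0) / dof
se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
t_map = beta[1] / se
print(t_map.shape)  # one t-value per voxel; thresholding/correction would follow
```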

    Subthalamic Nucleus and Sensorimotor Cortex Activity During Speech Production

    The sensorimotor cortex is somatotopically organized to represent the vocal tract articulators such as the lips, tongue, larynx, and jaw. How speech and articulatory features are encoded at the subcortical level, however, remains largely unknown. We analyzed LFP recordings from the subthalamic nucleus (STN) and simultaneous electrocorticography recordings from the sensorimotor cortex of 11 human subjects (1 female) with Parkinson's disease during implantation of deep-brain stimulation (DBS) electrodes while they read aloud three-phoneme words. The initial phonemes involved articulation primarily with either the tongue (coronal consonants) or the lips (labial consonants). We observed significant increases in high-gamma (60–150 Hz) power in both the STN and the sensorimotor cortex that began before speech onset and persisted for the duration of speech articulation. As expected from previous reports, in the sensorimotor cortex the primary articulators involved in the production of the initial consonants were topographically represented by high-gamma activity. We found that STN high-gamma activity also demonstrated specificity for the primary articulator, although no clear topography was observed. In general, subthalamic high-gamma activity varied along the ventral–dorsal trajectory of the electrodes, with greater high-gamma power recorded at the dorsal locations of the STN. Interestingly, the majority of significant articulator-discriminative activity in the STN occurred before that in the sensorimotor cortex. These results demonstrate that articulator-specific speech information is contained within high-gamma activity of the STN, but with a different spatial and temporal organization compared with similar information encoded in the sensorimotor cortex.
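
    A minimal sketch of how such a high-gamma (60–150 Hz) power estimate can be obtained from a single LFP or ECoG trace with a band-pass filter and Hilbert envelope, then epoched around speech onset. This is an assumed, generic pipeline for illustration, not the study's code; the sampling rate, channel data, and onset times are hypothetical.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def high_gamma_power(signal, fs, band=(60.0, 150.0)):
    """Band-pass the raw trace and return the squared Hilbert envelope,
    a common estimate of high-gamma power (assumed pipeline, not the paper's)."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, signal)
    return np.abs(hilbert(filtered)) ** 2

def epoch_around(power, fs, onsets_s, pre_s=0.5, post_s=1.0):
    """Cut the power trace into trials aligned to speech-onset times (in seconds)."""
    pre, post = int(pre_s * fs), int(post_s * fs)
    return np.array([power[int(t * fs) - pre:int(t * fs) + post] for t in onsets_s])

# Hypothetical usage: one STN LFP channel sampled at 1 kHz, three word onsets.
fs = 1000.0
lfp = np.random.randn(int(60 * fs))            # placeholder for a recorded trace
power = high_gamma_power(lfp, fs)
trials = epoch_around(power, fs, onsets_s=[5.2, 12.8, 21.4])
baseline = trials[:, :int(0.5 * fs)].mean(axis=1, keepdims=True)
print((trials / baseline).mean(axis=0).shape)  # trial-averaged, baseline-normalized power
```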

    The Impact of Second Language Learning on Semantic and Nonsemantic First Language Reading

    The relationship between orthography (spelling) and phonology (speech sounds) varies across alphabetic languages. Consequently, learning to read a second alphabetic language that uses the same letters as the first increases the number of phonological associations that can be linked to the same orthographic units. In subjects with English as their first language, previous functional imaging studies have reported increased left ventral prefrontal activation for reading words with spellings that are inconsistent with their orthographic neighbors (e.g., PINT) compared with words that are consistent with their orthographic neighbors (e.g., SHIP). Here, using functional magnetic resonance imaging (fMRI) in 17 Italian–English and 13 English–Italian bilinguals, we demonstrate that left ventral prefrontal activation for first language reading increases with second language vocabulary knowledge. This suggests that learning a second alphabetic language changes the way that words are read in the first alphabetic language. Specifically, first language reading is more reliant on both lexical/semantic and nonlexical processing when new orthographic-to-phonological mappings are introduced by second language learning. Our observations were made in a context that required participants to switch between languages. They motivate future fMRI studies to test whether first language reading is also altered in contexts in which the second language is not in use.
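
    A minimal sketch of the reported relationship: correlating a per-subject measure of left ventral prefrontal activation during first-language reading with a second-language vocabulary score. The data and variable names are hypothetical; the study's actual model is not reproduced here.

```python
import numpy as np
from scipy import stats

# Hypothetical per-subject values: left ventral prefrontal activation during
# first-language reading and a second-language vocabulary score.
rng = np.random.default_rng(2)
l2_vocab = rng.uniform(20, 80, 30)
vpfc_activation = 0.01 * l2_vocab + rng.normal(0, 0.2, 30)

r, p = stats.pearsonr(l2_vocab, vpfc_activation)
print(f"r={r:.2f}, p={p:.3f}")  # a positive r would mirror the reported relationship
```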

    A Specialized Odor Memory Buffer in Primary Olfactory Cortex

    The neural substrates of olfactory working memory are unknown. We addressed whether olfactory working memory involves a verbal representation of the odor, a sensory image of the odor, or both, and where the neural substrates of these processes are located. We used functional magnetic resonance imaging to measure activity in the brains of subjects who were remembering either nameable or unnameable odorants. We found a double dissociation whereby remembering nameable odorants was reflected in sustained activity in prefrontal language areas, whereas remembering unnameable odorants was reflected in sustained activity in primary olfactory cortex. These findings suggest a novel dedicated mechanism in primary olfactory cortex, where odor information is maintained in temporary storage to subserve ongoing tasks.
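
    A rough sketch of how the reported double dissociation could be summarized: mean delay-period signal per region of interest and odor-nameability condition, with a crossover pattern indicating the dissociation. All values are illustrative; this is not the study's analysis.

```python
import numpy as np

# Hypothetical delay-period estimates: mean BOLD signal change per subject for each
# ROI (prefrontal language area, primary olfactory cortex) and condition
# (nameable vs. unnameable odorants). Values are illustrative only.
rng = np.random.default_rng(4)
n = 16
data = {
    ("prefrontal", "nameable"):   rng.normal(0.6, 0.2, n),
    ("prefrontal", "unnameable"): rng.normal(0.2, 0.2, n),
    ("olfactory",  "nameable"):   rng.normal(0.2, 0.2, n),
    ("olfactory",  "unnameable"): rng.normal(0.6, 0.2, n),
}

# A double dissociation shows up as an ROI-by-condition crossover in the means.
for (roi, cond), values in data.items():
    print(f"{roi:10s} {cond:10s} mean sustained signal = {values.mean():.2f}")
```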