
    Sign and speech share partially overlapping conceptual representations

    Conceptual knowledge is fundamental to human cognition. Yet the extent to which it is influenced by language is unclear. Studies of semantic processing show that similar neural patterns are evoked by the same concepts presented in different modalities (e.g. spoken words and pictures or text) [1–3]. This suggests that conceptual representations are ‘modality independent’. However, an alternative possibility is that the similarity reflects retrieval of common spoken language representations. Indeed, in hearing spoken language users, text and spoken language are co-dependent [4,5] and pictures are encoded via visual and verbal routes [6]. A parallel approach investigating semantic cognition shows that bilinguals activate similar patterns for the same words in their different languages [7,8]. This suggests that conceptual representations are ‘language independent’. However, this has only been tested in spoken language bilinguals. If different languages evoke different conceptual representations, this should be most apparent when comparing languages that differ greatly in structure. Hearing people with signing deaf parents are bilingual in sign and speech: languages conveyed in different modalities. Here we test the influence of modality and bilingualism on conceptual representation by comparing semantic representations elicited by spoken British English and British Sign Language in hearing early sign-speech bilinguals. We show that representations of semantic categories are shared for sign and speech, but not for individual spoken words and signs. This provides evidence for partially shared representations for sign and speech, and shows that language acts as a subtle filter through which we understand and interact with the world.
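
    The cross-modality comparisons described above are typically carried out with pattern-similarity methods. Below is a minimal, hedged sketch in Python of a representational similarity analysis (RSA) style comparison; the array shapes, variable names and random placeholder data are illustrative assumptions, not the study's actual pipeline.

    import numpy as np
    from scipy.spatial.distance import pdist
    from scipy.stats import spearmanr

    rng = np.random.default_rng(0)
    n_concepts, n_voxels = 20, 500                              # hypothetical sizes
    speech_patterns = rng.normal(size=(n_concepts, n_voxels))   # placeholder data
    sign_patterns = rng.normal(size=(n_concepts, n_voxels))     # placeholder data

    # Representational dissimilarity matrix (RDM): correlation distance between
    # the neural patterns evoked by each pair of concepts, within one language.
    rdm_speech = pdist(speech_patterns, metric="correlation")
    rdm_sign = pdist(sign_patterns, metric="correlation")

    # If conceptual structure is shared across sign and speech, the two RDMs
    # should be correlated; rank correlation is used because the distances are
    # not assumed to share a common scale.
    rho, p = spearmanr(rdm_speech, rdm_sign)
    print(f"speech-sign RDM similarity: rho={rho:.3f}, p={p:.3f}")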

    Inconsistent language lateralisation - testing the dissociable language laterality hypothesis using behaviour and lateralised cerebral blood flow

    Background Most people have strong left-brain lateralisation for language, with a minority showing right- or bilateral language representation. On some receptive language tasks, however, lateralisation appears to be reduced or absent. This contrasting pattern raises the question of whether and how language laterality may fractionate within individuals. Building on our prior work, we postulated (a) that there can be dissociations in lateralisation of different components of language, and (b) that these would be more common in left-handers. A subsidiary hypothesis was that laterality indices would cluster according to two underlying factors corresponding to whether they involve generation of words or sentences vs. receptive language. Methods We tested these predictions in two stages: at Step 1, an online laterality battery (Dichotic Listening, Rhyme Decision and Word Comprehension) was given to 621 individuals (56% left-handers); at Step 2, functional transcranial Doppler ultrasound (fTCD) was used with 230 of these individuals (51% left-handers). Of these, 108 left-handers and 101 right-handers gave usable data on a battery of three language generation and three receptive language tasks. Results Neither the online nor the fTCD measures supported the notion of a single language laterality factor. In general, for both online and fTCD measures, tests of language generation were left-lateralised. In contrast, the receptive tasks were at best weakly left-lateralised or, in the case of Word Comprehension, slightly right-lateralised. The online measures were only weakly correlated, if at all, with fTCD measures. Most of the fTCD measures had split-half reliabilities of at least .7, and showed a distinctive pattern of intercorrelation, supporting a modified two-factor model in which Phonological Decision (generation) and Sentence Decision (reception) loaded on both factors. The same factor structure fitted data from left- and right-handers, but mean scores on the two factors were lower (less left-lateralised) in left-handers. Conclusions There are at least two factors influencing language lateralisation in individuals, but they do not correspond neatly to language generation and comprehension. Future fMRI studies could help clarify how far they reflect activity in specific brain regions.
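
    For readers unfamiliar with fTCD laterality indices, the Python sketch below shows one common way such an index can be computed: trial-averaged, baseline-corrected blood flow velocity in the left and right middle cerebral arteries is compared within a task window. The window boundaries, data shapes and the split-half scheme are illustrative assumptions rather than the exact parameters of this study.

    import numpy as np

    def laterality_index(left, right, baseline, interest):
        """left, right: (n_trials, n_samples) blood flow velocity from the two
        probes; baseline, interest: (start, stop) sample indices. A positive
        value indicates left lateralisation."""
        l, r = left.mean(axis=0), right.mean(axis=0)     # trial averages
        # Express each channel as percent change from its own baseline so the
        # two probes are comparable despite different absolute velocities.
        l = 100 * (l - l[slice(*baseline)].mean()) / l[slice(*baseline)].mean()
        r = 100 * (r - r[slice(*baseline)].mean()) / r[slice(*baseline)].mean()
        # The LI is the mean left-minus-right difference in the task window.
        return (l[slice(*interest)] - r[slice(*interest)]).mean()

    def split_half_lis(left, right, baseline, interest):
        # Odd- and even-trial LIs; correlating these across participants (with
        # a Spearman-Brown correction) gives a split-half reliability estimate.
        return (laterality_index(left[1::2], right[1::2], baseline, interest),
                laterality_index(left[0::2], right[0::2], baseline, interest))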

    Deaf readers benefit from lexical feedback during orthographic processing

    It has been proposed that poor reading abilities in deaf readers might be related to weak connections between the orthographic and lexical-semantic levels of processing. Here we used event-related potentials (ERPs), known for their excellent time resolution, to examine whether lexical feedback modulates early orthographic processing. Twenty congenitally deaf readers made lexical decisions to target words and pseudowords. Each of those target stimuli could be preceded by a briefly presented matched-case or mismatched-case identity prime (e.g., ALTAR-ALTAR vs. altar-ALTAR). Results showed an early effect of case overlap at the N/P150 for all targets. Critically, this effect disappeared for words but not for pseudowords at the N250, an ERP component sensitive to orthographic processing. This dissociation in the effect of case for word and pseudoword targets provides strong evidence of early automatic lexical-semantic feedback modulating orthographic processing in deaf readers. Interestingly, despite the dissociation found in the ERP data, behavioural responses to words still benefited from the physical overlap between prime and target, particularly in less skilled readers and those with less experience with words. Overall, our results support the idea that skilled deaf readers have a stronger connection between the orthographic and the lexical-semantic levels of processing.
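
    The key ERP contrast can be illustrated with a short Python sketch: mean amplitude in an N250-like window, compared between matched-case and mismatched-case primes for each target type. The electrode selection, time window and array shapes are hypothetical placeholders, not the study's analysis parameters.

    import numpy as np

    def mean_amplitude(epochs, times, window=(0.200, 0.300)):
        """epochs: (n_trials, n_samples) baseline-corrected ERP data from the
        electrodes of interest; times: (n_samples,) in seconds."""
        mask = (times >= window[0]) & (times < window[1])
        return epochs[:, mask].mean()

    def case_overlap_effect(matched, mismatched, times):
        # Priming effect = mismatched minus matched mean amplitude. The
        # dissociation reported above is that this difference is absent for
        # word targets but present for pseudoword targets at the N250.
        return mean_amplitude(mismatched, times) - mean_amplitude(matched, times)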

    Cerebral lateralisation of first and second languages in bilinguals assessed using functional transcranial Doppler ultrasound

    Background: Lateralised language processing is a well-established finding in monolinguals. In bilinguals, studies using fMRI have typically found substantial regional overlap between the two languages, though results may be influenced by factors such as proficiency, age of acquisition and exposure to the second language. Few studies have focused specifically on individual differences in brain lateralisation, and those that have suggest that reduced lateralisation may characterise representation of the second language (L2) in some bilingual individuals. Methods: In Study 1, we used functional transcranial Doppler sonography (fTCD) to measure cerebral lateralisation in both languages in high-proficiency bilinguals who varied in age of acquisition (AoA) of L2. They had German (N = 14) or French (N = 10) as their first language (L1) and English as their second language. fTCD was used to measure task-dependent blood flow velocity changes in the left and right middle cerebral arteries during phonological word generation cued by single letters. Language history measures and handedness were assessed through self-report. Study 2 followed a similar format with 25 Japanese (L1)/English (L2) bilinguals, with proficiency in their second language ranging from basic to advanced, using phonological and semantic word generation tasks with overt speech production. Results: In Study 1, participants were significantly left-lateralised for both L1 and L2, with a high correlation (r = .70) in the size of laterality indices (LIs) for L1 and L2. In Study 2, again there was good agreement between LIs for the two languages (r = .77 for both word generation tasks). There was no evidence in either study of an effect of age of acquisition, though the sample sizes were too small to detect any but large effects. Conclusion: In proficient bilinguals, there is strong concordance for cerebral lateralisation of first and second language as assessed by a verbal fluency task.
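
    The concordance reported in the Results can be quantified with a simple correlation of per-participant laterality indices, as in the Python sketch below; the values are illustrative placeholders, not the study's data.

    import numpy as np
    from scipy.stats import pearsonr

    li_l1 = np.array([2.1, 3.4, 1.8, 4.0, 0.5, 2.7, 3.1, 1.2])   # hypothetical L1 LIs
    li_l2 = np.array([1.9, 3.0, 2.2, 3.6, 0.1, 2.4, 3.3, 0.9])   # hypothetical L2 LIs

    r, p = pearsonr(li_l1, li_l2)
    print(f"L1-L2 laterality concordance: r={r:.2f}, p={p:.3f}")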
