
    Speechreading in Deaf Adults with Cochlear Implants: Evidence for Perceptual Compensation

    Previous research has provided evidence for a speechreading advantage in congenitally deaf adults compared to hearing adults. A ‘perceptual compensation’ account of this finding proposes that prolonged early-onset deafness leads to a greater reliance on visual, as opposed to auditory, information when perceiving speech, which in turn results in superior visual speech perception skills in deaf adults. In the current study we tested whether previous demonstrations of a speechreading advantage for profoundly congenitally deaf adults with hearing aids, or no amplification, were also apparent in adults with the same deafness profile but who have experienced greater access to the auditory elements of speech via a cochlear implant (CI). We also tested the prediction that, in line with the perceptual compensation account, receiving a CI at a later age is associated with superior speechreading skills, because later implanted individuals have depended on visual speech information for longer. We designed a speechreading task in which participants viewed silent videos of 123 single words spoken by a model and indicated which word they thought had been said via a free-text response. We compared congenitally deaf adults who had received CIs in childhood or adolescence (N = 15) with a comparison group of hearing adults (N = 15) matched on age and education level. The adults with CIs scored significantly better on the speechreading task than the hearing comparison group. Furthermore, within the group of adults with CIs, there was a significant positive correlation between age at implantation and speechreading performance; earlier implantation was associated with lower speechreading scores. These results are consistent with the hypothesis of perceptual compensation in the domain of speech perception, indicating that more prolonged dependence on visual speech information may lead to improvements in the perception of visual speech. In addition, our study provides metrics of the ‘speechreadability’ of 123 words produced in British English: one derived from hearing adults (N = 61) and one from deaf adults with CIs (N = 15). Evidence for the validity of these ‘speechreadability’ metrics comes from correlations with visual lexical competition data.
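    The abstract describes two straightforward analyses that a short sketch can make concrete: a per-word ‘speechreadability’ metric (the proportion of participants whose free-text response matched the target word) and a correlation between age at implantation and overall speechreading score. The sketch below assumes a hypothetical tidy data file; the file name, column names and aggregation choices are illustrative, not the authors' pipeline.

        # Hypothetical sketch: per-word speechreadability and the
        # age-at-implantation correlation. File and columns are assumed.
        import pandas as pd
        from scipy.stats import pearsonr

        responses = pd.read_csv("speechreading_responses.csv")  # assumed columns:
        # participant, group ('CI' or 'hearing'), word, correct (0/1), age_at_implantation

        # Speechreadability metric: per-word proportion correct, one per group.
        speechreadability = (
            responses.groupby(["group", "word"])["correct"].mean().unstack("group")
        )

        # Within the CI group: age at implantation vs overall speechreading score.
        ci = responses[responses["group"] == "CI"]
        per_participant = ci.groupby("participant").agg(
            score=("correct", "mean"),
            age_impl=("age_at_implantation", "first"),
        )
        r, p = pearsonr(per_participant["age_impl"], per_participant["score"])
        print(f"r = {r:.2f}, p = {p:.3f}")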

    How auditory experience differentially influences the function of left and right superior temporal cortices

    To investigate how hearing status, sign language experience and task demands influence functional responses in the human superior temporal cortices (STC), we collected fMRI data from deaf and hearing participants (male and female), who either acquired sign language early or late in life. Our stimuli in all tasks were pictures of objects. We varied the linguistic and visuospatial processing demands in three different tasks that involved decisions about (1) the sublexical (phonological) structure of the British Sign Language (BSL) signs for the objects; (2) the semantic category of the objects; and (3) the physical features of the objects. Neuroimaging data revealed that in participants who were deaf from birth, STC showed increased activation during visual processing tasks. Importantly, this differed across hemispheres. Right STC was consistently activated regardless of the task, whereas left STC was sensitive to task demands. Significant activation was detected in the left STC only for the BSL phonological task. This task, we argue, placed greater demands on visuospatial processing than the other two tasks. In hearing signers, enhanced activation was absent in both left and right STC during all three tasks. Lateralisation analyses demonstrated that the effect of deafness was more task-dependent in the left than the right STC, whereas it was more task-independent in the right than the left STC. These findings indicate how the absence of auditory input from birth leads to dissociable and altered functions of left and right STC in deaf participants.

    SIGNIFICANCE STATEMENT: Those born deaf can offer unique insights into neuroplasticity, in particular in regions of superior temporal cortex (STC) that primarily respond to auditory input in hearing people. Here we demonstrate that in those deaf from birth the left and the right STC have altered and dissociable functions. The right STC is activated regardless of demands on visual processing. In contrast, the left STC is sensitive to the demands of visuospatial processing. Furthermore, hearing signers, with the same sign language experience as the deaf participants, did not activate the STCs. Our data advance current understanding of neural plasticity by determining the differential effects that hearing status and task demands can have on left and right STC function.
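    The abstract reports lateralisation analyses without giving their form; one common choice is a lateralisation index, LI = (L - R) / (L + R), computed from mean task activation in homologous left and right ROIs. The sketch below is an illustrative assumption along those lines, not the paper's method.

        # Minimal lateralisation-index sketch: positive LI = left-lateralised.
        # The index and the example beta values are assumptions for illustration.
        import numpy as np

        def lateralisation_index(left_betas, right_betas):
            """LI = (L - R) / (L + R) from mean ROI activation estimates."""
            L, R = float(np.mean(left_betas)), float(np.mean(right_betas))
            return (L - R) / (L + R)

        left = np.array([1.2, 0.9, 1.4])   # e.g. left STC betas for one task
        right = np.array([0.7, 1.1, 0.8])  # right STC betas, same task
        print(f"LI = {lateralisation_index(left, right):+.2f}")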

    The impact of early language exposure on the neural system supporting language in deaf and hearing adults

    Deaf late signers provide a unique perspective on the impact of impoverished early language exposure on the neurobiology of language: insights that cannot be gained from research with hearing people alone. Here we contrast the effect of age of sign language acquisition in hearing and congenitally deaf adults to examine the potential impact of impoverished early language exposure on the neural systems supporting a language learnt later in life. We collected fMRI data from deaf and hearing proficient users (N = 52) of British Sign Language (BSL), who learnt BSL either early (native) or late (after the age of 15 years), whilst they watched BSL sentences or strings of meaningless nonsense signs. There was a main effect of age of sign language acquisition (late > early) across deaf and hearing signers in the occipital segment of the left intraparietal sulcus. This finding suggests that late learners of sign language may rely on visual processing more than early learners when processing both linguistic and nonsense sign input, regardless of hearing status. Region-of-interest analyses in the posterior superior temporal cortices (STC) showed an effect of age of sign language acquisition that was specific to deaf signers. In the left posterior STC, activation in response to signed sentences was greater in deaf early signers than in deaf late signers. Importantly, responses in the left posterior STC in hearing early and late signers did not differ, and were similar to those observed in deaf early signers. These data lend further support to the argument that robust early language experience, whether signed or spoken, is necessary for left posterior STC to show a ‘native-like’ response to a later learnt language.
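    As a concrete rendering of the region-of-interest comparison above, a two-sample test on left posterior STC responses to signed sentences in deaf early versus deaf late signers might look like the sketch below; the beta values and the specific test are illustrative assumptions, not the paper's statistics.

        # Illustrative ROI comparison: deaf early vs deaf late signers.
        # Beta values are made up; the study's actual analysis may differ.
        import numpy as np
        from scipy.stats import ttest_ind

        deaf_early = np.array([0.9, 1.1, 1.3, 0.8])  # left pSTC betas, sentences
        deaf_late = np.array([0.4, 0.6, 0.3, 0.5])
        t, p = ttest_ind(deaf_early, deaf_late)
        print(f"t = {t:.2f}, p = {p:.3f}")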

    Sign and speech share partially overlapping conceptual representations

    Conceptual knowledge is fundamental to human cognition. Yet the extent to which it is influenced by language is unclear. Studies of semantic processing show that similar neural patterns are evoked by the same concepts presented in different modalities (e.g. spoken words and pictures or text) [1–3]. This suggests that conceptual representations are ‘modality independent’. However, an alternative possibility is that the similarity reflects retrieval of common spoken language representations. Indeed, in hearing spoken language users, text and spoken language are co-dependent [4,5], and pictures are encoded via visual and verbal routes [6]. A parallel approach, investigating semantic cognition, shows that bilinguals activate similar patterns for the same words in their different languages [7,8]. This suggests that conceptual representations are ‘language independent’. However, this has only been tested in spoken language bilinguals. If different languages evoke different conceptual representations, this should be most apparent when comparing languages that differ greatly in structure. Hearing people with signing deaf parents are bilingual in sign and speech: languages conveyed in different modalities. Here we test the influence of modality and bilingualism on conceptual representation by comparing semantic representations elicited by spoken British English and British Sign Language in hearing early sign–speech bilinguals. We show that representations of semantic categories are shared for sign and speech, but not for individual spoken words and signs. This provides evidence for partially shared representations for sign and speech, and shows that language acts as a subtle filter through which we understand and interact with the world.
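    The comparison described above follows representational-similarity logic: build a dissimilarity matrix over items from the neural patterns evoked in each language, then ask whether the two matrices share structure. The sketch below, with random stand-in patterns and an assumed Spearman comparison, illustrates that logic only; it is not the paper's pipeline.

        # RSA-style sketch: correlate item-by-item dissimilarity structure
        # across speech and sign. Patterns here are random placeholders.
        import numpy as np
        from scipy.spatial.distance import pdist
        from scipy.stats import spearmanr

        rng = np.random.default_rng(0)
        n_items, n_voxels = 20, 100
        speech = rng.normal(size=(n_items, n_voxels))  # stand-in fMRI patterns
        sign = rng.normal(size=(n_items, n_voxels))

        # Condensed RDMs: 1 - Pearson r between every pair of item patterns.
        rdm_speech = pdist(speech, metric="correlation")
        rdm_sign = pdist(sign, metric="correlation")

        rho, p = spearmanr(rdm_speech, rdm_sign)
        print(f"speech-sign RDM agreement: rho = {rho:.2f}, p = {p:.3f}")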

    Cochlear implantation (CI) for prelingual deafness: the relevance of studies of brain organization and the role of first language acquisition in considering outcome success.

    Cochlear implantation (CI) for profound congenital hearing impairment, while often successful in restoring hearing to the deaf child, does not always result in effective speech processing. Exposure to non-auditory signals during the pre-implantation period is widely held to be responsible for such failures. Here, we question the inference that such exposure irreparably distorts the function of auditory cortex, negatively impacting the efficacy of CI. Animal studies suggest that in congenital early deafness there is a disconnection between (disordered) activation in primary auditory cortex (A1) and activation in secondary auditory cortex (A2). In humans, one factor contributing to this functional decoupling is assumed to be abnormal activation of A1 by visual projections, including exposure to sign language. In this paper we show that this abnormal activation of A1 does not routinely occur, while A2 functions effectively supramodally and multimodally to deliver spoken language irrespective of hearing status. What, then, is responsible for poor outcomes for some individuals with CI, and for apparent abnormalities in cortical organization in these people? Since infancy is a critical period for the acquisition of language, deaf children born to hearing parents are at risk of developing inefficient neural structures to support skilled language processing. A sign language, acquired by a deaf child as a first language in a signing environment, is cortically organized like a heard spoken language in terms of specialization of the dominant perisylvian system. However, very few deaf children are exposed to sign language in early infancy. Moreover, no studies to date have examined sign language proficiency in relation to cortical organization in individuals with CI. Given the paucity of such relevant findings, we suggest that the best guarantee of a good language outcome after CI is the establishment of a secure first language pre-implant, however that may be achieved, and whatever the success of auditory restoration.

    Triangleland. I. Classical dynamics with exchange of relative angular momentum

    In Euclidean relational particle mechanics, only relative times, relative angles and relative separations are meaningful. Barbour–Bertotti (1982) theory is of this form and can be viewed as a recovery of (a portion of) Newtonian mechanics from relational premises. This is of interest in the absolute versus relative motion debate, and it also shares a number of features with the geometrodynamical formulation of general relativity, making it suitable for some modelling of the problem of time in quantum gravity. I also study similarity relational particle mechanics (‘dynamics of pure shape’), in which only relative times, relative angles and ratios of relative separations are meaningful. I consider this first because it is simpler, particularly in 1 and 2 dimensions, for which the configuration space geometry turns out to be well known, e.g. S^2 for the ‘triangleland’ (3-particle) case that I consider in detail. Second, the similarity model occurs as a sub-model within the Euclidean model, which admits a shape–scale split. For harmonic-oscillator-like potentials, the similarity triangleland model turns out to have the same mathematics as a family of rigid rotor problems, while the Euclidean case turns out to have parallels with the Kepler–Coulomb problem in spherical and parabolic coordinates. Previous work on relational mechanics covered cases in which the constituent subsystems do not exchange relative angular momentum, a simplifying (but in some ways undesirable) feature paralleling centrality in ordinary mechanics. In this paper I lift this restriction. In each case I reduce the relational problem to a standard one, obtain various exact, asymptotic and numerical solutions, and then recast these into the original mechanical variables for physical interpretation.
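    The ‘triangleland’ shape space the abstract mentions can be made explicit with a standard construction; conventions vary between papers, so the following is a sketch under common conventions rather than a quotation from the article. For three particles with positions q_1, q_2, q_3 in two dimensions, form relative Jacobi coordinates (mass-weighting factors omitted for brevity):

        \[
          \rho_1 = q_2 - q_1 , \qquad
          \rho_2 = q_3 - \frac{m_1 q_1 + m_2 q_2}{m_1 + m_2} .
        \]
        % Under similarity transformations only a ratio of separations and a
        % relative angle survive as shape coordinates:
        \[
          \mathcal{R} = \frac{|\rho_2|}{|\rho_1|} , \qquad
          \Phi = \arccos\!\left( \frac{\rho_1 \cdot \rho_2}{|\rho_1|\,|\rho_2|} \right) ,
        \]
        % and these coordinatize the shape space $\mathbb{CP}^1 \cong S^2$,
        % e.g. with polar angle $\Theta = 2\arctan\mathcal{R}$ and azimuth $\Phi$.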

    Does congenital deafness affect the structural and functional architecture of primary visual cortex?

    Deafness results in greater reliance on the remaining senses. It is unknown whether the cortical architecture of the intact senses is optimized to compensate for lost input. Here we performed widefield population receptive field (pRF) mapping of primary visual cortex (V1) with functional magnetic resonance imaging (fMRI) in hearing and congenitally deaf participants, all of whom had learnt sign language after the age of 10 years. We found larger pRFs encoding the peripheral visual field in deaf compared to hearing participants. This was likely driven by larger facilitatory center zones of the pRF profile, concentrated in the near and far periphery in the deaf group. pRF density was comparable between groups; given the larger pRF sizes, this indicates that pRFs overlapped more in the deaf group. This could suggest that a coarse coding strategy underlies enhanced peripheral visual skills in deaf people. Cortical thickness in V1 was also decreased in the deaf group. These findings suggest that deafness causes structural and functional plasticity at the earliest stages of visual cortex.
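    The ‘facilitatory center zone’ language above suggests a center–surround pRF profile, which a difference-of-Gaussians model is one common way to capture. The sketch below, with illustrative parameters, shows the shape of such a profile; it is not the paper's fitted model.

        # Difference-of-Gaussians pRF sketch: a facilitatory Gaussian center
        # minus a broader suppressive surround. All parameter values are
        # illustrative, not taken from the study.
        import numpy as np

        def dog_prf(x, y, x0, y0, sigma_c, sigma_s, amp_c=1.0, amp_s=0.5):
            """Evaluate a center-surround pRF on visual-field coords (x, y)."""
            d2 = (x - x0) ** 2 + (y - y0) ** 2
            center = amp_c * np.exp(-d2 / (2 * sigma_c ** 2))
            surround = amp_s * np.exp(-d2 / (2 * sigma_s ** 2))
            return center - surround

        # A peripheral pRF at 30 deg eccentricity with a 4 deg facilitatory
        # center and an 8 deg surround (made-up values).
        x, y = np.meshgrid(np.linspace(-40, 40, 201), np.linspace(-40, 40, 201))
        profile = dog_prf(x, y, x0=30.0, y0=0.0, sigma_c=4.0, sigma_s=8.0)
        print(f"peak response: {profile.max():.3f}")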