Neural systems underlying spatial language in American Sign Language

Abstract

A [15O]water PET experiment was conducted to investigate the neural regions engaged in processing constructions unique to signed languages: classifier predicates in which the position of the hands in signing space schematically represents spatial relations among objects. Ten deaf native signers viewed line drawings depicting a spatial relation between two objects (e.g., a cup on a table) and were asked either to produce a classifier construction or an American Sign Language (ASL) preposition that described the spatial relation or to name the figure object (colored red). Compared to naming objects, describing spatial relationships with classifier constructions engaged the supramarginal gyrus (SMG) within both hemispheres. Compared to naming objects, naming spatial relations with ASL prepositions engaged only the right SMG. Previous research indicates that retrieval of English prepositions engages both right and left SMG, but more inferiorly than for ASL classifier constructions. Compared to ASL prepositions, naming spatial relations with classifier constructions engaged left inferior temporal (IT) cortex, a region activated when naming concrete objects in either ASL or English. Left IT may be engaged because the handshapes in classifier constructions encode information about object type (e.g., flat surface). Overall, the results suggest more right hemisphere involvement when expressing spatial relations in ASL, perhaps because signing space is used to encode the spatial relationship between objects. © 2002 Elsevier Science (USA)
