Sex Differences in Language First Appear in Gesture
Children differ in how quickly they reach linguistic milestones. Boys typically produce their first multi-word sentences later than girls do. We ask here whether there are sex differences in children's gestures that precede, and presage, these sex differences in speech. To explore this question, we observed 22 girls and 18 boys every 4 months as they progressed from one-word speech to multi-word speech. We found that boys not only produced speech + speech (S+S) combinations ("drink juice") 3 months later than girls, but they also produced gesture + speech (G+S) combinations expressing the same types of semantic relations ("eat" + point at cookie) 3 months later than girls. Because G+S combinations are produced earlier than S+S combinations, children's gestures provide the first sign that boys are likely to lag behind girls in the onset of sentence constructions.
Differences in Early Gesture Explain SES Disparities in Child Vocabulary Size at School Entry
Children from low-socioeconomic status (SES) families, on average, arrive at school with smaller vocabularies than children from high-SES families. In an effort to identify precursors to, and possible remedies for, this inequality, we videotaped 50 children from families with a range of different SES interacting with parents at 14 months and assessed their vocabulary skills at 54 months. We found that children from high-SES families frequently used gesture to communicate at 14 months, a relation that was explained by parent gesture use (with speech controlled). In turn, the fact that children from high-SES families have large vocabularies at 54 months was explained by children's gesture use at 14 months. Thus, differences in early gesture help to explain the disparities in vocabulary that children bring with them to school.
Early gesture selectively predicts later language learning
The gestures children produce predict the early stages of spoken language development. Here we ask whether gesture is a global predictor of language learning, or whether particular gestures predict particular language outcomes. We observed 52 children interacting with their caregivers at home, and found that gesture use at 18 months selectively predicted lexical versus syntactic skills at 42 months, even with early child speech controlled. Specifically, number of different meanings conveyed in gesture at 18 months predicted vocabulary at 42 months, but number of gesture+speech combinations did not. In contrast, number of gesture+speech combinations, particularly those conveying sentence-like ideas, produced at 18 months predicted sentence complexity at 42 months, but meanings conveyed in gesture did not. We can thus predict particular milestones in vocabulary and sentence complexity at age 3 and a half by watching how children move their hands two years earlier.
Regularization of word order in the verb phrase differs from the noun phrase: Evidence from an online silent gesture perception paradigm
Prior work has shown a "natural" preference in the Verb Phrase for direct object Nouns to linearly precede the Verb. There is also evidence of a "natural" preference in the Noun Phrase to order Nouns before Adjectives. Given this, we asked how domain-general biases like regularization and language-specific biases like the preference for "natural" orders could jointly contribute to the emergence of these two common word orders cross-linguistically. Using a silent gesture paradigm (in which we presented iconic gestures without speech), we exposed different participants to competing Verb Phrase (NounVerb vs. VerbNoun) and Noun Phrase (NounAdj vs. AdjNoun) word orders at varying frequencies. In Noun Phrase contrast conditions, we found that regularization was greatest when the domain-general bias towards regularization and the linguistic bias to order Nouns before Adjectives were aligned. In Verb Phrase conditions, participants regularized to the same extent regardless of input: They opted for greater regularity, even at the expense of aligning with underlying word order biases. We discuss the implications of our work for understanding the effects of domain-general biases on language.
Do Iconic Gestures Pave the Way for Children's Early Verbs?
Children produce a deictic gesture for a particular object (point at dog) approximately 3 months before they produce the verbal label for that object ("dog"; Iverson & Goldin-Meadow, 2005). Gesture thus paves the way for children's early nouns. We ask here whether the same pattern of gesture preceding and predicting speech holds for iconic gestures. In other words, do gestures that depict actions precede and predict early verbs? We observed spontaneous speech and gestures produced by 40 children (22 girls, 18 boys) from age 14 to 34 months. Children produced their first iconic gestures 6 months later than they produced their first verbs. Thus, unlike the onset of deictic gestures, the onset of iconic gestures conveying action meanings followed, rather than preceded, children's first verbs. However, iconic gestures increased in frequency at the same time as verbs did and, at that time, began to convey meanings not yet expressed in speech. Our findings suggest that children can use gesture to expand their repertoire of action meanings, but only after they have begun to acquire the verb system underlying their language.
Assessing knowledge conveyed in gesture: Do teachers have the upper hand?
Children's gestures can reveal important information about their problem-solving strategies. This study investigated whether the information children express only in gesture is accessible to adults not trained in gesture coding. Twenty teachers and 20 undergraduates viewed videotaped vignettes of 12 children explaining their solutions to equations. Six children expressed the same strategy in speech and gesture, and 6 expressed different strategies. After each vignette, adults described the child's reasoning. For children who expressed different strategies in speech and gesture, both teachers and undergraduates frequently described strategies that children had not expressed in speech. These additional strategies could often be traced to the children's gestures. Sensitivity to gesture was comparable for teachers and undergraduates. Thus, even without training, adults glean information not only from children's words but also from their hands.
Gesturing with an Injured Brain: How Gesture Helps Children with Early Brain Injury Learn Linguistic Constructions
Children with pre/perinatal unilateral brain lesions (PL) show remarkable plasticity for language development. Is this plasticity characterized by the same developmental trajectory that characterizes typically developing (TD) children, with gesture leading the way into speech? We explored this question, comparing eleven children with PL, matched to thirty TD children on expressive vocabulary, in the second year of life. Children with PL showed similarities to TD children for simple but not complex sentence types. Children with PL produced simple sentences across gesture and speech several months before producing them entirely in speech, exhibiting parallel delays in both gesture+speech and speech-alone. However, unlike TD children, children with PL produced complex sentence types first in speech-alone. Overall, the gesture-speech system appears to be a robust feature of language learning for simple, but not complex, sentence constructions, acting as a harbinger of change in language development even when that language is developing in an injured brain.
Gesture production and comprehension in children with specific language impairment
Children with specific language impairment (SLI) have difficulties with spoken language. However, some recent research suggests that these impairments reflect underlying cognitive limitations. Studying gesture may inform us clinically and theoretically about the nature of the association between language and cognition. A total of 20 children with SLI and 19 typically developing (TD) peers were assessed on a novel measure of gesture production. Children were also assessed for sentence comprehension errors in a speech-gesture integration task. Children with SLI performed as well as their peers on gesture production but performed less well when comprehending integrated speech and gesture. Error patterns revealed a significant group interaction: children with SLI made more gesture-based errors, whilst TD children made semantically based ones. Children with SLI accessed and produced lexically encoded gestures despite having impaired spoken vocabulary, and this group also showed stronger associations between gesture and language than TD children. When comprehension breaks down, children with SLI may rely on gesture over speech, whilst TD children prefer spoken cues. The findings suggest that for children with SLI, gesture scaffolds remain more closely tied to language development than for TD peers, who have outgrown their earlier reliance on gestures. Future clinical implications may include standardized assessment of symbolic gesture and classroom-based gesture support for clinical groups.