37 research outputs found

    How hand movements and speech tip the balance in cognitive development: A story about children, complexity, coordination, and affordances

    When someone asks us to explain something, such as how a lever or balance scale works, we spontaneously move our hands and gesture. This is also true for children. Furthermore, children use their hands to discover things and to find out how something works. Previous research has shown that children’s hand movements are thereby ahead of speech and play a leading role in cognitive development. Explanations for this assumed that cognitive understanding takes place in one’s head, and that hand movements and speech (only) reflect it. However, cognitive understanding arises from, and consists of, the constant interplay between (hand) movements, speech, and one’s physical and social environment. The physical environment includes, for example, task properties; the social environment includes other people. I therefore focused on this constant interplay between hand movements, speech, and the environment, to better understand the role of hand movements in cognitive development. Using science and technology tasks, we found that children’s speech affects their hand movements more than the other way around. During difficult tasks, the coupling between hand movements and speech becomes even stronger than during easy tasks. Interim changes in task properties affect hand movements and speech differently. Collaborating children coordinate their hand movements, speech, and even their head movements with each other. The coupling between hand movements and speech is related to age and (school) performance. It is important that teachers attend to children’s hand movements and speech, and arrange their lessons and classrooms such that there is room for both.

    Movers and shakers of cognition: Hand movements, speech, task properties, and variability

    Children move their hands to explore, learn, and communicate about hands-on tasks. Their hand movements seem to be “learning” ahead of speech. Children shape their hand movements in accordance with spatial and temporal task properties, for example when they feel an object or simulate its movements. Their speech, however, does not directly correspond to these spatial and temporal task properties. We aimed to understand whether and how hand movements lead cognitive development through their ability to correspond to spatiotemporal task properties, while speech is unable to do so. We explored whether the variability of hand movements and speech changed with a change in spatiotemporal task properties, using two variability measures: Diversity indicates adaptation, while Complexity indicates the flexibility to adapt. In two experiments, we asked children (4–7 years) to make predictions and give explanations about balance scale problems, whereby we manipulated either the length of the balance scale or the mass of the weights after half of the trials. In three out of four conditions, we found a change in Complexity for both hand movements and speech between the first and second half of the task. In one of these conditions, we found a relation between the differences in Complexity and Diversity of hand movements and speech. Changes in spatiotemporal task properties thus often influenced the flexibility of both hand movements and speech, but there seem to be differences in how they did so. We provide many directions for future research to further unravel the relations between hand movements, speech, task properties, variability, and cognitive development.
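The abstract does not define how Diversity was computed. As an illustration only, a common way to quantify the spread of coded behavior within a task half is Shannon entropy over the behavior categories; the behavior codes below are hypothetical, and the thesis's exact Diversity and Complexity measures may differ.

```python
# Illustrative variability measure: Shannon entropy over coded behaviors.
# This is a stand-in sketch, not necessarily the thesis's exact measure.
from collections import Counter
from math import log2

def shannon_entropy(codes):
    """Entropy (bits) of the distribution of behavior codes."""
    counts = Counter(codes)
    n = len(codes)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Hypothetical gesture codes before and after a mid-task property change
first_half  = ["point", "point", "trace", "hold"]
second_half = ["point", "trace", "hold", "weigh"]

print(shannon_entropy(first_half))   # 1.5: behavior concentrated on fewer codes
print(shannon_entropy(second_half))  # 2.0: behavior spread over all codes
```

A rise in entropy after the manipulation would indicate that the child's behavioral repertoire became more varied, one plausible reading of "adaptation" here.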

    Asymmetric coupling between gestures and speech during reasoning

    When children learn, insights displayed in gestures typically precede insights displayed in speech. In this study, we investigated how this leading role of gestures in cognitive development is evident in (and emerges from) the dynamic coupling between gestures and speech during a single task. We investigated 12 children (mean age = 5.4 years) from Kindergarten and first grade who performed an air pressure task. Children’s gestures and speech were coded from video recordings, and levels of reasoning, based on Skill Theory, were assigned. To analyze the dynamic coupling between gestures and speech, Cross-Recurrence Quantification Analysis was performed on the two coupled time series. We found gestures to be ahead of speech for children in Kindergarten, but speech and gestures were more temporally aligned for first graders. Furthermore, we found speech to affect gestures more than vice versa for all children, but the degree of this asymmetry of bidirectional regulation differed. In Kindergarten, a higher score on language tests was related to more asymmetry between gestures and speech, while for first graders this relation was present for higher, within-task, levels of understanding. A more balanced, i.e., less asymmetric, coupling between gestures and speech was, however, related to a higher score on math and past tasks. Our findings suggest that the relation between gestures, speech, and cognitive development is more subtle than previously thought. Specifically, the nature of the coupling between gestures and speech not only expresses but might also predict learning differences between children, both within and across learning domains. We hope our study will foster future research on learning as a dynamic, embodied, and embedded process.
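The core of categorical cross-recurrence analysis can be sketched compactly: two coded time series yield a binary matrix marking where the codes match, and the balance of recurrent points above versus below the main diagonal hints at which series tends to lead. This is a minimal sketch with hypothetical coded levels; the study's actual CRQA pipeline (windowing, delays, measures such as determinism) is richer than shown here.

```python
# Minimal cross-recurrence sketch for two categorical time series
# (e.g., gesture and speech reasoning levels per time bin).
# Data and the asymmetry measure are illustrative, not the study's exact method.
import numpy as np

def cross_recurrence(x, y):
    """Binary cross-recurrence matrix: R[i, j] = 1 iff x[i] == y[j]."""
    x, y = np.asarray(x), np.asarray(y)
    return (x[:, None] == y[None, :]).astype(int)

def recurrence_rate(R):
    """Fraction of recurrent points in the matrix."""
    return R.mean()

def leading_asymmetry(R):
    """Share of recurrent points above vs. below the main diagonal.
    Positive -> states in x tend to precede matching states in y."""
    upper = np.triu(R, k=1).sum()   # x earlier than y
    lower = np.tril(R, k=-1).sum()  # y earlier than x
    total = upper + lower
    return (upper - lower) / total if total else 0.0

# Hypothetical coded levels: gestures reach each level before speech does
gestures = [1, 1, 2, 2, 3, 3, 3, 4]
speech   = [1, 1, 1, 2, 2, 3, 3, 3]

R = cross_recurrence(gestures, speech)
print(round(recurrence_rate(R), 3))    # 0.297
print(round(leading_asymmetry(R), 3))  # 0.714 (gestures lead in this toy data)
```

In this toy data the asymmetry is positive because gestures reach each reasoning level before speech does, mirroring the "gestures ahead of speech" pattern the abstract describes for Kindergartners.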