Behavior matching in multimodal communication is synchronized

Abstract

A variety of theoretical frameworks predict the resemblance of behaviors between two people engaged in communication, in the form of coordination, mimicry, or alignment. However, little is known about the time course of the behavior matching, even though there is evidence that dyads synchronize oscillatory motions (e.g., postural sway). This study examined the temporal structure of nonoscillatory actions—language, facial, and gestural behaviors—produced during a route communication task. The focus was the temporal relationship between matching behaviors in the interlocutors (e.g., facial behavior in one interlocutor vs. the same facial behavior in the other interlocutor). Cross-recurrence analysis revealed that within each category tested (language, facial, gestural), interlocutors synchronized matching behaviors, at temporal lags short enough to provide imitation of one interlocutor by the other, from one conversational turn to the next. Both social and cognitive variables predicted the degree of temporal organization. These findings suggest that the temporal structure of matching behaviors provides low-level and low-cost resources for human interaction.
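The method named in the abstract, cross-recurrence analysis, asks how often a matching behavior co-occurs across the two interlocutors at different temporal lags. The sketch below is a minimal illustration of that idea, not the authors' implementation: it assumes each behavior has been coded as a binary time series per interlocutor (1 = behavior present at that time step), and all function names and data are hypothetical.

```python
import numpy as np

def cross_recurrence_profile(series_a, series_b, max_lag):
    """Proportion of time steps at which both interlocutors show the
    behavior, for each lag of series_b relative to series_a.
    Minimal sketch of a diagonal cross-recurrence profile."""
    a = np.asarray(series_a, dtype=bool)
    b = np.asarray(series_b, dtype=bool)
    profile = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            # b shifted forward in time relative to a
            overlap = a[: len(a) - lag] & b[lag:]
        else:
            # b shifted backward in time relative to a
            overlap = a[-lag:] & b[: len(b) + lag]
        profile[lag] = float(overlap.mean()) if overlap.size else 0.0
    return profile

# Hypothetical example: two short behavior streams, one sample per second.
speaker  = [1, 1, 0, 0, 1, 0, 0, 0, 1, 1]
listener = [0, 1, 1, 0, 0, 1, 0, 0, 0, 1]
profile = cross_recurrence_profile(speaker, listener, max_lag=3)
peak_lag = max(profile, key=profile.get)
print(profile)
print("Peak recurrence at lag", peak_lag)
```

A peak in the profile at a short nonzero lag would indicate that one interlocutor's behavior tends to follow the other's after a brief delay, which is the kind of pattern the abstract describes as consistent with imitation from one conversational turn to the next.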
