
    Three People Can Synchronize as Coupled Oscillators during Sports Activities

    We experimentally investigated the synchronized patterns of three people during sports activities and found that the activity corresponded to spatiotemporal patterns in rings of coupled biological oscillators derived from symmetric Hopf bifurcation theory, which is based on group theory. This theory provides catalogs of possible generic spatiotemporal patterns irrespective of the oscillators' internal models; the patterns follow solely from the geometrical symmetries of the systems. We predicted the synchronization patterns of rings of three coupled oscillators as trajectories on the phase plane. The interactions among three people during a 3 vs. 1 ball possession task were plotted on the phase plane. We then demonstrated that two patterns conformed to two of the three patterns predicted by the theory. One was a rotation pattern (R), in which the phase differences between adjacent oscillators were almost 2π/3. The other was a partial anti-phase pattern (PA), in which two oscillators were in anti-phase and the third oscillator's oscillation was suppressed. These results suggest that symmetric Hopf bifurcation theory can be used to understand synchronization phenomena among three people who communicate via perceptual information, not just physically connected systems such as slime molds, chemical reactions, and animal gaits. In addition, skill level in human synchronization may play the role of the bifurcation parameter.

    The Virtual Teacher (VT) Paradigm: Learning New Patterns of Interpersonal Coordination Using the Human Dynamic Clamp

    The Virtual Teacher paradigm, a version of the Human Dynamic Clamp (HDC), is introduced into studies of learning patterns of interpersonal coordination. Combining mathematical modeling and experimentation, we investigate how the HDC may be used as a Virtual Teacher (VT) to help humans co-produce and internalize new interpersonal coordination patterns. Human learners produced rhythmic finger movements whilst observing a computer-driven avatar, animated by dynamic equations stemming from the well-established Haken-Kelso-Bunz (1985) and Schöner-Kelso (1988) models of coordination. We demonstrate that the VT successfully shifts the coordination pattern co-produced by the VT-human system toward any desired value (Experiment 1) and that the VT can help humans learn unstable relative phasing patterns (Experiment 2). Using transfer entropy, we find that information flow from one partner to the other increases when VT-human coordination loses stability. This suggests that variable joint performance may actually facilitate interaction and, in the long run, learning. The VT appears to be a promising tool for exploring basic learning processes involved in social interaction, unraveling the dynamics of information flow between interacting partners, and providing possible rehabilitation opportunities.
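    The Haken-Kelso-Bunz model mentioned above has a well-known reduced form for the relative phase φ between two rhythmic movements: dφ/dt = -a·sin(φ) - 2b·sin(2φ). The sketch below integrates this equation with illustrative parameter values (not those of the study) to show the model's signature behaviour: when b/a drops below 0.25, anti-phase coordination (φ = π) loses stability and the system settles into in-phase coordination (φ = 0).

```python
import math

# Reduced HKB relative-phase equation: d(phi)/dt = -a*sin(phi) - 2*b*sin(2*phi).
# Parameter values are illustrative only. With b/a < 0.25 the anti-phase
# fixed point (phi = pi) is unstable and relative phase flows to in-phase.

def hkb_relative_phase(phi0, a=1.0, b=0.1, dt=0.01, steps=5000):
    """Forward-Euler integration of the HKB relative-phase dynamics."""
    phi = phi0
    for _ in range(steps):
        phi += dt * (-a * math.sin(phi) - 2 * b * math.sin(2 * phi))
    return phi

# Start just off anti-phase; with b/a = 0.1 the system escapes pi
# and converges to the stable in-phase pattern at phi = 0.
phi_end = hkb_relative_phase(math.pi - 0.1)
print(phi_end)  # close to 0: in-phase coordination
```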

    The joint flanker effect: sharing tasks with real and imagined co-actors

    The Eriksen flanker task (Eriksen and Eriksen in Percept Psychophys 16:143-149, 1974) was distributed among pairs of participants to investigate whether individuals take into account a co-actor's S-R mapping even when coordination is not required. Participants responded to target letters (Experiment 1) or colors (Experiment 2) surrounded by distractors. When performing their part of the task next to another person performing the complementary part, participants responded more slowly to stimuli containing flankers that were potential targets for their co-actor (incompatible trials) than to stimuli containing identical, compatible, or neutral flankers. This joint flanker effect also occurred when participants merely believed they were performing the task with a co-actor (Experiment 3). Furthermore, Experiment 4 demonstrated that people form shared task representations only when they perceive their co-actor as intentionally controlling her actions. These findings substantiate and generalize earlier results on shared task representations and advance our understanding of the basic mechanisms subserving joint action.

    A Cognitive Neuroscience Perspective On Bimanual Coordination And Interference

    We argue that bimanual coordination and interference depend critically on how these actions are represented at a cognitive level. We first review the literature on spatial interactions, focusing on the difference between movements directed at visual targets and movements cued symbolically. Interactions that manifest during response planning are limited to the latter condition. These results suggest that interactions in the formation of the trajectories of the two hands are associated with processes involved in response selection rather than with interactions in the motor system. Neuropsychological studies involving callosotomy patients indicate that these interactions arise from transcallosal interactions between cortically based spatial codes. The second half of the chapter examines temporal constraints observed in bimanual movements. We propose that most bimanual movements are marked by a common event structure, an explicit representation that ensures temporal coordination of the movements. The translation of an abstract event structure into a movement with a particular timing pattern is associated with cerebellar function, although the resulting temporal coupling during bimanual movements may be due to the operation of other subcortical mechanisms. For rhythmic movements that do not entail an event structure, timing may be an emergent property; under such conditions, both spatial and temporal coupling can be absent. The emphasis on abstract levels of constraint makes clear that limitations in bimanual coordination overlap to a considerable degree with those observed in other domains of cognition.

    Segregated audio–tactile events destabilize the bimanual coordination of distinct rhythms

    We examined to what extent the CNS can efficiently bind the perception of non-coincident multimodal events to coordinated movements. To do so, we selected a bimanual coordination task with left–right asymmetry, namely 3:2 polyrhythmic movement. We asked participants to synchronize left- and right-finger movements to events presented, respectively, to the left and to the right side. In two segregated conditions, sound was presented on one side at one frequency while touch was presented on the other side at the other frequency; thus, the left and right rhythms were each paced via a distinct sensory modality. In the three control conditions, the stimuli on both sides were presented via the same sensory modality: sound, touch, or coincident sound and touch. Our aim was to contrast two opposing hypotheses: sensory segregated pacing (1) stabilizes polyrhythmic coordination because it favors the distinction between the fast and the slow rhythm, versus (2) destabilizes polyrhythmic coordination because it introduces a very strong asymmetry. We performed a parametric study in which the ability to maintain the polyrhythmic coordination was explored over a broad range of pacing rates. We found that switches from the polyrhythmic coordination to an isofrequency pattern took place only in the sensory segregated conditions, at the highest frequencies. Moreover, transitions were preceded by an increase in the variability of the synchronization of movement to stimuli. We therefore propose that the destabilization originating from the asymmetry between sensory modalities overrides the assumed segregation effect. We discuss the possible neuronal underpinnings of this failure to bind movement to segregated sound and touch.
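    The "non-coincident" structure of 3:2 pacing can be made concrete with a short sketch. The cycle duration and event counts below are illustrative assumptions, not the study's actual stimulus parameters: within one cycle, the fast side receives three equally spaced events and the slow side two, so the two pacing streams coincide only once per cycle, at the cycle boundary.

```python
# Illustrative sketch (not the study's stimulus code): onset times for
# 3:2 polyrhythmic pacing. The fast stream gets 3 events per cycle, the
# slow stream 2; they coincide only at cycle boundaries.

def pacing_onsets(cycle_s=1.2, n_cycles=4):
    """Return (fast, slow) event-onset times in seconds."""
    fast = [c * cycle_s + k * cycle_s / 3
            for c in range(n_cycles) for k in range(3)]
    slow = [c * cycle_s + k * cycle_s / 2
            for c in range(n_cycles) for k in range(2)]
    return fast, slow

fast, slow = pacing_onsets()
coincident = sorted(set(round(t, 9) for t in fast)
                    & set(round(t, 9) for t in slow))
print(coincident)  # one coincidence per cycle: t = 0.0, 1.2, 2.4, 3.6
```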