
    Behavioral Impact of Unisensory and Multisensory Audio-Tactile Events: Pros and Cons for Interlimb Coordination in Juggling

    Recent behavioral neuroscience research has revealed that elementary reactive behavior can be improved by cross-modal sensory interactions, thanks to underlying multisensory integration mechanisms. Can this benefit be generalized to the ongoing coordination of movements under severe physical constraints? We chose a juggling task to examine this question. A central issue, well known in juggling, lies in establishing and maintaining a specific temporal coordination among balls, hands, eyes and posture. Here, we tested whether providing additional timing information about ball and hand motions, by using external auditory and tactile periodic stimulations, the latter presented at the wrists, improved the behavior of jugglers. One specific combination of auditory and tactile metronomes led to a decrease in the spatiotemporal variability of the jugglers' performance: a simple sound associated with left and right tactile cues presented in antiphase to each other, which corresponded to the temporal pattern of hand movements in the juggling task. A contrario, no improvement was obtained with the other auditory and tactile combinations. We even found degraded performance when tactile events were presented alone. The nervous system thus appears able to integrate efficiently environmental information brought by different sensory modalities, but only if the specified information matches specific features of the coordination pattern. We discuss the possible implications of these results for understanding the neuronal integration process involved in audio-tactile interaction in the context of complex voluntary movement, also considering the well-known gating effect of movement on vibrotactile perception.
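
    The "spatiotemporal variability" at stake here can be illustrated with circular statistics on event timing relative to a periodic metronome. The sketch below is a generic method, not the paper's analysis pipeline; the function name and the data are hypothetical, and it assumes throw onsets in seconds and a known metronome period.

```python
import numpy as np

def circular_timing_variability(event_times, period):
    """Circular SD (radians) of event onsets relative to a metronome.

    Lower values mean more consistent timing of, e.g., throws with
    respect to the periodic stimulation. Hypothetical helper, not the
    paper's analysis code.
    """
    phases = 2 * np.pi * (np.asarray(event_times) % period) / period
    # Mean resultant length R: 1 = perfectly consistent, 0 = uniform timing.
    R = np.abs(np.mean(np.exp(1j * phases)))
    return np.sqrt(-2 * np.log(R))

# Example: throws roughly every 0.5 s with 20 ms of jitter.
rng = np.random.default_rng(0)
throws = np.arange(0, 30, 0.5) + rng.normal(0.0, 0.02, 60)
print(circular_timing_variability(throws, period=0.5))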

    Segregated audio–tactile events destabilize the bimanual coordination of distinct rhythms

    We examined to what extent the CNS can efficiently bind the perception of non-coincident multimodal events to coordinated movements. To do so, we selected a bimanual coordination task with left–right asymmetry, namely achieving 3:2 polyrhythmic movements. We asked participants to synchronize left and right finger movements to events presented, respectively, to the left and to the right side. In two segregated conditions, sound was presented on one side at one frequency while touch was presented on the other side at the other frequency; thus, the left and right rhythms were each paced via a distinct sensory modality. In the three control conditions, the stimuli on both sides were presented via the same sensory modality: sound, touch, or coincident sound and touch. Our aim was to contrast two opposing hypotheses: sensory segregated pacing (1) stabilizes polyrhythmic coordination because it favors the distinction between the fast and the slow rhythm, versus (2) destabilizes polyrhythmic coordination because it introduces a very strong asymmetry. We performed a parametric study in which the ability to maintain the polyrhythmic coordination was explored over a broad range of pacing rates. We found that switches from the polyrhythmic coordination to an isofrequency pattern took place only in the sensory segregated conditions, at the highest frequencies. Moreover, transitions were preceded by an increase in the variability of the synchronization of movement to stimuli. We therefore propose that the destabilization originating from the asymmetry between sensory modalities overrides the assumed segregation effect. We discuss the possible neuronal underpinnings of this failure to bind movement to segregated sound and touch.
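
    As a concrete illustration of the pacing structure, the following sketch generates onset times for the fast and slow streams of an n:m polyrhythm; in the segregated conditions, each stream would be delivered via a different modality (e.g., sound on one side, touch on the other). The function and its parameters are illustrative, not taken from the study.

```python
import numpy as np

def polyrhythm_onsets(base_freq_hz, duration_s, ratio=(3, 2)):
    """Onset times (s) for the fast and slow streams of an n:m polyrhythm.

    In a 3:2 pattern, the fast stream beats at 3*f and the slow one at
    2*f, where f is the cycle (measure) frequency.
    """
    n, m = ratio
    fast = np.arange(0, duration_s, 1.0 / (n * base_freq_hz))
    slow = np.arange(0, duration_s, 1.0 / (m * base_freq_hz))
    return fast, slow

fast, slow = polyrhythm_onsets(base_freq_hz=0.5, duration_s=10)
print(fast[:4])  # three beats per 2-s cycle
print(slow[:4])  # two beats per 2-s cycle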

    Perceptuo-motor compatibility governs multisensory integration in bimanual coordination dynamics

    The brain has the remarkable ability to bind inputs of different sensory origin into a coherent percept. Behavioral benefits can result from this ability; e.g., a person typically responds faster and more accurately to cross-modal stimuli than to unimodal stimuli. To date, however, it is largely unknown whether such multisensory benefits, shown for discrete reactive behaviors, generalize to the continuous coordination of movements. The present study addressed multisensory integration from the perspective of bimanual coordination dynamics, where perceptual activity no longer triggers a single response but continuously guides the motor action. The task consisted of coordinating the continuous flexion–extension of the index fingers anti-symmetrically while synchronizing with an external pacer. Three different metronome configurations were tested, for which we examined whether cross-modal pacing (audio–tactile beats) improved the stability of the coordination in comparison with unimodal pacing conditions (auditory or tactile beats). We found a more stable bimanual coordination for cross-modal pacing, but only when the metronome configuration directly matched the anti-symmetric coordination pattern. We conclude that multisensory integration can benefit the continuous coordination of movements; however, this benefit is constrained by whether the perceptual and motor activities match in space and time.
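
    Coordination stability in such studies is commonly quantified as the circular standard deviation of the continuous relative phase between the two effectors. A minimal sketch, assuming two centered, roughly sinusoidal position signals; this is a generic method, not necessarily the exact analysis used here.

```python
import numpy as np
from scipy.signal import hilbert

def coordination_stability(x_left, x_right):
    """Circular SD of the continuous relative phase between two
    oscillatory position signals (e.g., left/right finger angles).
    Smaller values indicate a more stable coordination pattern.
    """
    phi_l = np.angle(hilbert(x_left - np.mean(x_left)))
    phi_r = np.angle(hilbert(x_right - np.mean(x_right)))
    rel = np.angle(np.exp(1j * (phi_l - phi_r)))  # wrapped to (-pi, pi]
    R = np.abs(np.mean(np.exp(1j * rel)))
    return np.sqrt(-2 * np.log(R))

# Example: anti-phase fingers (target relative phase = pi) with noise.
t = np.linspace(0, 10, 2000)
left = np.sin(2 * np.pi * 1.5 * t)
right = np.sin(2 * np.pi * 1.5 * t + np.pi + 0.1 * np.random.randn(2000))
print(coordination_stability(left, right))  # small value = stable pattern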

    Articulatory constraints on spontaneous entrainment between speech and manual gesture

    The present study examined the extent to which speech and manual gestures spontaneously entrain in a non-communicative task. Participants had to repeatedly utter nonsense /CV/ syllables while continuously moving the right index finger in flexion/extension. No instructions to coordinate were given. We manipulated the type of syllable uttered (/ba/ vs. /sa/) and vocalization (phonated vs. silent speech). Following principles of coordination dynamics, stronger entrainment between the finger oscillations and the jaw motion was predicted (1) for /ba/, due to the expected larger amplitude of jaw motion, and (2) in phonated speech, due to the auditory feedback. Fifteen out of twenty participants showed simple ratios of speech to finger cycles (1:1, 1:2 or 2:1). In contrast with our predictions, speech–gesture entrainment was stronger when vocalizing /sa/ than /ba/, and was more widely distributed around an in-phase mode. Furthermore, the results revealed spatial anchoring and increased temporal variability in jaw motion when producing /sa/. We suggest that this indicates greater control of the speech articulators for /sa/, making the speech performance more susceptible to environmental forces and resulting in the greater entrainment observed with gesture oscillations. The speech–gesture coordination was maintained in silent speech, suggesting a somatosensory basis for their endogenous coupling.
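
    The reported speech-to-finger cycle ratios (1:1, 1:2, 2:1) can be approximated from kinematic recordings by simple peak counting. A rough, hypothetical sketch, not the authors' method:

```python
import numpy as np
from scipy.signal import find_peaks

def cycle_counts(jaw, finger, fs, min_period_s=0.2):
    """Raw oscillation-cycle counts for two position signals, obtained
    by peak counting; e.g., (20, 10) suggests a 2:1 ratio. fs is the
    sampling rate in Hz.
    """
    dist = int(min_period_s * fs)  # reject peaks closer than min period
    jaw_peaks, _ = find_peaks(jaw, distance=dist)
    fin_peaks, _ = find_peaks(finger, distance=dist)
    return len(jaw_peaks), len(fin_peaks)

# Example: jaw at 2 Hz, finger at 1 Hz, over 10 s -> ~2:1 ratio.
fs = 200
t = np.arange(0, 10, 1 / fs)
print(cycle_counts(np.sin(2 * np.pi * 2 * t),
                   np.sin(2 * np.pi * 1 * t), fs))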

    Representative juggling behavior in the frontal plane.

    (A) Motion of balls (solid line) and left-hand movement (dashed line) in the frontal plane, with the locations of throws and catches (circles and squares, respectively). Note that throw and catch points and ball trajectories are not exactly coincident because the passive marker was placed on the back of the hand. (B) Representative time series of the position of one ball (solid line) and of the left hand (dashed line).

    Spontaneous synchronization between repetitive speech and rhythmic gesture

    Although studies have described how motion in diverse biological systems may spontaneously synchronize, it is not known whether speech and gesture exhibit such a property. Previous research on the coordination of speech and gesture has focused on pointing or tapping tasks, the structure of which may regulate speech and gesture dynamics. Here we examined whether synchronies might arise between a repetitive utterance and rhythmic finger movement oscillations in a non-intentional paradigm. Participants were instructed to repeatedly utter /ba/ or /sa/ syllables, with or without vocalizing, while continuously moving their right index finger in flexion/extension. No instructions about synchronization were given; participants were only told to adopt the most comfortable motions. We expected that the larger amplitude of face motion for /ba/ syllables and vocalized speech would lead to a greater influence on the gesture. In contrast, the results showed more synchronization for /sa/ and when syllables were articulated silently. Less perceptual feedback may reduce the robustness of the speech component, making it more susceptible to gesture influence.
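
    One generic way to quantify "more synchronization" between two rhythms that may lock at different ratios is the n:m phase-locking value (Tass et al., 1998). The sketch below is that standard formulation, offered as an illustration rather than the analysis used in this study.

```python
import numpy as np
from scipy.signal import hilbert

def nm_phase_locking(x, y, n, m):
    """n:m phase-locking value between two oscillatory signals:
    1 = perfect n:m synchronization, 0 = no phase relation.
    """
    phi_x = np.unwrap(np.angle(hilbert(x - np.mean(x))))
    phi_y = np.unwrap(np.angle(hilbert(y - np.mean(y))))
    return np.abs(np.mean(np.exp(1j * (n * phi_x - m * phi_y))))

# Example: speech cycles at twice the finger frequency (2:1 locking).
t = np.linspace(0, 10, 2000)
speech = np.sin(2 * np.pi * 2.0 * t)
finger = np.sin(2 * np.pi * 1.0 * t + 0.3)
print(nm_phase_locking(speech, finger, n=1, m=2))  # ~1: strong 2:1 locking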

    Persistent coordination patterns in a complex task after a 10-year delay

    Motor learning studies have long focused on performance variables (in terms of speed or accuracy) in assessing the learning, transfer and retention of motor skills. We argue, however, that learning essentially resides in changes in coordination variables (in terms of the qualitative organization of behavior) and that relevant tests for assessing the effectiveness of learning and retention should consider these variables. The aim of this experiment was to test the retention of a complex motor skill after a long-term delay. Ten years ago, five participants took part in an experiment during which they practiced for 39 sessions of ten 1-min trials on a ski-simulator. All participants volunteered for a retention test, ten years later, of one session of ten 1-min trials. Analyses focused on the oscillations of the platform of the simulator. Performance was assessed in terms of amplitude and frequency. Coordination was accounted for by an analysis of the dynamical properties of the motion of the platform, and especially the nature of the damping function that was exploited to sustain the limit-cycle dynamics. Results showed a significant decrement in performance variables. In contrast, all participants adopted, from the first trial onwards, the coordination mode they had learned 10 years earlier. These results confirm the strong persistence of coordination modes once they are acquired and stabilized in the behavioral repertoire. They also support the importance of coordination variables for a valid assessment of learning and retention.
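
    The "nature of the damping function" of a limit cycle is often characterized by regressing acceleration onto a linear stiffness term plus the classic van der Pol and Rayleigh damping candidates. The sketch below shows this generic approach on simulated data; it is an assumption-laden illustration, not the paper's exact model.

```python
import numpy as np

def fit_limit_cycle_damping(x, dx, ddx):
    """Least-squares fit of ddx = a*x + b*dx + c*x**2*dx + d*dx**3.

    The x**2*dx (van der Pol) and dx**3 (Rayleigh) terms are the usual
    nonlinear damping candidates for sustained limit cycles.
    """
    X = np.column_stack([x, dx, x**2 * dx, dx**3])
    coefs, *_ = np.linalg.lstsq(X, ddx, rcond=None)
    return coefs  # (a, b, c, d)

# Example: simulate a van der Pol oscillator (mu = 0.5) and refit it.
dt, n = 0.001, 50_000
x, dx, ddx = np.zeros(n), np.zeros(n), np.zeros(n)
x[0] = 2.0
for i in range(n - 1):
    ddx[i] = -x[i] + 0.5 * (1.0 - x[i] ** 2) * dx[i]
    dx[i + 1] = dx[i] + ddx[i] * dt
    x[i + 1] = x[i] + dx[i + 1] * dt
print(fit_limit_cycle_damping(x[:-1], dx[:-1], ddx[:-1]))  # ~[-1, 0.5, -0.5, 0]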

    Juggling pattern in the frontal plane.

    Hand trajectories are represented with a double line, and ball trajectories with a single line. Ball trajectories are presented during ball flight time (BFT): the trajectory of a ball thrown by the left hand is drawn with a solid line, and the trajectory of a ball thrown by the right hand with a dashed line (see Table 2, http://www.plosone.org/article/info:doi/10.1371/journal.pone.0032308#pone-0032308-t002, for variable details).

    Variability of throw velocity (ThVel) for each metronome condition.

    The average variability for each individual in the control condition defined the individual baseline variability, which was subtracted from the individual's average variability in each metronome condition. Thus, zero corresponds to the baseline variability without a metronome, and negative values indicate smaller variability than in the control condition. Error bars represent the inter-participant standard deviation. The grey bar indicates a significant increase in the variability of throw velocity in the Tact. Doub. metronome condition.
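
    A minimal sketch of the baseline subtraction the caption describes, with hypothetical numbers:

```python
import numpy as np

# Rows = participants, columns = metronome conditions (hypothetical data).
variability = np.array([[1.2, 0.9, 1.5],
                        [0.8, 0.7, 1.1]])
baseline = np.array([1.0, 0.9])  # per-participant control condition

# Subtract each participant's baseline; zero = control-level variability,
# negative = less variable than without a metronome.
delta = variability - baseline[:, None]
print(delta.mean(axis=0))          # condition means across participants
print(delta.std(axis=0, ddof=1))   # inter-participant SD (error bars)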

    Uncontrolled manifold and Juggling: Retrieving a set of Controlled Variables from Data

    In this paper we analyze the concept of the UnControlled Manifold (UCM), which consists of the kinematic variables that are not controlled by the user because they are not relevant to the task. We first test a set of controlled variables inspired by the literature on tracking tasks, then propose a procedure to identify them on the basis of captured data. We are interested in the analysis of behavior in a virtual environment and in the real world. In particular, we analyze three-ball cascade juggling and its simulation through a platform named Light Weight Juggling, focusing on the task of ball tossing. The user's arm kinematics are represented as a robotic manipulator with 7 degrees of freedom. Joint angles are retrieved through an optical tracking system. The variables controlled in the virtual environment are a subset of those controlled in the real world, which leads to a UCM that differs from the one in the real world. A comparison between the statistics computed in the two cases is performed to explore their behavioral differences.
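
    The standard UCM computation (Scholz & Schöner, 1999) projects trial-to-trial deviations of the joint configuration onto the null space of the task Jacobian and compares per-DOF variance within versus orthogonal to that subspace. The sketch below is a generic implementation of that idea; the variable names and the example Jacobian are hypothetical, not taken from this paper.

```python
import numpy as np
from scipy.linalg import null_space

def ucm_variance(joint_configs, jacobian):
    """Decompose joint-configuration variance into the UCM
    (task-irrelevant) and orthogonal (task-relevant) subspaces.

    joint_configs : (trials, dof) joint angles at a comparable event
                    (e.g., ball release).
    jacobian      : (task_dims, dof) Jacobian of the task variable
                    (e.g., hand position) at the mean configuration.
    Returns per-DOF variance within the UCM and orthogonal to it;
    V_ucm > V_ort suggests the task variable is stabilized (controlled).
    """
    dev = joint_configs - joint_configs.mean(axis=0)
    ucm_basis = null_space(jacobian)          # (dof, dof - task_dims)
    proj_ucm = dev @ ucm_basis                # components within the UCM
    proj_ort = dev - proj_ucm @ ucm_basis.T   # components affecting the task
    n_ucm = ucm_basis.shape[1]
    n_ort = jacobian.shape[1] - n_ucm
    v_ucm = np.sum(proj_ucm ** 2) / (len(dev) * n_ucm)
    v_ort = np.sum(proj_ort ** 2) / (len(dev) * n_ort)
    return v_ucm, v_ort

# Example with a 7-DOF arm and a 3-D task variable (hand position):
rng = np.random.default_rng(1)
J = rng.standard_normal((3, 7))           # placeholder Jacobian
q = rng.standard_normal((100, 7)) * 0.1   # 100 trials of joint angles
print(ucm_variance(q, J))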