8 research outputs found

    Experience with crossmodal statistics reduces the sensitivity for audio-visual temporal asynchrony

    Get PDF
    Habets B, Bruns P, Röder B. Experience with crossmodal statistics reduces the sensitivity for audio-visual temporal asynchrony. Scientific Reports. 2017;7(1):1486

    Neurophysiological correlates of linearization in language production

    Get PDF
    Background: During speech production, planning a description of several events requires, among other things, a verbal sequencing of these events. During this process, referred to as linearization during conceptualization, the speaker can choose between different types of temporal connectives, such as 'Before X did A, Y did B' or 'After Y did B, X did A'. To capture the neural events of such linearization processes, event-related potentials (ERPs) were measured in native speakers of German. Utterances were elicited by presenting a sequence of two pictures on a video screen. Each picture consisted of an object associated with a particular action (e.g. book = reading). A coloured vocalization cue indicated whether to describe the sequence of the two actions associated with the objects in chronological (e.g. red cue: 'After I drove the car, I read a book') or reversed order (yellow cue).
    Results: Brain potentials showed reliable differences between the two conditions from 180 ms after the onset of the vocalization prompt, with ERPs from the 'After' condition being more negative. This 'Before'/'After' difference showed a fronto-central distribution between 180 and 230 ms; from 300 ms onwards, a parietal distribution was observed. The latter effect is interpreted as an instance of the P300 response, which is known to be modulated by task difficulty.
    Conclusion: ERPs preceding overt sentence production are sensitive to conceptual linearization. The early, more fronto-centrally distributed variation can be interpreted as reflecting the involvement of working memory needed to order the events according to the instruction. The later, parietally distributed variation relates to the complexity of linearization, with the non-chronological order being more demanding during the updating of the concepts in working memory.

    Neural Processes underlying Conceptualization in Speech Production

    No full text
    Habets B. Neural Processes underlying Conceptualization in Speech Production; 2007

    Protracted development of visuo-proprioceptive integration for uni- and bimanual motor coordination

    No full text
    Martel M, Ossandón JP, Habets B, Heed T. Protracted development of visuo-proprioceptive integration for uni- and bimanual motor coordination. bioRxiv. 2019

    The role of synchrony and ambiguity in speech-gesture integration during comprehension

    Get PDF
    During face-to-face communication, one not only hears speech but also sees a speaker's communicative hand movements. Such hand gestures have been shown to play an important role in communication, with the two modalities influencing each other's interpretation. A gesture typically overlaps in time with its coexpressive speech but is often initiated before (not after) it. The present ERP study investigated what degree of asynchrony between speech and gesture onsets is optimal for semantic integration of the concurrent gesture and speech. Videos of a person gesturing were combined with speech segments that were either semantically congruent or incongruent with the gesture. Although gesture and speech always overlapped in time, they were presented with three different degrees of asynchrony. In the SOA 0 condition, the gesture onset and the speech onset were simultaneous. In the SOA 160 and 360 conditions, speech was delayed by 160 and 360 msec, respectively. ERPs time-locked to speech onset showed a significant N400 difference between semantically congruent and incongruent gesture-speech combinations in the SOA 0 and SOA 160 conditions. No significant difference was found for the SOA 360 condition. These results imply that speech and gesture are integrated most efficiently when the difference in onsets does not exceed a certain time span, presumably because iconic gestures need speech to be disambiguated in a way that is relevant to the speech context.

    Neural Correlates of Conceptualization Difficulty during the Preparation of Complex Utterances

    Get PDF
    Marek A, Habets B, Jansma B, Nager W, Münte T. Neural Correlates of Conceptualization Difficulty during the Preparation of Complex Utterances. Aphasiology. 2007;21(12):1147-1156

    Illusory tactile movement crosses arms and legs and is coded in external space

    No full text
    Martel M, Fuchs X, Trojan J, Gockel V, Habets B, Heed T. Illusory tactile movement crosses arms and legs and is coded in external space. Cortex. 2022;149:202-225
    Humans often misjudge where on the body a touch occurred. Theoretical accounts have ascribed such misperceptions to local interactions in peripheral and primary somatosensory neurons, positing that spatial-perceptual mechanisms adhere to limb boundaries and skin layout. Yet perception often reflects the integration of sensory signals with prior experience. On their trajectories, objects often touch multiple limbs; body-environment interactions should therefore manifest in perceptual mechanisms that reflect external space. Here, we demonstrate that humans perceived the cutaneous rabbit illusion - the percept of multiple identical stimuli hopping across the skin - along the Euclidean trajectory between stimuli on two body parts and regularly mislocalized stimuli from one limb to the other. A Bayesian model based on Euclidean, as opposed to anatomical, distance faithfully reproduced key aspects of participants' localization behavior. Our results suggest that prior experience of touch in space critically shapes tactile spatial perception and illusions beyond anatomical organization.
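    The abstract only names the Bayesian model. As a rough intuition for how a slow-movement prior defined over Euclidean rather than anatomical distance can pull a tactile percept toward another limb, here is a minimal sketch of a generic low-speed-prior observer in the spirit of classic cutaneous-rabbit models. All parameter values, coordinates, and the closed-form shrinkage factor are illustrative assumptions and are not taken from the Cortex paper.

```python
# Minimal, illustrative Bayesian-observer sketch (NOT the published model).
# Two taps, one on each hand, are "measured" at their true positions and the
# observer combines these measurements with a zero-mean Gaussian prior on the
# displacement between taps (a "slow movement" expectation).
# Variant A defines that displacement in external (Euclidean) coordinates;
# variant B defines it as distance travelled along the skin.

SIGMA_SENSORY = 1.5        # per-tap localization noise, cm (assumed value)
SIGMA_PRIOR = 10.0 * 0.1   # prior SD on displacement = speed * ISI, cm (assumed)


def shrink_factor() -> float:
    """Multiplicative compression of the measured separation at the MAP.

    For Gaussian likelihoods and a zero-mean Gaussian prior on the
    displacement between the two taps, the MAP separation equals the
    measured separation scaled by sigma_p^2 / (sigma_p^2 + 2 * sigma_s^2);
    both percepts slide toward their common midpoint along whichever
    metric the prior is defined on.
    """
    return SIGMA_PRIOR ** 2 / (SIGMA_PRIOR ** 2 + 2 * SIGMA_SENSORY ** 2)


k = shrink_factor()

# --- Variant A: prior over Euclidean distance in external space ------------
# Hands held close together: the two taps are only 5 cm apart through the air.
left_tap = (0.0, 0.0)    # external (x, y) of the tap on the left hand, cm
right_tap = (5.0, 0.0)   # external (x, y) of the tap on the right hand, cm
mid = ((left_tap[0] + right_tap[0]) / 2, (left_tap[1] + right_tap[1]) / 2)
left_percept = (round(mid[0] + k * (left_tap[0] - mid[0]), 2),
                round(mid[1] + k * (left_tap[1] - mid[1]), 2))
print("Euclidean observer: left-hand tap perceived at", left_percept,
      "- displaced through external space toward the other hand")

# --- Variant B: prior over anatomical distance along the skin --------------
# Along the arms and shoulders the same two taps are roughly 80 cm apart.
left_tap_s, right_tap_s = 0.0, 80.0   # arc-length positions on the skin, cm
mid_s = (left_tap_s + right_tap_s) / 2
left_percept_s = mid_s + k * (left_tap_s - mid_s)
print("Anatomical observer: left-hand tap perceived at skin position",
      round(left_percept_s, 1),
      "cm - displaced along the body surface, never jumping across limbs")
```

    Because the compression factor is identical in both variants, the point of the sketch is not the amount of displacement but its trajectory: only an observer using the Euclidean metric moves percepts through external space, which is what would allow a tap to be mislocalized from one limb to the other.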

    Others' Actions Reduce Crossmodal Integration in Peripersonal Space

    No full text
    Specific mechanisms integrate visual-tactile information close to the body to guide voluntary action [1, 2] and to enable rapid self-defense in peripersonal space [3-5]. In social interactions, others frequently act in one's peripersonal space, thereby changing the relevance of near-body events for one's own actions. Such changes of stimulus relevance may thus affect visual-tactile integration. Here we show that crossmodal processing in peripersonal space is reduced for perceptual events that another person acts upon. Participants performed a visual-tactile interference task [6] in which spatially incongruent visual distractors in the peripersonal space are known to interfere with judging the location of a tactile stimulus [7-10]. Participants performed the task both alone and with a partner who responded to the visual distractors. Performing the task together reduced the crossmodal interference effect on tactile judgments, but only if the partner occupied the participant's peripersonal space (experiment 1) and if she responded to all, rather than only a subset, of the visual distractors (experiment 2). These results show that others' actions can modulate multisensory integration in peripersonal space in a top-down fashion. Such modulations may serve to guide voluntary action and to allow others' actions in a space of self-defense.