    Lexical alignment in triadic communication

    Lexical alignment refers to the adoption of one's interlocutor's lexical items. Accounts of the mechanisms underlying such lexical alignment differ (among other aspects) in the role assigned to addressee-centered behavior. In this study, we used a triadic communicative situation to test which factors may modulate the extent to which participants' lexical alignment reflects addressee-centered behavior. Pairs of naive participants played a picture-matching game and received information about the order in which pictures were to be matched from a voice over headphones. On critical trials, participants did or did not hear a name for the picture to be matched next over headphones. Importantly, when the voice over headphones provided a name, it did not match the name that the interlocutor had previously used to describe the object. Participants overwhelmingly used the word that the voice over headphones provided. This result points to non-addressee-centered behavior and is discussed in terms of disrupting alignment with the interlocutor as well as in terms of establishing alignment with the voice over headphones. In addition, the type of picture (line drawing vs. tangram shape) independently modulated lexical alignment, such that participants showed more lexical alignment to their interlocutor for (more ambiguous) tangram shapes than for line drawings. Overall, the results point to a rather large role for non-addressee-centered behavior during lexical alignment.

    Inter- versus intramodal integration in sensorimotor synchronization: a combined behavioral and magnetoencephalographic study

    Although the temporal occurrence of the pacing signal is predictable in sensorimotor synchronization tasks, normal subjects perform on-the-beat tapping to an isochronous auditory metronome with an anticipatory error. This error originates from an intermodal task; that is, subjects have to bring information from the auditory and tactile modalities into coincidence. The aim of the present study was to clarify whether the synchronization error is specific to an intermodal timing task and whether the underlying cortical mechanisms are modality-specific or supramodal. We collected behavioral data and cortical evoked responses by magnetoencephalography (MEG) during performance of cross- and unimodal tapping tasks. As expected, subjects showed negative asynchrony when performing an auditorily paced tapping task. However, no asynchrony emerged during tactile pacing, neither during pacing at the opposite finger nor at the toe. Analysis of the cortical signals yielded a three-dipole model that best explained tap-contingent activity in all three conditions. The temporal behavior of the sources was similar across conditions and thus modality-independent. The localization of the two earlier-activated sources was modality-independent as well, whereas the location of the third source varied with modality: in the auditory pacing condition it was localized in contralateral primary somatosensory cortex, whereas during tactile pacing it was localized in contralateral posterior parietal cortex. In previous studies with auditory pacing, the functional role of this third source remained contradictory: a specific temporal coupling pattern argued for involvement of the source in evaluating the temporal distance between tap and click, whereas subsequent data gave no evidence for such an interpretation. The present data shed new light on this question by demonstrating differences between modalities in the localization of the third source despite similar temporal behavior.

    InfoSyll: A Syllabary Providing Statistical Information on Phonological and Orthographic Syllables

    There is now a growing body of evidence in various languages supporting the claim that syllables are functional units of visual word processing. In the perspective of modeling the processing of polysyllabic words and the activation of syllables, current studies investigate syllabic effects with subtle manipulations. We present here a syllabary of the French language aimed at meeting the new constraints that arise when designing experiments on the syllable issue. The InfoSyll syllabary provides exhaustive characteristics and statistical information for each phonological syllable (e.g. /fi/) and for its corresponding orthographic syllables (e.g. fi, phi, phy, fee, fix, fis). Variables such as the type and token positional frequencies and the number and frequencies of the correspondences between orthographic and phonological syllables are provided. As discussed, such computations should allow precise controls, manipulations, and quantitative descriptions of syllabic variables in the field of psycholinguistic research.
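    To illustrate the kind of lookup such a phonology-to-orthography syllabary affords, here is a minimal Python sketch assuming a hypothetical tab-separated export; the file name and the column names (phon_syllable, orth_syllable, type_freq, token_freq) are illustrative placeholders, not the actual InfoSyll format.

    import csv
    from collections import defaultdict

    def load_syllabary(path):
        # Map each phonological syllable (e.g. "/fi/") to its orthographic
        # correspondences (e.g. "fi", "phi", "phy") with frequency counts.
        # Column names are hypothetical, not the published InfoSyll fields.
        syllabary = defaultdict(list)
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f, delimiter="\t"):
                syllabary[row["phon_syllable"]].append({
                    "orth_syllable": row["orth_syllable"],
                    "type_freq": int(row["type_freq"]),
                    "token_freq": int(row["token_freq"]),
                })
        return syllabary

    # Example: list the orthographic correspondences of /fi/, most frequent first.
    syllabary = load_syllabary("infosyll_export.tsv")  # hypothetical file name
    for entry in sorted(syllabary["/fi/"], key=lambda e: -e["token_freq"]):
        print(entry["orth_syllable"], entry["type_freq"], entry["token_freq"])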

    How to talk to robots: Evidence from user studies on human-robot communication

    Gieselmann P, Stenneken P. How to talk to robots: Evidence from user studies on human-robot communication. In: Fischer K, ed. How People Talk to Computers, Robots, and Other Artificial Communication Partners. Report Series of the Transregional Collaborative Research Center SFB/TR 8 Spatial Cognition. Report No. 010-09/2006. 2006: 68-78.

    Perception and Production of Rhythm


    Cognitive and social aspects of adaptation to a communication partner

    Thiele K, Foltz A, Bartels M, Stenneken P. Cognitive and social aspects of adaptation to a communication partner. In: 17th Meeting of the European Society for Cognitive Psychology. 2011. Communication is a socially highly relevant form of joint action. Adaptation of interlocutors to each other’s verbal behavior, e.g. by using identical lexical expressions or syntactic structures, is a well-studied phenomenon. Such adaptation can be found on various linguistic levels and may contribute to communicative success. But to what extent do situational aspects, cognitive capacities, social skills, etc. influence adaptation on these different linguistic levels? We present a series of experiments, using the confederate scripting technique with children and adults, that investigate the potential influence of cognitive factors (e.g. working memory) and social factors (e.g. the interlocutor’s native language) on the strength of linguistic adaptation. Results showed that participants adapted to their interlocutor’s lexical terms, even if these were unconventional (e.g. saying telephone for a cell phone), and did so to a greater extent if their conversational partner was a non-native speaker. General language capabilities had no effect on adaptation at the syntactic level, while lower working memory capacity decreased adaptation strength. The results suggest that social-strategic and cognitive factors influence the amount of adaptation that may contribute to successful communication. In addition, top-down factors may influence adaptation behavior more strongly than general language capabilities.