124 research outputs found

    Multimodal Processing Of Emotional Meanings: A Hypothesis On The Adaptive Value Of Prosody

    Humans combine multiple sources of information to comprehend meanings. These sources can be characterized as linguistic (i.e., lexical units and/or sentences) or paralinguistic (e.g., body posture, facial expression, voice intonation, pragmatic context). Emotion communication is a special case in which linguistic and paralinguistic dimensions can simultaneously denote the same referential meaning or multiple incongruous ones. Think, for instance, about when someone says “I’m sad!”, but does so with happy intonation and a happy facial expression. Here, the communicative channels express very specific (although conflicting) emotional states as denotations. In such cases of intermodal incongruence, are we involuntarily biased to respond to information in one channel over the other? We hypothesize that humans are involuntarily biased to respond to prosody over verbal content and facial expression, since the ability to communicate socially relevant information such as basic emotional states through prosodic modulation of the voice might have provided early hominins with an adaptive advantage that preceded the emergence of segmental speech (Darwin, 1871; Mithen, 2005). To address this hypothesis, we examined the interaction between multiple communicative channels in recruiting attentional resources within a Stroop interference task (i.e., a task in which different channels give conflicting information; Stroop, 1935). In experiment 1, we used synonyms of “happy” and “sad” spoken with happy and sad prosody. Participants were asked to identify the emotion expressed by the verbal content while ignoring prosody (Word task) or vice versa (Prosody task). Participants responded faster and more accurately in the Prosody task. Within the Word task, incongruent stimuli were responded to more slowly and less accurately than congruent stimuli. In experiment 2, we used synonyms of “happy” and “sad” spoken with happy and sad prosody while a happy or sad face was displayed. Participants were asked to identify the emotion expressed by the verbal content while ignoring prosody and face (Word task), to identify the emotion expressed by prosody while ignoring verbal content and face (Prosody task), or to identify the emotion expressed by the face while ignoring prosody and verbal content (Face task). Participants responded faster in the Face task, and accuracy dropped when the two unattended channels expressed an emotion incongruent with the attended one, as compared with the condition in which all channels were congruent. In addition, in the Word task, accuracy was lower when prosody was incongruent with verbal content and face, as compared with the condition in which all channels were congruent. Our data suggest that prosody interferes with emotion word processing, eliciting automatic responses even when it conflicts with both verbal content and facial expression at the same time. In contrast, although processed significantly faster than prosody and verbal content, faces alone are not sufficient to interfere with emotion processing within a three-dimensional Stroop task. Our findings align with the hypothesis that the ability to communicate emotions through prosodic modulation of the voice, which appears to be dominant over verbal content, is evolutionarily older than the emergence of segmental articulation (Mithen, 2005; Fitch, 2010).
This hypothesis fits with quantitative data suggesting that prosody plays a vital role in the perception of well-formed words (Johnson & Jusczyk, 2001), in the ability to map sounds to referential meanings (Filippi et al., 2014), and in syntactic disambiguation (Soderstrom et al., 2003). This research could complement studies on iconic communication within the visual and auditory domains, providing new insights for models of language evolution. Further work examining how emotional cues from different modalities are simultaneously integrated will improve our understanding of how humans interpret multimodal emotional meanings in real-life interactions.
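
    As a rough illustration of the congruency (interference) logic described in this abstract, the sketch below compares hypothetical per-participant mean reaction times for congruent versus incongruent trials using a paired t-test. The sample size, numbers, variable names, and choice of test are illustrative assumptions, not the analysis reported by the authors.

    ```python
    # Minimal sketch of a Stroop congruency analysis on invented data.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_participants = 24  # hypothetical sample size

    # Simulated per-participant mean reaction times (ms) in a Word-task-like condition.
    rt_congruent = rng.normal(loc=650, scale=60, size=n_participants)
    rt_incongruent = rt_congruent + rng.normal(loc=40, scale=25, size=n_participants)  # interference cost

    # Paired comparison: is the congruency (interference) effect reliable across participants?
    t_stat, p_value = stats.ttest_rel(rt_incongruent, rt_congruent)
    mean_cost = (rt_incongruent - rt_congruent).mean()
    print(f"Mean interference cost: {mean_cost:.1f} ms, "
          f"t({n_participants - 1}) = {t_stat:.2f}, p = {p_value:.4f}")
    ```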

    More than words (and faces): evidence for a Stroop effect of prosody in emotion word processing

    Humans typically combine linguistic and nonlinguistic information to comprehend emotions. We adopted an emotion identification Stroop task to investigate how different channels interact in emotion communication. In experiment 1, synonyms of “happy” and “sad” were spoken with happy and sad prosody. Participants had more difficulty ignoring prosody than ignoring verbal content. In experiment 2, synonyms of “happy” and “sad” were spoken with happy and sad prosody, while happy or sad faces were displayed. Accuracy was lower when two channels expressed an emotion that was incongruent with the channel participants had to focus on, compared with the cross-channel congruence condition. When participants were required to focus on verbal content, accuracy was also significantly lower when prosody was incongruent with verbal content and face. This suggests that prosody biases the processing of emotional verbal content, even when it conflicts with verbal content and face simultaneously. Implications for multimodal communication and language evolution studies are discussed.

    Humans Recognize Vocal Expressions Of Emotional States Universally Across Species

    The perception of danger in the environment can induce physiological responses (such as a heightened state of arousal) in animals, which may cause measurable changes in the prosodic modulation of the voice (Briefer, 2012). The ability to interpret the prosodic features of animal calls as an indicator of emotional arousal may have provided the first hominins with an adaptive advantage, enabling, for instance, the recognition of a threat in the surroundings. This ability might, in turn, have paved the way for processing meaningful prosodic modulations in emerging linguistic utterances.

    Kissing right? On the consistency of the head-turning bias in kissing

    The present study investigated the consistency of the head-turning bias in kissing. In particular, we addressed what happens when a person who prefers to kiss with the head turned to the right kisses a person who prefers to kiss with the head turned to the left. To this end, participants (N = 57) were required to kiss a life-sized doll's head rotated in different orientations that were either compatible or incompatible with the participants' head-turning preference. Additionally, participants' handedness, footedness, and eye preference were assessed. Results showed that a higher percentage of participants preferred to kiss with their head turned to the right than to the left. In addition, right-turners were more consistent in their kissing behaviour than left-turners: with the doll's head rotated in an incompatible direction, right-turners were less likely to switch their head to their non-preferred side. Since no clear relationships between the head-turning bias and the other lateral preferences (i.e., handedness, footedness, and eye preference) were discerned, the more consistent head-turning bias among right-turners could not be explained as deriving from a joint pattern of lateral preferences that is stronger among individuals with rightward than with leftward lateral preferences. © 2010 Psychology Press

    Measuring paw preferences in dogs, cats and rats: Design requirements and innovations in methodology

    Data availability statement: This is a technical paper and no new data are reported; therefore, a data availability statement is not applicable. Studying behavioural lateralization in animals holds great potential for answering important questions in laterality research and clinical neuroscience. However, comparative research encounters challenges in reliability and validity, requiring new approaches and innovative designs to overcome them. Although validated tests exist for some species, there is as yet no standard test for comparing lateralized manual behaviours between individuals, populations, and animal species. One of the main reasons is that different fine-motor abilities and postures must be considered for each species. Given that pawedness/handedness is a universal marker for behavioural lateralization across species, this article focuses on three commonly investigated species in laterality research: dogs, cats, and rats. We present six apparatuses (two for dogs, three for cats, and one for rats) that enable an accurate assessment of paw preference. Design requirements and specifications such as zoometric fit for different body sizes and ages, reliability, robustness of the material, maintenance during and after testing, and animal welfare are extremely important when designing a new apparatus. Given that the study of behavioural lateralization yields crucial insights into animal welfare, laterality research, and clinical neuroscience, we aim to address these challenges by presenting design requirements and innovations in methodology across species. The dog part of this study was supported by The Scientific and Technological Research Council of Turkey (TUBITAK) 1001 grant (no: 118O445). Author Sevim Isparta was supported by TUBITAK through the 2214-A International Research Fellowship Program for PhD Students and the 2211/A General Domestic Doctorate Scholarship Program.
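
    For readers unfamiliar with how paw-use counts from such apparatuses are typically quantified, the sketch below computes a conventional handedness index, HI = (R − L) / (R + L), together with a binomial z-score against a 50:50 baseline. This is a generic convention in laterality research, not a method specified in the paper, and the counts are invented.

    ```python
    # Hedged sketch of a standard paw-preference (handedness) index on invented counts.
    from math import sqrt

    def paw_preference(right_uses: int, left_uses: int) -> tuple[float, float]:
        """Return (handedness index in [-1, 1], z-score against a 50:50 binomial)."""
        total = right_uses + left_uses
        hi = (right_uses - left_uses) / total
        z = (right_uses - total / 2) / sqrt(total * 0.25)  # normal approximation to the binomial
        return hi, z

    # Hypothetical dog with 50 recorded reaches: positive HI indicates a right-paw preference.
    hi, z = paw_preference(right_uses=38, left_uses=12)
    print(f"HI = {hi:.2f}, z = {z:.2f}")
    ```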

    Event-Related Potential Correlates of Performance-Monitoring in a Lateralized Time-Estimation Task

    Performance monitoring, as a key function of cognitive control, covers a wide range of diverse processes that enable goal-directed behavior and avoid maladjustments. Several event-related brain potentials (ERPs) are associated with performance monitoring, but their conceptual backgrounds differ. For example, the feedback-related negativity (FRN) is associated with unexpected performance feedback and might serve as a teaching signal for adaptational processes, whereas the error-related negativity (ERN) is associated with error commission and subsequent behavioral adaptation. The N2 is visible in the EEG when the participant successfully inhibits a response following a cue and thereby adapts to a given stop-signal. Here, we present an innovative paradigm to concurrently study these different performance-monitoring-related ERPs. In 24 participants, a tactile time-estimation task interspersed with infrequent stop-signal trials reliably elicited all three ERPs. Sensory input and motor output were completely lateralized in order to estimate any hemispheric processing preferences for the different aspects of performance monitoring associated with these ERPs. In accordance with the literature, our data suggest augmented inhibitory capabilities in the right hemisphere, given that stop-trial performance was significantly better with left- as compared to right-hand stop-signals. In line with this, the N2 scalp distribution was generally shifted to the right, in addition to an ipsilateral shift in relation to the response hand. Other than that, task lateralization affected neither behavior related to error and feedback processing nor the ERN or FRN. Comparing the ERP topographies using the Global Map Dissimilarity index, a large topographic overlap was found between all considered components. With an evenly distributed set of trials and a split-half reliability ≄ .85 for all ERP components, the task is well suited to efficiently studying the N2, ERN, and FRN concurrently, which might prove useful for group comparisons, especially in clinical populations.
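
    The Global Map Dissimilarity index mentioned above compares two scalp topographies after average-referencing each map and scaling it by its Global Field Power; values range from 0 (identical topographies) to 2 (inverted topographies). The sketch below is a minimal illustration of that computation on invented electrode values, not the authors' analysis pipeline.

    ```python
    # Hedged sketch of the Global Map Dissimilarity (GMD / DISS) computation.
    import numpy as np

    def gmd(map_u: np.ndarray, map_v: np.ndarray) -> float:
        """Global Map Dissimilarity between two scalp maps (0 = identical, 2 = inverted)."""
        u = map_u - map_u.mean()   # average reference
        v = map_v - map_v.mean()
        u = u / u.std()            # scale by Global Field Power (RMS of the referenced map)
        v = v / v.std()
        return float(np.sqrt(np.mean((u - v) ** 2)))

    rng = np.random.default_rng(1)
    topo_a = rng.normal(size=64)                      # invented 64-channel topography
    topo_b = topo_a + rng.normal(scale=0.5, size=64)  # a similar but noisier topography
    print(f"GMD = {gmd(topo_a, topo_b):.3f}")
    ```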

    A large-scale study on the effects of sex on gray matter asymmetry

    Research on sex-related brain asymmetries has not yielded consistent results. Despite its importance for further understanding normal brain development and mental disorders, the field remains relatively unexplored. Here we employ a recently developed asymmetry measure, based on the Dice coefficient, to detect sex-related gray matter asymmetries in a sample of 457 healthy participants (266 men and 191 women) obtained from 5 independent databases. Results show that women’s brains are more globally symmetric than men’s (p < 0.001). Although the new measure captures asymmetries distributed over the whole brain, several specific structures, such as the thalamus and the cerebellum, were identified as systematically more symmetric in women; some of these structures are typically involved in language production. These sex-related asymmetry differences may be established during neurodevelopment and could be associated with functional and cognitive sex differences, as well as with proneness to developing a mental disorder.
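
    The abstract does not detail the asymmetry measure beyond calling it Dice-based, so as a hedged illustration the sketch below computes the plain Dice coefficient, 2|A ∩ B| / (|A| + |B|), between a hemisphere's binary gray matter mask and the mirrored mask of the opposite hemisphere, so that values near 1 indicate high left-right symmetry. The toy 2-D masks and the mirroring step are assumptions for illustration, not the study's actual pipeline.

    ```python
    # Hedged sketch of a Dice-based symmetry score on toy binary masks.
    import numpy as np

    def dice(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
        """Dice coefficient between two binary masks: 2|A intersect B| / (|A| + |B|)."""
        a = mask_a.astype(bool)
        b = mask_b.astype(bool)
        return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

    # Toy 2-D "hemisphere" masks; in practice these would be 3-D gray matter segmentations,
    # with one hemisphere mirrored onto the other before computing the overlap.
    left = np.zeros((10, 10), dtype=bool)
    left[2:8, 2:8] = True
    right_mirrored = np.zeros((10, 10), dtype=bool)
    right_mirrored[3:8, 2:8] = True
    print(f"Dice symmetry = {dice(left, right_mirrored):.3f}")
    ```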

    No evidence that footedness in pheasants influences cognitive performance in tasks assessing colour discrimination and spatial ability

    The differential specialization of each side of the brain facilitates the parallel processing of information and has been documented in a wide range of animals. Animals that are more lateralized, as indicated by consistent preferential limb use, are commonly reported to exhibit superior cognitive ability as well as other behavioural advantages. We assayed the lateralization of 135 young pheasants (Phasianus colchicus), indicated by their footedness in a spontaneous stepping task, and related this measure to individual performance in one of three assays of visual or spatial learning and memory. We found no evidence that pronounced footedness enhances cognitive ability in any of the tasks. We also found no evidence that intermediate footedness relates to better cognitive performance. This lack of relationship is surprising because previous work revealed that pheasants have a slight population bias towards right-footedness and that, when released into the wild, individuals with higher degrees of footedness were more likely to die. One explanation for why extreme lateralization is constrained is that it leads to poorer cognitive performance, or that optimal cognitive performance is associated with some intermediate level of lateralization. Such stabilizing selection could explain the pattern of moderate lateralization seen in most non-human species studied so far. However, we found no evidence in this study to support this explanation.
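
    As a rough sketch of how strength of footedness might be related to task performance, the code below computes an absolute laterality index from hypothetical stepping counts and correlates it with an invented performance measure using Spearman's rank correlation. The variable names, distributions, and choice of test are illustrative assumptions, not the analysis used in the study.

    ```python
    # Hedged sketch relating footedness strength to task performance on invented data.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    n_birds = 135
    steps_per_bird = 20
    right_steps = rng.binomial(n=steps_per_bird, p=0.55, size=n_birds)      # slight right bias
    left_steps = steps_per_bird - right_steps
    footedness_strength = np.abs(right_steps - left_steps) / steps_per_bird  # |(R - L) / (R + L)|
    trials_to_criterion = rng.poisson(lam=30, size=n_birds)                  # performance unrelated to footedness

    rho, p = stats.spearmanr(footedness_strength, trials_to_criterion)
    print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")  # expected: no reliable association
    ```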
    • 

    corecore