
    The function of fear chemosignals: Preparing for danger

    It has been shown that the presence of conspecifics modulates humans' vigilance strategies, as is the case in other animal species. Mere presence has been found to reduce vigilance. However, animal research has also shown that chemosignals (e.g., sweat) produced during fear-inducing situations modulate individuals' threat detection strategies. In the case of humans, little is known about how exposure to conspecifics' fear chemosignals modulates vigilance and threat detection effectiveness. The present study (N = 59) examined how human fear chemosignals affect vigilance strategies and threat avoidance in their receivers. We relied on a paradigm that simulates a "foraging under threat" situation in the lab, integrated with an eye-tracker to examine attention allocation. Our results showed that exposure to fear chemosignals (vs. rest chemosignals and a no-sweat condition), while not changing vigilance behavior, led to faster responses to threatening events. In conclusion, fear chemosignals seem to constitute an important warning signal for human beings, possibly leading their receivers to a readiness state that allows faster reactions to threat-related events.

    Perceiving emotions in visual stimuli: social verbal context facilitates emotion detection of words but not of faces

    Building on the notion that the processing of emotional stimuli is sensitive to context, in two experimental tasks we explored whether the detection of emotion in emotional words (Task 1) and facial expressions (Task 2) is facilitated by social verbal context. Three levels of contextual supporting information were compared, namely (1) no information, (2) the verbal expression of an emotionally matched word pronounced with a neutral intonation, and (3) the verbal expression of an emotionally matched word pronounced with an emotionally matched intonation. We found that increasing levels of supporting contextual information enhanced emotion detection for words, but not for facial expressions. We also measured activity of the corrugator and zygomaticus muscles to assess facial simulation, as processing of emotional stimuli can be facilitated by facial simulation. While facial simulation emerged for facial expressions, the level of contextual supporting information did not qualify this effect. All in all, our findings suggest that adding emotion-relevant voice elements positively influences emotion detection of words.

    The lasting smell of emotions: The effects of reutilizing fear sweat samples

    A growing body of research has shown that human apocrine sweat carries information about the emotional state of its donor. Exposure to sweat produced in a fear-inducing context triggers in its receivers a simulacrum of this emotional state, as evidenced by increased activity of the medial frontalis and corrugator supercilii (facial electromyography; fEMG), two facial muscles involved in the display of fear facial expressions. However, despite the increased interest in the effects of emotional sweat, little is known about the properties of these chemical sweat samples. The goal of this study was to examine whether a second application of the same sweat sample would yield reliable results. Specifically, we assessed whether sweat samples collected from Portuguese males (N = 8) in fear- (vs. neutral-) inducing contexts would produce similar fEMG activations (i.e., in the medial frontalis and corrugator supercilii) in female receivers (N = 60) across two independent applications (the first with Dutch and the second with Portuguese receivers). Our findings showed that exposure to fear sweat resulted in higher activation of both muscles compared with neutral sweat, revealing a similar data pattern across the two applications and underlining the feasibility of reusing emotional sweat samples. The implications of these findings for the properties of these sweat volatiles are discussed.

    Social inferences from faces as a function of the left-to-right movement continuum

    We examined whether the reading and writing habits known to drive agency perception also shape the attribution of other agency-related traits, particularly for faces oriented congruently with script direction (i.e., left-to-right). Participants rated front-oriented, left-oriented and right-oriented faces on 14 dimensions. These ratings were reduced to two dimensions, power and social warmth, which were further confirmed with a new sample. Both dimensions were systematically affected by head orientation. Right-oriented faces generated a stronger endorsement of the power dimension (e.g., agency, dominance) and, to a lesser extent, of the social-warmth dimension, relative to left-oriented and front-oriented faces. A further interaction between the head orientation of the faces and their gender revealed that front-facing females, relative to front-facing males, were attributed higher social-warmth scores, or communal traits (e.g., valence, warmth). These results carry implications for the representation of people in space, particularly in marketing and political contexts. Face stimuli and the respective norming data are available at www.osf.io/v5jpd.

    Asymmetric practices of reading and writing shape visuospatial attention and discrimination

    Movement is generally conceived of as unfolding laterally in the writing direction one is socialized into. In 'Western' languages, this is a left-to-right bias that contributes to an imbalance in how attention is distributed across space. We propose that the rightward attentional bias exercises an additional unidirectional influence on discrimination performance, thus shaping the congruency effect typically observed in Posner-inspired cueing tasks. In two studies, we test whether faces averted laterally serve as attention-orienting cues and generate differences in both target discrimination latencies and gaze movements across the left and right hemifields. Results systematically show that right-facing faces (i.e., aligned with the script direction) give rise to an advantage for cue-target pairs pertaining to the right (versus left) side of space. We report an asymmetry between congruent conditions in the form of right-sided facilitation for (a) response times in discrimination decisions (Experiments 1 and 2) and (b) eye-gaze movements, namely an earlier onset of the first fixation in the respective region of interest (Experiment 2). Left-facing and front-facing cues generated virtually equal exploration patterns, confirming that the latter did not prime any directionality. These findings demonstrate that visuospatial attention and consequent discrimination are highly dependent on the asymmetric practices of reading and writing.

    Not all emotions are equal: Fear chemosignals lower awareness thresholds only for fearful faces

    Exposure to body odors (chemosignals) collected under different emotional states (i.e., emotional chemosignals) can modulate our visual system, biasing visual perception. Recent research has suggested that exposure to fear body odors results in generalized faster access to visual awareness of different emotional facial expressions (i.e., fear, happy, and neutral). In the present study, we aimed to replicate and extend these findings by exploring whether these effects are limited to fear odor, introducing a second negative body odor, namely disgust. We compared the time that three different emotional facial expressions (i.e., fear, disgust, and neutral) took to reach visual awareness during a breaking continuous flash suppression paradigm, across three body odor conditions (i.e., fear, disgust, and neutral). We found that fear body odors did not trigger an overall faster access to visual awareness, but instead sped up access to awareness specifically for facial expressions of fear. Disgust odor, on the other hand, had no effect on awareness thresholds of facial expressions. These findings contrast with prior results, suggesting that the potential of fear body odors to induce visual processing adjustments is specific to fear cues. Furthermore, our results support a unique ability of fear body odors to induce such visual processing changes, compared to other negative emotional chemosignals (i.e., disgust). These conclusions raise interesting questions about how fear odor might interact with the visual processing stream, while also opening avenues for future research.

    Can Humans Discriminate Horse ‘Fear’ Chemosignals from Control Chemosignals? Comment on Sabiniewicz et al. A Preliminary Investigation of Interspecific Chemosensory Communication of Emotions: Can Humans (Homo sapiens) Recognise Fear- and Non-Fear Body Odour from Horses (Equus ferus caballus). Animals 2021, 11, 3499

    We illustrate the problematic nature of different assumptions guiding the examination of whether humans can detect the source of fear chemosignals (i.e., body odors) emitted by horses, a research question examined in an article recently published in Animals. A central issue is that the formulation of the question itself contains the answer to it. In this paper, we parse the problematic assumptions on which the analysis and methodology rely and which lead to conclusions that are difficult to support. These assumptions are examples of methodological problems that should be avoided in research with animals and odors. The unique aspect of this paper is that it is a collaborative product, including the original contributor, in the pursuit of transparency in science.

    The spatial grounding of politics

    In three studies, we advance research on the association between abstract concepts and spatial dimensions by examining the spatial anchoring of political categories in three different paradigms (spatial placement, memory, and classification) and using non-linguistic stimuli (i.e., photos of politicians). The general hypothesis that politicians of a conservative or socialist party are grounded spatially was confirmed across the studies. In Study 1, photos of politicians were spontaneously placed to the left or right of an unanchored horizontal line depending on their socialist-conservative party affiliation. In Study 2, the political orientation of members of parliament systematically distorted the recall of the spatial positions in which they had originally been presented. Finally, Study 3 revealed that classification was more accurate and faster when politicians were presented in spatially congruent positions (e.g., a socialist politician presented on the left side of the monitor) rather than incongruent ones (e.g., a socialist on the right side). Additionally, we examined whether participants’ political orientation and political awareness moderated these effects and showed that spatial anchoring seems independent of political preference but increases with political awareness.

    Facial emotion detection in Vestibular Schwannoma patients with and without facial paresis

    This study investigates whether facial emotion detection accuracy differs in patients suffering from Vestibular Schwannoma (VS) as a function of facial paresis. Forty-four VS patients, half of them with and half without facial paresis, classified pictures of facial expressions as emotional or non-emotional. The visual information in the images was systematically manipulated by adding different levels of visual noise. The study had a mixed design, with emotional expression (happy vs. angry) and visual noise level (10% to 80%) as repeated measures, and facial paresis (present vs. absent) and degree of facial dysfunction as between-subjects factors. Emotion detection accuracy declined as visual information declined, an effect that was stronger for angry than for happy expressions. Overall, emotion detection accuracy for happy and angry faces did not differ between VS patients with and without facial paresis, although exploratory analyses suggest that the ability to recognize emotion in angry facial expressions was slightly more impaired in patients with facial paresis. The findings are discussed in the context of the effects of facial paresis on emotion detection and, in particular, the role of facial mimicry as an important mechanism for facial emotion processing and understanding.