
    How Do I Know Where I Stand? Determinants of Social Standing across Two Cultural Contexts

    Poster for SPSP201

    Not all laughs are the same: Tickling Induces a Unique Type of Spontaneous Laughter

    Laughter elicited by tickling is acoustically and perceptually distinct from other laughs

    Human Listeners’ Recognition of Vocal Production Context and Core Affect Dimensions in Chimpanzee Vocalizations

    Preregistration document for the project "Human Listeners’ Recognition of Vocal Production Context and Core Affect Dimensions in Chimpanzee Vocalizations".

    Posed and spontaneous nonverbal vocalizations of positive emotions: Acoustic analysis and perceptual judgments

    When experiencing different positive emotional states, like amusement or relief, we may produce nonverbal vocalizations such as laughs and sighs. In the current study, we describe the acoustic structure of posed and spontaneous nonverbal vocalizations of 14 different positive emotions, and test whether listeners (N = 201) map the vocalizations to emotions. The results show that vocalizations of 13 different positive emotions were recognized at better-than-chance levels, but not vocalizations of being moved. Emotions varied in whether vocalizations were better recognized from spontaneous or posed expressions.
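    A minimal sketch of how better-than-chance recognition could be tested for one emotion category is shown below, assuming an exact binomial test against the guessing rate; the counts and the 1/14 chance level are illustrative placeholders, not the study's data or its actual analysis.

```python
from scipy.stats import binomtest

# Hypothetical counts: 60 correct categorisations out of 201 listeners for
# one vocalisation type, with a 14-alternative response format (chance = 1/14).
n_correct, n_total, chance = 60, 201, 1 / 14

result = binomtest(n_correct, n_total, p=chance, alternative="greater")
print(f"observed = {n_correct / n_total:.2f}, chance = {chance:.2f}, "
      f"p = {result.pvalue:.2g}")
```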

    Sounds like a fight: Listeners can infer behavioural contexts from spontaneous nonverbal vocalisations

    When we hear another person laugh or scream, can we tell the kind of situation they are in – for example, whether they are playing or fighting? Nonverbal expressions are theorised to vary systematically across behavioural contexts. Perceivers might therefore be sensitive to these systematic mappings and consequently able to infer contexts from others’ vocalisations. Here, in two pre-registered experiments, we test the prediction that listeners can accurately deduce production contexts (e.g., being tickled, discovering threat) from spontaneous nonverbal vocalisations. In Experiment 1, listeners (total n = 3120) matched 200 nonverbal vocalisations to one of 10 contexts using yes/no response options. Using signal detection analysis, we show that listeners were accurate at matching vocalisations to nine of the behavioural contexts. In Experiment 2, listeners (n = 337) categorised the production contexts by selecting from 10 response options in a forced-choice task. By analysing unbiased hit rates, we show that participants categorised all 10 contexts at better-than-chance levels. Together, these results demonstrate that perceivers can accurately infer contexts from nonverbal vocalisations. Our findings strengthen the view that observers interpret nonverbal expressions as responses to particular types of situations, suggesting form-function relations in human nonverbal vocalisations that are detectable by listeners.
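    The two analyses named above have standard textbook forms: d′ from signal detection theory for the yes/no matching task, and Wagner's (1993) unbiased hit rate for the forced-choice confusion data. The sketch below shows both with made-up counts; it illustrates the general formulas, not a reconstruction of the authors' analysis pipeline.

```python
import numpy as np
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    """d' for a yes/no matching task, with a 0.5 correction so that
    hit and false-alarm rates of exactly 0 or 1 stay finite."""
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

def unbiased_hit_rate(confusion, i):
    """Wagner's (1993) unbiased hit rate for category i: the squared number
    of correct responses divided by (stimuli of category i) x (total uses
    of response i) in a stimulus-by-response confusion matrix."""
    return confusion[i, i] ** 2 / (confusion[i, :].sum() * confusion[:, i].sum())

# Illustrative numbers only.
print(round(d_prime(hits=70, misses=30, false_alarms=20, correct_rejections=80), 2))

conf = np.array([[18, 2, 4],
                 [3, 15, 6],
                 [5, 4, 13]])
print([round(unbiased_hit_rate(conf, i), 2) for i in range(conf.shape[0])])
```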

    Behaviour in the wild: A computational ethology approach to the study of human behaviour

    The complexity of human behaviour presents a challenge for behavioural scientists. We propose that scientific understanding of human behaviour can be accelerated by large-scale computational studies of behaviour in everyday life. Specifically, we propose taking a computational ethology approach. Ethology pioneered the systematic study of behaviour in real-world settings in the early 20th century; recent advances in computational methods allow for such studies to be scaled up. Computational ethology uses video and audio recordings to establish how, when, and why specific behaviours are elicited by ecologically relevant stimuli. We highlight advantages and drawbacks of computational ethology for uncovering new phenomena in the study of human behaviour. Finally, we note the potential for computational ethology to yield insights relevant to behavioural pathologies. In sum, we argue that the synthesis of ethology and computational methods offers unique potential for generating insights into the complexities of human behaviour in the wild.

    Human Body Odors as Emotion Elicitors

    This research is designed to uncover whether human odors produced under emotional conditions prime an undifferentiated evaluative process or a core emotion, using a novel paradigm for recognizing facial expressions of emotion. All pre-registrations will be uploaded on the Open Science Framework.

    Emotions Across Cultures

    What can evolutionary theories tell us about emotions, and how can research on emotions inform evolutionary theories? In this chapter, we discuss links between evolutionary theories of emotion and the cross-cultural study of emotion. We examine what predictions can be derived from evolutionary theories about cross-cultural consistency and variability. In particular, we emphasise the notion that evolved psychological mechanisms result in cultural differences instantiated as variations on common themes of human universals. We focus on two components of emotions: emotion experience and nonverbal expressions. Several case studies from emotion science are outlined to illustrate this framework empirically. In the domain of emotion experience, we highlight shame as an illustration of the idea of variations occurring across cultures around a common theme. In the domain of nonverbal expression of emotion, this idea is illustrated by the in-group advantage, that is, superior recognition of emotional expressions produced by members of one's own group. We consider both statistical learning and motivational explanations for this phenomenon in light of evolutionary perspectives. Lastly, we review three different theoretical accounts of how to conceptualise cross-culturally shared themes underlying emotions. We conclude that the cross-cultural study of consistency and variation in different emotion components offers a valuable opportunity for testing predictions derived from evolutionary psychology.

    What Makes Us Feel Good? A Data-driven Investigation of Positive Emotion Experience

    Is feeling grateful a different kind of experience than feelings of other positive emotions like pride, awe, or love? Here, we use semantic space theory to test which positive emotional experiences are distinct from each other based on in-depth personal narratives of experiences involving 22 positive emotions (n = 165; 3,592 emotional events). A bottom-up computational analysis was applied to the transcribed text; unsupervised clustering was employed to maximise internal granular consistency (i.e., the clusters being as different as possible from each other while internally as homogeneous as possible). The analysis yielded distinct positive emotion experiences, characterised by admiration, amusement, being moved, feeling respected, excitement, hope, interest, lust, positive surprise, pride, sensory pleasure, and tenderness. Applying bottom-up language analysis techniques to rich accounts of emotional experiences reveals that there are at least 12 unique dimensions of positive emotion experience in daily life.
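    The abstract does not spell out the clustering pipeline, but the general recipe it describes (represent each narrative in a semantic space, then choose a partition that balances between-cluster separation against within-cluster homogeneity) can be sketched as below. The toy texts, TF-IDF features, k-means clustering, and silhouette criterion are assumptions for illustration, not the authors' exact method.

```python
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics import silhouette_score

# Toy stand-ins for transcribed emotion narratives (the study used 3,592
# events); both the texts and this pipeline are illustrative assumptions.
narratives = [
    "I felt so proud when my daughter graduated",
    "We could not stop laughing at the silly video my friend sent",
    "Watching the sunrise over the mountains filled me with awe",
    "My partner held my hand and I felt a wave of tenderness",
    "I was thrilled and excited in the minutes before the concert",
    "Sinking into a warm bath after a long day was pure pleasure",
]

vectors = TfidfVectorizer(stop_words="english").fit_transform(narratives)

# Choose the number of clusters that maximises the silhouette score, i.e.
# clusters that are internally homogeneous yet well separated from each other.
best_k, best_score = 2, -1.0
for k in range(2, 5):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(vectors)
    score = silhouette_score(vectors, labels)
    if score > best_score:
        best_k, best_score = k, score

print(f"best k = {best_k}, silhouette = {best_score:.2f}")
```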

    Fear odor facilitates the detection of fear expressions over other negative expressions

    In a double-blind experiment, participants were exposed to facial images of anger, disgust, fear, and neutral expressions under 2 body odor conditions: fear and neutral sweat. They had to indicate the valence of the gradually emerging facial image. Two alternative hypotheses were tested, namely a "general negative evaluative state" hypothesis and a "discrete emotion" hypothesis. These hypotheses suggest 2 distinctive data patterns for muscle activation and classification speed of facial expressions. The pattern of results that would support a "discrete emotions perspective" would be expected to reveal significantly increased activity in the medial frontalis (eyebrow raiser) and corrugator supercilii (frown) muscles associated with fear, and significantly decreased reaction times (RTs) to "only" fear faces in the fear odor condition. Conversely, a pattern of results characterized by only a significantly increased corrugator supercilii activity together with decreased RTs for fear, disgust, and anger faces in the fear odor condition would support an interpretation in line with a general negative evaluative state perspective. The data support the discrete emotion account for facial affect perception primed with fear odor. This study provides a first demonstration of perception of discrete negative facial expressions using olfactory priming.
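    The two predicted reaction-time patterns can be made concrete with a small simulation: under the discrete-emotion account only fear faces speed up in the fear-odor condition, whereas under the general negative evaluative state account fear, anger, and disgust faces would all speed up. The data, column names, and per-expression t-tests below are purely illustrative, not the study's design, analysis, or results.

```python
import numpy as np
import pandas as pd
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)

# Simulate the discrete-emotion pattern: only fear faces are classified
# faster (here, 60 ms faster on average) under the fear-odor condition.
rows = []
for expression in ["fear", "anger", "disgust", "neutral"]:
    for odor in ["fear", "neutral"]:
        shift = -60 if (expression == "fear" and odor == "fear") else 0
        for rt in rng.normal(700 + shift, 80, size=40):
            rows.append({"expression": expression, "odor": odor, "rt": rt})
trials = pd.DataFrame(rows)

# Compare fear-odor vs neutral-odor reaction times separately per expression.
for expression, group in trials.groupby("expression"):
    t, p = ttest_ind(group.loc[group.odor == "fear", "rt"],
                     group.loc[group.odor == "neutral", "rt"])
    print(f"{expression:8s} t = {t:6.2f}  p = {p:.3f}")
```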