
    Estimates (<i>β</i> values, in seconds), standard errors, and <i>t</i> values for the fixed-effect predictors in the analyses for the face and body blocks.

    <p>The <i>p</i> values in the analyses were obtained with the lsmeans and pbkrtest R packages and are therefore reported separately in the text.</p>
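    <p>As a hedged illustration only: the table reports fixed-effect estimates, standard errors, and <i>t</i> values from mixed-effects analyses, with <i>p</i> values obtained via the R packages lsmeans and pbkrtest. The sketch below shows an analogous fixed-effects summary using Python's statsmodels MixedLM instead of the original R tooling; the file and variable names (suppression_times.csv, suppression_time, emotion, subject) are assumptions for illustration, not the authors' actual data.</p>
    <pre><code>
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per trial.
# Assumed columns: subject, emotion, suppression_time (seconds).
df = pd.read_csv("suppression_times.csv")

# Mixed-effects model: emotion as a fixed effect, random intercepts per subject.
# (The original analyses obtained p values with R's lsmeans/pbkrtest; this is an analogue.)
model = smf.mixedlm("suppression_time ~ emotion", data=df, groups=df["subject"])
result = model.fit()

# Fixed-effect estimates (beta, in seconds), standard errors, and t values,
# mirroring the quantities reported in the table.
summary = pd.DataFrame({"beta": result.params, "SE": result.bse, "t": result.tvalues})
print(summary)
    </code></pre>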

    Suppression time for neutral, fearful, and angry facial and bodily expressions after correcting for low-level features unrelated to emotion.

    <p>Error bars represent standard errors. *: <i>p</i> < .05; **: <i>p</i> < .01; ***: <i>p</i> < .001.</p>

    Examples of stimuli and the CFS-b procedure.

    <p><b>(A)</b> Examples of the face and body stimuli; neutral, fearful, and angry facial and bodily expressions were used. <b>(B)</b> The CFS-b procedure.</p>

    Parsing Science - Emotions and Rubber Hand Illusion

    Sometimes our emotions and the power of illusions can put our sense of reality to the test. In this special Halloween episode, <a href="http://www.beatricedegelder.com/" rel="noopener" target="_blank">Beatrice de Gelder</a> from <a href="https://www.maastrichtuniversity.nl/" rel="noopener" target="_blank">Maastricht University</a> in The Netherlands shares stories behind her study "<a href="http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0186009" rel="noopener" target="_blank">Affective vocalizations influence body ownership as measured in the rubber hand illusion</a>," which she coauthored with Tahnée Engelen, Rebecca Watson, and Francesco Pavani.

    Results of the face and body blocks.

    <p>Suppression time for neutral, fearful, and angry facial and bodily expressions. Error bars represent standard errors. *: <i>p</i> < .05; ***: <i>p</i> < .001.</p>

    Data_Sheet_1_Modality-specific brain representations during automatic processing of face, voice and body expressions.docx

    <p>A central question in affective science, and one that is relevant for its clinical applications, is how emotions conveyed by different stimuli are experienced and represented in the brain. According to the traditional view, emotional signals are recognized with the help of emotion concepts that are typically used in descriptions of mental states and emotional experiences, irrespective of the sensory modality. This perspective motivated the search for abstract representations of emotions in the brain, shared across variations in stimulus type (face, body, voice) and sensory origin (visual, auditory). On the other hand, emotion signals such as an aggressive gesture trigger rapid, automatic behavioral responses, and this may take place before, or independently of, a full abstract representation of the emotion. This argues in favor of specific emotion signals that may trigger rapid adaptive behavior by mobilizing only modality- and stimulus-specific brain representations, without relying on higher-order abstract emotion categories. To test this hypothesis, we presented participants with naturalistic dynamic emotion expressions of the face, the whole body, or the voice in a functional magnetic resonance imaging (fMRI) study. To focus on automatic emotion processing and sidestep explicit concept-based emotion recognition, participants performed an unrelated target detection task presented in a different sensory modality than the stimulus. Using multivariate analyses to assess neural activity patterns in response to the different stimulus types, we reveal a stimulus-category- and modality-specific brain organization of affective signals. Our findings are consistent with the notion that under ecological conditions emotion expressions of the face, body, and voice may have different functional roles in triggering rapid adaptive behavior, even if, when viewed from an abstract conceptual vantage point, they may all exemplify the same emotion. This has implications for a neuroethologically grounded emotion research program that should start from detailed behavioral observations of how face, body, and voice expressions function in naturalistic contexts.</p>
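    <p>As a minimal sketch only: the multivariate analyses mentioned above test whether stimulus category and modality can be decoded from distributed neural activity patterns. The Python/scikit-learn example below illustrates one common cross-validated decoding setup; the array names and files (voxel_patterns.npy, stimulus_labels.npy, run_labels.npy) and the leave-one-run-out scheme are assumptions for illustration, not the study's actual pipeline.</p>
    <pre><code>
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score, LeaveOneGroupOut

# Hypothetical inputs: one activity pattern per trial.
voxel_patterns = np.load("voxel_patterns.npy")    # shape (n_trials, n_voxels)
stimulus_labels = np.load("stimulus_labels.npy")  # e.g. face / body / voice
run_labels = np.load("run_labels.npy")            # fMRI run index per trial

# Linear classifier with feature scaling, cross-validated across runs
# (leave-one-run-out), a standard MVPA arrangement.
decoder = make_pipeline(StandardScaler(), LinearSVC())
scores = cross_val_score(decoder, voxel_patterns, stimulus_labels,
                         groups=run_labels, cv=LeaveOneGroupOut())

print("Mean decoding accuracy:", scores.mean())
    </code></pre>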

    Results of experiment 1.

    <p><i>Left</i>: Mean categorization performance plotted as a function of SOA latency, corrected for chance (50 percent). <i>Right</i>: Mean confidence ratings plotted as a function of SOA latency. Error bars represent standard error of the mean. SOA = Stimulus Onset Asynchrony; TO = Target Only.</p>

    Illustration of an example trial and example stimuli (experiment 1).

    <p>An example trial (<i>left</i>), examples of an angry and a happy bodily posture (<i>upper right</i>), and the mask (<i>lower right</i>).</p>

    Affective vocalizations influence body ownership as measured in the rubber hand illusion

    <div><p>Emotional signals, such as threatening sounds, automatically ready the perceiver to mount an appropriate defensive behavior. Conjecturing that this would manifest itself in an extension of the safety zone around the body, we used the rubber hand illusion (RHI) to test this prediction. The RHI is a perceptual illusion in which body ownership is manipulated by synchronously stroking a rubber hand and the participant's real hand, which is occluded from view. Many factors, both internal and external, have been shown to influence the strength of the illusion, yet the effect of emotion perception on body ownership remains unexplored. We predicted that listening to affective vocalizations would influence how strongly participants experience the RHI. In the first experiment, four groups were tested that listened to either affective sounds (angry or happy vocalizations), non-vocal sounds, or no sound while undergoing synchronous or asynchronous stroking of the real and rubber hands. In a second experiment, three groups were tested, comparing angry vocalizations, neutral vocalizations, and a no-sound condition. There was a significantly larger drift towards the rubber hand in the emotion conditions than in the no-emotion conditions. We interpret these results within the framework that the spatial increase in the RHI indicates that, under threat, the body has the capacity to extend its safety zone.</p></div>
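    <p>As an illustrative sketch only: the key dependent measure described above is proprioceptive drift towards the rubber hand, compared between emotion and no-emotion conditions. The Python example below shows one simple way such a comparison could be computed; the file and column names (rhi_drift.csv, condition, pre_judgment_cm, post_judgment_cm) and the independent-samples t test are assumptions for illustration, not the authors' actual variables or analysis.</p>
    <pre><code>
import pandas as pd
from scipy import stats

# Hypothetical per-participant data: pointing judgments (cm) before and after stroking.
df = pd.read_csv("rhi_drift.csv")

# Proprioceptive drift: shift of the felt hand position towards the rubber hand.
df["drift_cm"] = df["post_judgment_cm"] - df["pre_judgment_cm"]

# Simple between-groups comparison of drift (emotion vs. no emotion);
# the published analyses may well use different, more detailed models.
emotion = df.loc[df["condition"] == "emotion", "drift_cm"]
no_emotion = df.loc[df["condition"] == "no_emotion", "drift_cm"]
t_stat, p_value = stats.ttest_ind(emotion, no_emotion)

print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
    </code></pre>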