89 research outputs found

    Social transmission of leadership preference: knowledge of group membership and partisan media reporting moderates perceptions of leadership ability from facial cues to competence and dominance

    While first impressions of dominance and competence can influence leadership preference, the social transmission of leadership preference has received little attention. The capacity to transmit, store and compute information has increased greatly over recent history, and the new media environment may encourage partisanship (i.e. ‘echo chambers’), misinformation and rumour spreading in support of political and social causes, and may be conducive both to emotive writing and to emotional contagion, which may shape voting behaviour. In our pre-registered experiment, we examined whether implicit associations between facial cues to dominance and competence (intelligence) and leadership ability are strengthened by partisan media and by knowledge that leaders support or oppose us on a socio-political issue of personal importance. Social information, in general, reduced well-established implicit associations between facial cues and leadership ability. However, as predicted, social knowledge of group membership reduced preferences for facial cues to high dominance and intelligence in out-group leaders. In the opposite direction to our original prediction, this ‘in-group bias’ was greater under less partisan versus partisan media, with partisan writing eliciting greater state anxiety across the sample. Partisanship also altered the salience of women’s facial appearance (i.e., cues to high dominance and intelligence) in out-group versus in-group leaders. Independent of the media environment, men and women displayed an in-group bias toward facial cues of dominance in same-sex leaders. Our findings reveal effects of minimal social information (facial appearance, group membership, media reporting) on leadership judgements, which may have implications for patterns of voting or socio-political behaviour at the local or national level.

    Own attractiveness and perceived relationship quality shape sensitivity in women’s memory for other men on the attractiveness dimension

    Although recent work suggests that opposite-sex facial attractiveness is less salient in memory when individuals are in a committed romantic relationship, romantic relationship quality can vary over time. In light of this, we tested whether activating concerns about romantic relationship quality strengthens memory for attractive faces. Partnered women were briefly exposed to faces manipulated in shape cues to attractiveness before being asked to think about a moment of either emotional closeness or emotional distance in their current relationship. We measured sensitivity in memory for faces as the extent to which women recognized correct versions of studied faces over versions of the same person altered to look either more or less attractive than their original (i.e. studied) version. Contrary to predictions, high relationship quality strengthened the hit rate for faces regardless of the sex or attractiveness of the face. In general, women’s memories were more sensitive to attractiveness in women, but were biased toward attractiveness in male faces, both when responding to unfamiliar faces and to versions of familiar faces that were more attractive than the original male identity from the learning phase. However, findings varied according to self-rated attractiveness and a psychometric measure of current relationship quality. Attractive women were more sensitive to attractiveness in men, while their less-attractive peers had a stronger bias to remember women as more attractive, and men as less attractive, than their original image. Women in better-quality romantic relationships had stronger positive biases toward, and false memories for, attractive men. Our findings suggest a sophisticated pattern of sensitivity and bias in women’s memory for facial cues to quality that varies systematically according to factors that may alter the costs of female mating competition (‘market demand’) and relationship maintenance.
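The sensitivity and bias measures described above follow standard signal-detection logic: hits are correct recognitions of the original studied face, and false alarms are endorsements of attractiveness-altered versions. A minimal sketch of those two quantities (d′ for sensitivity, c for response bias); the function names and example rates are our own illustration, not values from the study:

```python
from statistics import NormalDist

_z = NormalDist().inv_cdf  # inverse of the standard normal CDF

def d_prime(hit_rate: float, false_alarm_rate: float) -> float:
    """Sensitivity (d'): how well originals are separated from altered lures."""
    return _z(hit_rate) - _z(false_alarm_rate)

def criterion(hit_rate: float, false_alarm_rate: float) -> float:
    """Response bias (c): negative values mean a liberal 'seen it' bias."""
    return -0.5 * (_z(hit_rate) + _z(false_alarm_rate))

# Hypothetical participant: recognizes 80% of original faces but also
# endorses 30% of attractiveness-altered versions of the same identities.
print(round(d_prime(0.80, 0.30), 2))    # 1.37
print(round(criterion(0.80, 0.30), 2))  # -0.16
```

In this framing, the reported "bias to remember women as more attractive" corresponds to asymmetric false-alarm rates for more- versus less-attractive altered versions rather than to overall d′.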

    Emotion-color associations in the context of the face

    Facial expressions of emotion contain important information that observers perceive and use to understand others’ emotional states. While there has been considerable research into perceptions of facial musculature and emotion, less work has examined perceptions of facial coloration and emotion. The current research examined emotion-color associations in the context of the face. Across four experiments, participants were asked to manipulate the color of face or shape stimuli along two color axes (i.e., red-green and yellow-blue) for six target emotions (i.e., anger, disgust, fear, happiness, sadness, surprise). The results yielded a pattern that is consistent with physiological and psychological models of emotion.
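The two axes named above (red-green and yellow-blue) correspond naturally to the a* and b* opponent dimensions of CIELAB, though the abstract does not specify which color space the experiments used; assuming CIELAB, a trial's adjustment can be sketched as a simple shift of those two coordinates (function name and example values are ours):

```python
def shift_color(lab, red_green=0.0, yellow_blue=0.0):
    """Shift a CIELAB color along the opponent axes.

    Positive red_green moves toward red (+a*), negative toward green;
    positive yellow_blue moves toward yellow (+b*), negative toward blue.
    Lightness (L*) is left untouched, isolating the hue manipulation.
    """
    L, a, b = lab
    return (L, a + red_green, b + yellow_blue)

# Hypothetical neutral skin tone nudged toward red, as a participant
# might do on an 'anger' trial.
print(shift_color((65.0, 18.0, 20.0), red_green=8.0))  # (65.0, 26.0, 20.0)
```

Participants' mean shifts per emotion along these two coordinates would then form the emotion-color association pattern the study reports.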

    Integration of visual and auditory information by superior temporal sulcus neurons responsive to the sight of actions

    Processing of complex visual stimuli comprising facial movements, hand actions, and body movements is known to occur in the superior temporal sulcus (STS) of humans and nonhuman primates. The STS is also thought to play a role in the integration of multimodal sensory input. We investigated whether STS neurons coding the sight of actions also integrate the sound of those actions. For 23% of neurons responsive to the sight of an action, the sound of that action significantly modulated the visual response. The sound of the action increased or decreased the visually evoked response for an equal number of neurons. In the neurons whose visual response was increased by the addition of sound (but not those whose responses were decreased), the audiovisual integration depended upon the sound of the action matching the sight of the action. These results suggest that neurons in the STS form multisensory representations of observed actions.
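The enhancement-versus-suppression contrast described above is often summarized per neuron with a normalized modulation index; the abstract reports significance tests rather than a specific index, so the formula below is a common convention in multisensory work, not necessarily the study's own measure:

```python
def modulation_index(visual_rate: float, audiovisual_rate: float) -> float:
    """Normalized contrast between audiovisual and visual-only firing rates.

    Positive values: the sound of the action enhances the visually evoked
    response; negative values: it suppresses it; 0: no modulation.
    """
    return (audiovisual_rate - visual_rate) / (audiovisual_rate + visual_rate)

# Hypothetical STS neuron: 20 spikes/s to the sight of an action alone,
# 30 spikes/s when the matching sound is added.
print(round(modulation_index(20.0, 30.0), 2))  # 0.2
```

An equal split of neurons with positive and negative indices, as reported, would leave the population-average index near zero even though 23% of neurons were individually modulated.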