757 research outputs found

    Variation in normal mood state influences sensitivity to dynamic changes in emotional expression

    Acknowledgements: We would like to thank Dr Douglas Martin for providing useful comments on an earlier draft. Thanks also to Kostadin Karavasilev for help with some of the data collection. Peer reviewed. Postprint

    Eye gaze influences working memory for happy but not angry faces

    Peer reviewed. Postprint

    Angry expressions strengthen the encoding and maintenance of face identity representations in visual working memory

    This work was funded by a BBSRC grant (BB/G021538/2) to all authors. Peer reviewed. Preprint

    Increased perceptual distraction and task demand enhances gaze and non-biological cuing effects.

    This study aims to improve understanding of how distracting information and target task demands influence the strength of gaze and non-biological (arrow and moving line) cuing effects. Using known non-predictive central cues, we manipulated the degree of distraction from additional information presented on the other side of the target, and target task difficulty. In Experiment 1, we used the traditional unilateral cuing task, where participants state the location of an asterisk and the non-target location is empty (no distraction). Experiment 2 comprised a harder localisation task (which side contains an embedded oddball item) and presented distracting target-related information on the other side. In Experiment 3, we used a discrimination task (upright or inverted embedded T) with distracter information that was unrelated or related to the target (low vs. high distraction, respectively). We found that the magnitude of cuing scaled with the degree of combined distraction and task demands, increasing up to six-fold from Experiments 1 and 2 to the high-distraction condition in Experiment 3. Thus, depleting attentional resources in this manner appears to weaken the ability to ignore uninformative directional cues. Findings are discussed within the framework of a resource-limited account of cue inhibition.

    Barriers block the effect of joint attention on working memory: Perspective taking matters

    Joint focus of attention between two individuals can influence the way that observers attend, encode, and value items. Using a nonpredictive gaze cuing task we previously found that working memory (WM) was better for jointly attended (validly cued) versus invalidly cued colored squares. Here we examine whether this influence of gaze on WM is driven by observers sharing the perspective of the face cue (mental state account), or simply by increased attention to the cued location (social attention account). To manipulate perspective taking, a closed barrier obstructed the cue face's view of the memoranda, while an open barrier allowed the cue face to "see" the colors. A central cue face flanked by two identical barriers looked left or right, followed 500 ms later by colored squares for encoding which appeared equally often in the validly and invalidly cued locations. After a blank 1000 ms maintenance interval, participants stated whether a probe color was present or not in the preceding display. When the barrier was open, WM was significantly impaired for invalidly versus validly cued items. When the barrier was closed, the effect of gaze cues on WM was abolished. In contrast, further experiments showed a significant cuing effect on the speed of simple target localization and color discrimination regardless of barrier type. These findings support the mental state account of joint attention in WM, whereby the attentional focus of another alters WM via higher level engagement with the second person perspective. A goal-specific model of perspective taking is proposed.

    Joint attention enhances visual working memory.

    Joint attention, the mutual focus of 2 individuals on an item, speeds detection and discrimination of target information. However, what happens to that information beyond the initial perceptual episode? To fully comprehend and engage with our immediate environment also requires working memory (WM), which integrates information from second to second to create a coherent and fluid picture of our world. Yet, no research exists at present that examines how joint attention directly impacts WM. To investigate this, we created a unique paradigm that combines gaze cues with a traditional visual WM task. A central, direct gaze 'cue' face looked left or right, followed 500 ms later by 4, 6, or 8 colored squares presented on one side of the face for encoding. Crucially, the cue face either looked at the squares (valid cue) or looked away from them (invalid cue). A no shift (direct gaze) condition served as a baseline. After a blank 1,000 ms maintenance interval, participants stated whether a single test square color was present or not in the preceding display. WM accuracy was significantly greater for colors encoded in the valid versus invalid and direct conditions. Further experiments showed that an arrow cue and a low-level motion cue (both shown to reliably orient attention) did not reliably modulate WM, indicating that social cues are more powerful. This study provides the first direct evidence that sharing the focus of another individual establishes a point of reference from which information is advantageously encoded into WM.

    Competition between emotional faces in visuospatial working memory.

    Visuospatial working memory (VSWM) helps track the identity and location of people during social interactions. Previous work showed better VSWM when all faces at encoding displayed a happy compared to an angry expression, reflecting a prosocial preference for monitoring who was where. However, social environments are not typically uniform, and certain expressions may more strongly compete for and bias face monitoring according to valence and/or arousal properties. Here, we used heterogeneous encoding displays in which two faces shared one emotion and two shared another, and asked participants to relocate a central neutral probe face after a blank delay. When considering the emotion of the probed face independently of the co-occurring emotion at encoding, an overall happy benefit was replicated. However, accuracy was modulated by the nonprobed emotion, with a relocation benefit for angry over sad, happy over fearful, and sad over happy faces. These effects did not depend on encoding fixation time, stimulus arousal, perceptual similarity, or response bias. Thus, emotional competition for faces in VSWM is complex and appears to rely on more than simple arousal- or valence-biased mechanisms. We propose a "social value (SV)" account to better explain when and why certain emotions may be prioritized in VSWM.

    Task cues lead to item-level backward inhibition with univalent stimuli and responses

    Funding: This research received no specific grant from any funding agency in the public, commercial, or not-for-profit sectors. Peer reviewed. Postprint