
    Negative emotional stimuli reduce contextual cueing but not response times in inefficient search

    In visual search, previous work has shown that negative stimuli narrow the focus of attention and speed reaction times (RTs). This paper investigates these two effects by first asking whether negative emotional stimuli narrow the focus of attention to reduce the learning of display context in a contextual cueing task and, second, whether exposure to negative stimuli also reduces RTs in inefficient search tasks. In Experiment 1, participants viewed either negative or neutral images (faces or scenes) prior to a contextual cueing task. In a typical contextual cueing experiment, RTs are reduced for displays that are repeated across the experiment compared with novel displays that are not repeated. The results showed that a smaller contextual cueing effect was obtained after participants viewed negative stimuli than after they viewed neutral stimuli. However, in contrast to previous work, overall search RTs were not faster after viewing negative stimuli (Experiments 2 to 4). The findings are discussed in terms of the impact of emotional content on visual processing and the ability to use scene context to facilitate search.

    Predicting the valence of a scene from observers’ eye movements

    Multimedia analysis benefits from understanding the emotional content of a scene in a variety of tasks, such as video genre classification and content-based image retrieval. Recently, there has been increasing interest in applying human bio-signals, particularly eye movements, to recognize the emotional gist of a scene, such as its valence. To determine the emotional category of images from eye movements, existing methods typically learn a classifier using several features extracted from eye movements. Although eye movements have been shown to be potentially useful for recognizing scene valence, the contribution of each feature is not well studied. To address this issue, we study the contribution of features extracted from eye movements to the classification of images into pleasant, neutral, and unpleasant categories. We assess ten features and their fusion. The features include the histogram of saccade orientation, histogram of saccade slope, histogram of saccade length, histogram of saccade duration, histogram of saccade velocity, histogram of fixation duration, fixation histogram, top-ten salient coordinates, and saliency map. We use a machine learning approach to analyze the performance of the features by learning a support vector machine and exploiting various feature fusion schemes. The experiments reveal that 'saliency map', 'fixation histogram', 'histogram of fixation duration', and 'histogram of saccade slope' are the most contributing features. The selected features signify the influence of fixation information and the angular behavior of eye movements in recognizing the valence of images.
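    The pipeline this abstract describes — per-image feature vectors derived from eye movements, fused by concatenation and classified with a support vector machine — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature names, dimensions, and synthetic data are placeholders, and the fusion scheme shown is simple early fusion (concatenation).

```python
# Hypothetical sketch: classify image valence (pleasant / neutral / unpleasant)
# from eye-movement feature histograms using an SVM, as the abstract describes.
# All data here is synthetic; feature block sizes are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_images = 90

# Stand-in feature blocks, one row per viewed image (e.g. histogram of
# saccade slope, histogram of fixation duration, flattened saliency map).
saccade_slope_hist = rng.random((n_images, 8))
fixation_duration_hist = rng.random((n_images, 8))
saliency_map = rng.random((n_images, 64))

# Early fusion: concatenate the per-feature vectors into one descriptor.
X = np.hstack([saccade_slope_hist, fixation_duration_hist, saliency_map])
y = rng.integers(0, 3, size=n_images)  # 0=pleasant, 1=neutral, 2=unpleasant

# Learn an SVM and estimate accuracy with cross-validation.
clf = SVC(kernel="rbf", C=1.0)
scores = cross_val_score(clf, X, y, cv=5)
print(round(scores.mean(), 3))
```

    Comparing cross-validated accuracy while adding or removing feature blocks from the concatenation is one straightforward way to measure each feature's contribution, which is the kind of analysis the abstract reports.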

    Listening to music reduces eye movements

    Listening to music can change the way that people visually experience the environment, probably as a result of an inwardly directed shift of attention. We investigated whether this attentional shift can be demonstrated by reduced eye movement activity and, if so, whether that reduction depends on absorption. Participants listened to their preferred music, to unknown neutral music, or to no music while viewing a visual stimulus (a picture or a film clip). Preference and absorption were significantly higher for the preferred music than for the unknown music. Participants exhibited longer fixations, fewer saccades, and more blinks when they listened to music than when they sat in silence. However, no differences emerged between the preferred music condition and the neutral music condition. Thus, music significantly reduces eye movement activity, but an attentional shift from the outer to the inner world (i.e., to the emotions and memories evoked by the music) emerged as only one potential explanation. Other explanations, such as a shift of attention from visual to auditory input, are discussed.

    The Neural Mechanism of Working Memory Training Improving Emotion Regulation

    Thirty-six patients with high anxiety were recruited. The participants were divided, voluntarily and at random, into a working memory training group and a control group, with 18 individuals in each. The training group completed 21 days of working memory training, while the control group received no training. Subjective emotion ratings and the ERP indicator late positive potential (LPP) were recorded for both groups under three experimental conditions (watching negative images, cognitive reappraisal, and attentional distraction). The reduction in LPP amplitude was significantly greater for the training group than for the control group, specifically in the cognitive reappraisal condition. This study showed that working memory training can improve the ability to use cognitive reappraisal and can be a potential intervention for promoting emotion regulation in individuals with high trait anxiety.

    From distraction to mindfulness: Psychological and neural mechanisms of attention strategies in self-regulation

    The current chapter examines attention strategies that may facilitate self-regulation. In particular, we focus on the attention strategies of distraction and mindfulness. By distraction, we mean shifting attention from the original object of attention onto a different focal object. Mindfulness, on the other hand, implies regulating the focus and the quality of one's attention. This means paying attention to the focal object while at the same time observing one's own thoughts and experiences and seeing them as mere mental events. We discuss evidence that distraction and mindfulness modulate the impact of affective information on thoughts, feelings, and behavior. Although the two strategies are seemingly opposite in nature, we have found that both distraction and mindfulness can undermine intrusive thinking patterns in response to affective information that normally result in more impulsive behavior. We show how the effectiveness of these strategies is reflected not only in behavioral measures of self-regulation success but in neurophysiological indices as well. Distraction seems to involve increased engagement of prefrontal brain regions for task-related processing, whereas mindfulness training may affect the connectivity between control and affective brain regions. More broadly, the present chapter shows that combining behavioral and neuroscience measures can be a particularly fruitful approach to understanding how attention strategies impact self-regulation.