
    Exploring the Impact of Affective Processing on Visual Perception of Large-Scale Spatial Environments

    This thesis explores the interaction between emotion and visual perception, using large-scale spatial environments as the medium of that interaction. Emotion has been documented to have an early effect on scene perception (Olofsson, Nordin, Sequeira, & Polich, 2008). Yet the most popular scene stimuli, such as the IAPS and GAPED stimulus sets, often depict salient objects embedded in naturalistic backgrounds, or “events” that contain rich social information such as human faces or bodies. Thus, while previous studies are instrumental to our understanding of the role that social emotion plays in visual perception, they do not isolate the effect of emotion from social effects, and so cannot address the specific role that emotion plays in scene recognition, defined here as the recognition of large-scale spatial environments. To address this question, we examined how early emotional valence and arousal impact scene processing by conducting an Event-Related Potential (ERP) study with a well-controlled set of scene stimuli that reduced the social factor, focusing on natural scenes containing no human faces or actors. The study comprised two stages. First, we collected affective ratings of 440 natural scene images selected specifically so that they would not contain human faces or bodies. Based on these ratings, we divided the scene stimuli into three categories: pleasant, unpleasant, and neutral. In the second stage, we recorded ERPs from a separate group of participants as they viewed a subset of 270 scenes ranked highest in their respective categories. Scenes were presented for 200 ms and backward-masked with white noise while participants performed an orthogonal fixation task. We found that emotional valence had a significant impact on scene perception, with unpleasant scenes eliciting higher P1, N1, and P2 peaks. However, when we compared the relative contributions of the emotional effects and of low-level visual features (operationalized by the GIST model; Oliva & Torralba, 2006) using dominance analysis, which ranks the relative importance of predictors in multiple regression, we found that the low-level visual features had complete dominance over the emotional effects (both valence and arousal) for most early peaks and areas under the curve (AUC). We also found that affective ratings were significantly influenced by the GIST intensities of the scenes: scenes with high GIST intensities were more likely to be rated as unpleasant. We conclude that the emotional impact in our stimulus set of natural scenes was mostly due to a bottom-up effect on scene perception, and that controlling for low-level visual features (particularly GIST intensity) is an important step in confirming the affective impact on scene perception.
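
    The dominance analysis mentioned above can be sketched in a few lines: a predictor's importance is its incremental R² over every subset of the other predictors, and one predictor completely dominates another when its incremental R² is at least as large for every such subset. Below is a minimal Python illustration with placeholder data standing in for the GIST, valence, and arousal predictors; it is a sketch, not the authors' code.

```python
from itertools import combinations

import numpy as np
from sklearn.linear_model import LinearRegression

def r_squared(X, y):
    """R^2 of an OLS fit; 0.0 for the empty predictor set."""
    if X.shape[1] == 0:
        return 0.0
    return LinearRegression().fit(X, y).score(X, y)

def incremental_r2(X, y, target):
    """Incremental R^2 of one predictor over every subset of the others."""
    others = [i for i in range(X.shape[1]) if i != target]
    gains = {}
    for k in range(len(others) + 1):
        for subset in combinations(others, k):
            cols = list(subset)
            gains[subset] = r_squared(X[:, cols + [target]], y) - r_squared(X[:, cols], y)
    return gains

def completely_dominates(X, y, j, i):
    """True if predictor j's incremental R^2 >= predictor i's in every shared submodel."""
    gains_j, gains_i = incremental_r2(X, y, j), incremental_r2(X, y, i)
    shared = [s for s in gains_j if i not in s]
    return all(gains_j[s] >= gains_i[s] for s in shared)

# Toy demo: a strong "GIST-like" predictor and two weak "affect" predictors.
rng = np.random.default_rng(0)
n = 200
gist, valence, arousal = (rng.normal(size=(n, 1)) for _ in range(3))
X = np.hstack([gist, valence, arousal])
y = (2.0 * gist + 0.1 * valence + rng.normal(size=(n, 1))).ravel()
print(completely_dominates(X, y, 0, 1))  # does the GIST column dominate valence?
```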

    Looking Beyond a Clever Narrative: Visual Context and Attention are Primary Drivers of Affect in Video Advertisements

    Emotion evoked by an advertisement plays a key role in influencing brand recall and eventual consumer choices, and automatic ad affect recognition has several useful applications. However, content-based feature representations give no insight into how affect is modulated by aspects such as the ad's scene setting, salient object attributes, and their interactions; nor do such approaches tell us how humans prioritize visual information for ad understanding. Our work addresses these lacunae by decomposing video content into detected objects, coarse scene structure, object statistics, and actively attended objects identified via eye gaze. We measure the importance of each of these information channels by systematically incorporating the related information into ad affect prediction models. Contrary to the popular notion that ad affect hinges on the narrative and the clever use of linguistic and social cues, we find that actively attended objects and the coarse scene structure encode affective information better than individual scene objects or conspicuous background elements.

    Comment: Accepted for publication in the Proceedings of the 20th ACM International Conference on Multimodal Interaction, Boulder, CO, US
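
    One simple way to realize the channel-importance measurement described above is an incremental ablation: train an affect classifier on each information channel and compare cross-validated performance against a baseline. The sketch below uses random placeholder features and invented channel names, not the paper's actual pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def channel_gains(channels, y, base=None):
    """Cross-validated accuracy gain from adding each channel to a base model.

    channels: dict of channel name -> (n_samples, n_features) array
    y:        binary affect labels (e.g. high vs. low arousal)
    base:     optional baseline feature array, or None for no baseline
    """
    def score(X):
        clf = LogisticRegression(max_iter=1000)
        return cross_val_score(clf, X, y, cv=5).mean()

    baseline = score(base) if base is not None else 0.5  # chance level for binary labels
    return {
        name: score(feats if base is None else np.hstack([base, feats])) - baseline
        for name, feats in channels.items()
    }

# Placeholder data: 120 ads, four invented information channels.
rng = np.random.default_rng(0)
n = 120
y = rng.integers(0, 2, n)
channels = {
    "detected_objects":  rng.normal(size=(n, 50)),
    "scene_structure":   rng.normal(size=(n, 20)),
    "object_statistics": rng.normal(size=(n, 10)),
    "attended_objects":  rng.normal(size=(n, 50)),
}
print(channel_gains(channels, y))
```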

    Recognition of facial expressions is influenced by emotional scene gist


    Social context influences recognition of bodily expressions

    Previous studies have shown that recognition of facial expressions is influenced by the affective information provided by the surrounding scene. The goal of this study was to investigate whether similar effects hold for bodily expressions. Images of emotional body postures were briefly presented as part of social scenes showing either neutral or emotional group actions. In Experiment 1, fearful and happy bodies were presented in fearful, happy, neutral, and scrambled contexts. In Experiment 2, we compared happy with angry body expressions. In Experiments 3 and 4, we blurred the facial expressions of all people in the scene; this allowed us to ascribe possible scene effects to the body expressions visible in the scene and to measure the contribution of facial expressions to body expression recognition. In all experiments, we observed an effect of social scene context: bodily expressions were better recognized when the actions in the scene expressed an emotion congruent with the bodily expression of the target figure. The specific influence of facial expressions in the scene depended on the emotional expression but did not necessarily increase the congruency effect. Taken together, the results show that social context influences our recognition of a person's bodily expression.
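
    The congruency effect reported here amounts to a paired comparison of per-participant recognition accuracy in congruent versus incongruent scenes. A minimal analysis sketch with simulated accuracies (the real data are not reproduced here):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_participants = 24

# Simulated per-participant recognition accuracies (congruent set slightly higher).
acc_congruent = np.clip(rng.normal(0.80, 0.08, n_participants), 0, 1)
acc_incongruent = np.clip(rng.normal(0.72, 0.08, n_participants), 0, 1)

t, p = stats.ttest_rel(acc_congruent, acc_incongruent)
effect = (acc_congruent - acc_incongruent).mean()
print(f"congruency effect = {effect:.3f}, t({n_participants - 1}) = {t:.2f}, p = {p:.4f}")
```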

    A Review on EEG Signals Based Emotion Recognition

    Emotion recognition has become a prominent topic in brain-computer interfaces (BCIs), and numerous studies have been conducted on recognizing emotions; there are also several important definitions of, and theories about, human emotion. In this paper we cover the main topics in the field of emotion recognition, reviewing studies that analyze electroencephalogram (EEG) signals as a biological marker of emotional change. Given its low cost and good temporal resolution, EEG has become very common and is widely used in BCI applications and studies. First, we state some theories and basic definitions related to emotion. Then we describe the important steps of an emotion recognition system: the different kinds of biological measurements (EEG, electrocardiogram (ECG), respiration rate, etc.), offline vs. online recognition methods, emotion stimulation types, and common emotion models. Finally, the most important recent studies are reviewed.
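
    The steps listed above can be made concrete with a minimal EEG emotion-classification pipeline: estimate per-channel power in standard frequency bands for each trial, then classify. The sampling rate, band definitions, labels, and data below are assumptions for illustration, not any specific reviewed system.

```python
import numpy as np
from scipy.signal import welch
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

FS = 256  # assumed sampling rate (Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def band_power_features(trial):
    """Mean spectral power per frequency band and channel (trial: channels x samples)."""
    freqs, psd = welch(trial, fs=FS, nperseg=FS)
    return np.concatenate(
        [psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=1) for lo, hi in BANDS.values()]
    )

# Placeholder data: 60 trials of 8-channel, 4-second EEG with random labels.
rng = np.random.default_rng(2)
trials = rng.normal(size=(60, 8, 4 * FS))
labels = rng.integers(0, 2, 60)  # e.g. low vs. high valence

X = np.array([band_power_features(t) for t in trials])
print("cross-validated accuracy:", cross_val_score(SVC(), X, labels, cv=5).mean())
```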

    The role of contexts in face processing: Behavioral and ERP studies


    Seeing the bigger picture: visual imagination and the social brain

    I studied multi-modal aspects of visual imagination in relation to visual art and complex images, defining ‘visual imagination’ broadly as a dynamic of complex psychological processes that integrate visual information with prior experiences and knowledge to construct internal models of oneself, others, and the outside world. This reflects the ultimate aim of my work: to develop engaging cultural and clinical resources that strengthen social brain networks, tailored to personal interests, age, and cognitive health. I pursued two interrelated research programmes, based primarily at the Wellcome Collection as part of my interdisciplinary residency with Created Out of Mind, using complementary neuroscientific and visual research methods to probe relationships between visual imagination and the social brain in neurologically healthy adults and in people living with various forms of dementia. The Social Brain Atlas and connectome (Alcalá López et al., Cerebral Cortex 2017) was recently computed from 3,972 functional neuroimaging studies in 22,712 healthy adults. To contextualise my research in the social brain, I first translated the social brain connectome into functional infographics (relational spatial representations) of the four hierarchical processing levels of the Social Brain Atlas, and generated visual imagination brain profiles of healthy adults and of canonical dementia syndromes; I used these to generate hypotheses and to guide analysis of my neuroscientific experiments. I recruited three participant cohorts: 17 neurologically healthy adults aged 20-30 years; 20 neurologically healthy adults aged 50+ years; and 11 senior adults living with various forms of dementia. These participants took part in five neuroscientific experiments of my own design, in which I used advanced technologies to capture physiological responses, and both established and novel visual research methods to study neuropsychological responses to visual art, complex imagery, and colour experiences. I employed an arts-based facilitated-conversation methodology, Visual Thinking Strategies (VTS), and developed novel quantitative methods to analyse recorded eye-tracking data, electrodermal activity, and speech samples. I used both parametric and non-parametric statistical methods to compare participant cohorts. In parallel with the neuroscientific research, I developed a series of art experiments at the UCL Institute of Making and at my studio at the Limehouse Art Foundation, East London. My artistic research complemented my neuroscientific work by emphasising individual experience over generic perceptual mechanisms: by creating space for personal interactions with art, the research becomes contextualised in the social world. The artistic research resulted in a public exhibition of optical instruments, visual artworks, and installations that expanded on the two neuroscientific research projects, complementing the written thesis with the embodied language of visual art. Visitors could freely explore the perceptual effects of the optical instruments and were invited to reflect on the visual artworks using the Visual Thinking Strategies method.
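
    As one example of the kind of quantitative eye-tracking analysis mentioned above, total dwell time per region of interest (ROI) can be computed from a stream of fixations. The ROIs and fixation records below are invented for illustration; the thesis' actual methods are not reproduced here.

```python
from dataclasses import dataclass

@dataclass
class Fixation:
    x: float          # gaze position in screen pixels
    y: float
    duration_ms: float

# ROIs as (name, x_min, y_min, x_max, y_max) in screen pixels.
ROIS = [("face", 400, 100, 600, 300), ("hands", 350, 450, 650, 600)]

def dwell_times(fixations):
    """Sum fixation durations falling inside each ROI."""
    totals = {name: 0.0 for name, *_ in ROIS}
    for f in fixations:
        for name, x0, y0, x1, y1 in ROIS:
            if x0 <= f.x <= x1 and y0 <= f.y <= y1:
                totals[name] += f.duration_ms
    return totals

fixations = [Fixation(450, 200, 180), Fixation(500, 500, 240), Fixation(100, 100, 90)]
print(dwell_times(fixations))  # {'face': 180.0, 'hands': 240.0}
```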