
    Affect Recognition in Ads with Application to Computational Advertising

    Advertisements (ads) often include strongly emotional content to leave a lasting impression on the viewer. This work (i) compiles an affective ad dataset capable of evoking coherent emotions across users, as determined from the affective opinions of five experts and 14 annotators; (ii) explores the efficacy of convolutional neural network (CNN) features for encoding emotions, and observes that CNN features outperform low-level audio-visual emotion descriptors upon extensive experimentation; and (iii) demonstrates how enhanced affect prediction facilitates computational advertising, and leads to better viewing experience while watching an online video stream embedded with ads based on a study involving 17 users. We model ad emotions based on subjective human opinions as well as objective multimodal features, and show how effectively modeling ad emotions can positively impact a real-life application. Comment: Accepted at the ACM International Conference on Multimedia (ACM MM) 201
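    A minimal sketch of the content-centric idea in this abstract: pool pretrained CNN features over an ad's sampled frames and score a simple classifier on per-ad emotion labels. The ResNet-18 backbone, average pooling over frames, and the linear SVM are illustrative assumptions, not the authors' exact pipeline.

import numpy as np
import torch
import torchvision.models as models
import torchvision.transforms as T
from sklearn.model_selection import cross_val_score
from sklearn.svm import LinearSVC

# Frozen ImageNet-pretrained backbone used purely as a feature extractor.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()   # expose the 512-d penultimate features
backbone.eval()

preprocess = T.Compose([
    T.ToPILImage(), T.Resize(256), T.CenterCrop(224), T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def ad_descriptor(frames):
    # frames: list of H x W x 3 uint8 arrays sampled from one ad.
    with torch.no_grad():
        feats = torch.stack([backbone(preprocess(f).unsqueeze(0)).squeeze(0)
                             for f in frames])
    return feats.mean(dim=0).numpy()   # average-pool over time

def evaluate_cnn_features(ads, labels):
    # ads: list of frame lists; labels: e.g. high/low arousal per ad.
    X = np.stack([ad_descriptor(frames) for frames in ads])
    return cross_val_score(LinearSVC(), X, np.asarray(labels), cv=5).mean()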

    Meditation Experiences, Self, and Boundaries of Consciousness

    Our experiences with the external world are possible mainly through vision, hearing, taste, touch, and smell, which provide us with a sense of reality. How the brain is able to seamlessly integrate stimuli from our external and internal world into our sense of reality has yet to be adequately explained in the literature. We have previously proposed a three-dimensional unified model of consciousness that partly explains this dynamic mechanism. Here we further expand our model and include illustrations to provide a better conception of the ill-defined space within the self, providing insight into a unified mind-body concept. In this article, we propose that our senses “super-impose” on an existing dynamic space within us after a slight, imperceptible delay. This existing space includes the entire intrapersonal space and can also be called “the body’s internal 3D default space”. We provide examples from meditation experiences to help explain how the sense of ‘self’ can be experienced through meditation practice, associated with underlying physiological processes that take place through cardio-respiratory synchronization and coherence developed among areas of the brain. Meditation practice can help keep the body in a parasympathetic-dominant state during meditation, allowing an experience of the inner ‘self’. Understanding this physical and functional space could help unlock the mysteries of the function of memory and cognition, allowing clinicians to better recognize and treat disorders of the mind by recommending proven stress-reduction techniques as an adjunct to medication treatment.

    Evaluating Content-centric vs User-centric Ad Affect Recognition

    Despite the fact that advertisements (ads) often include strongly emotional content, very little work has been devoted to affect recognition (AR) from ads. This work explicitly compares content-centric and user-centric ad AR methodologies, and evaluates the impact of enhanced AR on computational advertising via a user study. Specifically, we (1) compile an affective ad dataset capable of evoking coherent emotions across users; (2) explore the efficacy of content-centric convolutional neural network (CNN) features for encoding emotions, and show that CNN features outperform low-level emotion descriptors; (3) examine user-centric ad AR by analyzing electroencephalogram (EEG) responses acquired from eleven viewers, and find that EEG signals encode emotional information better than content descriptors; (4) investigate the relationship between objective AR and subjective viewer experience while watching an ad-embedded online video stream based on a study involving 12 users. To our knowledge, this is the first work to (a) expressly compare user- vs content-centered AR for ads, and (b) study the relationship between modeling of ad emotions and its impact on a real-life advertising application. Comment: Accepted at the ACM International Conference on Multimodal Interaction (ICMI) 201
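    A rough sketch of the user-centric side described above: per-trial EEG band-power features and a simple classifier for high/low valence or arousal. The band definitions, Welch parameters, and the RBF SVM are assumptions for illustration; the paper's exact EEG pipeline may differ.

import numpy as np
from scipy.signal import welch
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def bandpower_features(epoch, fs=128):
    # epoch: (n_channels, n_samples) EEG recorded while one ad was viewed.
    freqs, psd = welch(epoch, fs=fs, nperseg=2 * fs, axis=-1)
    feats = []
    for lo, hi in BANDS.values():
        band = (freqs >= lo) & (freqs < hi)
        feats.append(np.log(psd[:, band].mean(axis=-1) + 1e-12))
    return np.concatenate(feats)   # n_channels * n_bands log band powers

def user_centric_ar(epochs, labels):
    # epochs: list of per-trial EEG arrays; labels: high/low valence or arousal.
    X = np.stack([bandpower_features(e) for e in epochs])
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    return cross_val_score(clf, X, np.asarray(labels), cv=5).mean()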

    Media Presence and Inner Presence: The Sense of Presence in Virtual Reality Technologies

    Presence is widely accepted as the key concept to be considered in any research involving human interaction with Virtual Reality (VR). Since its original description, the concept of presence has developed over the past decade to be considered by many researchers as the essence of any experience in a virtual environment. VR-generating systems comprise two main parts: a technological component and a psychological experience. The different relevance given to each produced two different but coexisting visions of presence: the rationalist and the psychological/ecological points of view. The rationalist point of view considers a VR system as a collection of specific machines that necessitates the inclusion of the concept of presence. The researchers agreeing with this approach describe the sense of presence as a function of the experience of a given medium (Media Presence). The main result of this approach is the definition of presence as the perceptual illusion of non-mediation, produced by the disappearance of the medium from the conscious attention of the subject. At the other extreme, there is the psychological or ecological perspective (Inner Presence). Specifically, this perspective considers presence as a neuropsychological phenomenon, evolved from the interplay of our biological and cultural inheritance, whose goal is the control of human activity. Given its key role and the rate at which new approaches to understanding and examining presence are appearing, this chapter draws together current research on presence to provide an up-to-date overview of the most widely accepted approaches to its understanding and measurement.

    Biometric responses to music-rich segments in films: the CDVPlex

    Summarising or generating trailers for films or movies involves finding the highlights within those films: the segments where we become most afraid, happy, sad, annoyed, excited, etc. In this paper we explore three questions related to the automatic detection of film highlights by measuring the physiological responses of viewers of those films: firstly, whether emotional highlights can be detected through viewer biometrics; secondly, whether individuals watching a film in a group experience emotional reactions similar to those of others in the group; and thirdly, whether the presence of music in a film correlates with the occurrence of emotional highlights. We analyse the results of an experiment known as the CDVPlex, where we monitored and recorded physiological reactions from people as they viewed films in a controlled cinema-like environment. A selection of films was manually annotated for the locations of their emotive contents. We then studied the physiological peaks identified among participants while viewing the same film and how these correlated with emotion tags and with music. We conclude that these are highly correlated and that music-rich segments of a film do act as a catalyst in stimulating viewer response, though we do not know exactly which emotions the viewers were experiencing. The results of this work could impact the way in which we index movie content on PVRs, for example, attaching special significance to the movie segments that are most likely to be highlights.
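    A minimal sketch of the analysis idea behind this study: detect arousal peaks in a viewer's physiological trace and measure how often they fall inside manually annotated music-rich or emotive segments. The choice of skin conductance as the signal, the smoothing window, and the peak-prominence threshold are assumptions for illustration only.

import numpy as np
from scipy.ndimage import uniform_filter1d
from scipy.signal import find_peaks

def peak_times(gsr, fs=32, smooth_s=5.0, prominence=0.05):
    # Smooth a 1-D galvanic skin response trace and return peak times in seconds.
    smoothed = uniform_filter1d(np.asarray(gsr, dtype=float), size=int(smooth_s * fs))
    peaks, _ = find_peaks(smoothed, prominence=prominence)
    return peaks / fs

def fraction_in_segments(times, segments):
    # segments: list of (start_s, end_s) spans annotated as music-rich/emotive.
    if len(times) == 0:
        return 0.0
    hits = sum(any(s <= t <= e for s, e in segments) for t in times)
    return hits / len(times)

# Comparing this hit rate with the fraction of the film covered by the segments
# indicates whether physiological peaks co-occur with music-rich content more
# often than chance.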

    Limits of Kansei – Kansei unlimited

    This article discusses current limitations of Kansei Engineering methods, for example the focus on evaluating colour and form factors, as well as the highly time-consuming creation of the questionnaires. To overcome these limits, we first suggest integrating word lists on product emotions from related research fields, such as sociology and cognitive psychology, into the Kansei questionnaires. Thereafter we present a study on the wide range of Kansei attributes treated in an industrial setting: concept words used by designers were collected through word maps and categorized into attributes. In a third step we introduce a user-product interaction schema in which the Kansei attributes from the study are positioned. This schema reveals potential expansion points for future applications of Kansei Engineering beyond its current limits.
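    A small sketch of the word-categorization step described above: collected concept words are mapped onto attribute categories and turned into questionnaire items. The category names, word lists, and rating-scale format are hypothetical; real studies would derive them from the designers' word maps and the cited literature.

# Assumed attribute categories and word lists (hypothetical).
ATTRIBUTE_WORDS = {
    "form": {"slim", "rounded", "compact", "angular"},
    "colour": {"vivid", "muted", "warm", "cold"},
    "emotion": {"joyful", "calming", "exciting", "trustworthy"},
    "interaction": {"intuitive", "responsive", "effortless"},
}

def categorize(concept_words):
    # Map each collected concept word to the attribute categories it matches.
    return {word: [attr for attr, words in ATTRIBUTE_WORDS.items()
                   if word.lower() in words]
            for word in concept_words}

def questionnaire_items(concept_words, scale=5):
    # Build simple rating items for words that matched at least one category.
    return [f"{word}: rate 1-{scale}"
            for word, attrs in categorize(concept_words).items() if attrs]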

    Emotional Brain-Computer Interfaces

    Research in brain-computer interfaces (BCIs) has significantly increased during the last few years. In addition to their initial role as assistive devices for the physically challenged, BCIs are now proposed for a wider range of applications. As in any HCI application, BCIs can also benefit from adapting their operation to the emotional state of the user. BCIs have the advantage of having access to brain activity, which can provide significant insight into the user's emotional state. This information can be utilized in two ways. 1) Knowledge of the influence of the emotional state on brain activity patterns can allow the BCI to adapt its recognition algorithms, so that the intention of the user is still correctly interpreted in spite of signal deviations induced by the subject's emotional state. 2) The ability to recognize emotions can be used in BCIs to provide the user with more natural ways of controlling the BCI through affective modulation. Thus, controlling a BCI by recollecting a pleasant memory can be possible and can potentially lead to higher information transfer rates. These two approaches to emotion utilization in BCI are elaborated in detail in this paper in the framework of noninvasive EEG-based BCIs.
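    A minimal sketch of the first approach mentioned above: keep a separate intent decoder per coarse emotional state, so recognition can adapt when the user's state shifts. The two-state split, LDA decoders, and feature layout are illustrative assumptions, not a prescribed design.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis as LDA

class EmotionAdaptiveBCI:
    def __init__(self, states=("calm", "aroused")):
        # One intent decoder per coarse emotional state.
        self.decoders = {s: LDA() for s in states}

    def fit(self, features, intents, emotion_labels):
        # features: (n_trials, n_features); intents: command label per trial;
        # emotion_labels: recognized emotional state per trial.
        features = np.asarray(features)
        intents = np.asarray(intents)
        emotion_labels = np.asarray(emotion_labels)
        for state, clf in self.decoders.items():
            mask = emotion_labels == state
            if mask.any():
                clf.fit(features[mask], intents[mask])

    def predict(self, feature_vec, current_state):
        # Decode intent with the model matched to the currently recognized emotion.
        return self.decoders[current_state].predict(np.asarray(feature_vec)[None, :])[0]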