4 research outputs found
Exploring Emotions and Engagement: A Multi-componential Analysis Using Films and Virtual Reality
In the digital age, where our lives are intertwined with intelligent systems and immersive experiences, understanding how emotions are shaped and influenced is more crucial than ever. Despite the attention given to discrete and dimensional models, neuroscientific evidence indicates that emotions are complex and multi-faceted. The Component Process Model (CPM) acknowledges this complexity through five interconnected components: appraisal, motivation, physiology, expression, and feeling. Yet it has received limited attention in Affective Computing.
Despite some recent advances, full-CPM research has limitations. Its relatively narrow adoption has left a scarcity of datasets available for in-depth exploration. Most of these datasets are film-based, only one uses Virtual Reality (VR), and all have received limited computational analysis, especially exploratory and Machine Learning analysis. Passive film-based emotion induction has merits but also limitations, as it positions participants as observers. Active VR stimuli can enhance emotion elicitation through their immersive nature, but current CPM VR analyses rely on subjective reports alone. VR is often described as an "empathy machine" in cutting-edge emotion research, yet the attributes that underlie this capacity, such as engagement, have received limited attention.
This thesis aims to understand emotions through the full CPM with computational models. It starts by analysing a film-based dataset that contains both subjective and objective measures, showing the role of physiology in emotion discrimination. We then underscore the significance of micro-level annotations using another film-based dataset with a larger set of continuous subjective annotations. The thesis also introduces a data-driven approach using interactive VR games, with multimodal measures (self-reports, physiological signals, facial expressions, and movements) collected from 39 participants. The new dataset shows the role of the different components in emotion differentiation when emotions are induced actively. Finally, the thesis presents an innovative approach to measuring engagement in VR games: we examine the simultaneous occurrence of player motivation and physiological responses to explore potential associations with body movements.
Our explorations into emotions and engagement within a multi-componential framework, utilising both films and VR games, present numerous opportunities for advancing our understanding of human behaviour and interactions to foster a more empathetic world.
EmoStim: A Database of Emotional Film Clips with Discrete and Componential Assessment
Emotion elicitation using emotional film clips is one of the most common and
ecologically valid methods in Affective Computing. However, selecting and
validating appropriate materials that evoke a range of emotions is challenging.
Here we present EmoStim: A Database of Emotional Film Clips, a film library
with rich and varied content. EmoStim is designed for researchers interested
in studying emotions in relation to either discrete or componential models of
emotion. To create the database, 139 film clips were selected from the
and then annotated by 638 participants through the CrowdFlower platform. We
selected 99 film clips based on the distribution of subjective ratings that
effectively distinguished between emotions defined by the discrete model. We
show that the selected film clips reliably induce a range of specific emotions
according to the discrete model. Further, we describe relationships between
emotions, emotion organization in the componential space, and underlying
dimensions representing emotional experience. The EmoStim database and
participant annotations are freely available for research purposes. The
database can be used to enrich our understanding of emotions further and serve
as a guide to select or create additional materials.Comment: This work has been submitted to the IEEE for possible publication.
Copyright may be transferred without notice, after which this version may no
longer be accessibl
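The selection step above (keeping the 99 clips whose rating distributions effectively distinguish discrete emotions) can be sketched with synthetic data. The margin rule, the emotion labels, and all values below are illustrative assumptions, not the paper's actual procedure:

```python
# Hypothetical sketch of a rating-based clip selection criterion.
import numpy as np

rng = np.random.default_rng(0)
emotions = ["joy", "sadness", "fear", "anger"]  # illustrative label set

# Synthetic data: one row per clip, one mean rating per emotion (0-6 scale).
n_clips = 20
ratings = rng.uniform(0, 6, size=(n_clips, len(emotions)))

def is_discriminative(clip_ratings, margin=1.0):
    """Keep a clip if its strongest emotion rating exceeds the
    runner-up by at least `margin` (an assumed threshold)."""
    ordered = np.sort(clip_ratings)
    return (ordered[-1] - ordered[-2]) >= margin

selected = [i for i in range(n_clips) if is_discriminative(ratings[i])]
print(f"kept {len(selected)} of {n_clips} clips")
```

A real pipeline would also test separation statistically (e.g. across annotators) rather than on per-clip means alone.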
EmoStim Dataset
Abstract: EmoStim is a database of emotional film clips, a film library with rich and varied content. EmoStim is designed for researchers interested in studying emotions in relation to either discrete or componential models of emotion. To create the database, 139 film clips were selected from the literature and then annotated by 638 participants through the CrowdFlower platform.
Emotion is a multi-componential experience guided by appraisal: evidence from multi-level annotation during naturalistic stimulation
This study discerns the relationship between discrete emotions and their underlying components from a detailed dataset of continuous annotations of more than 50 emotion variables collected during short films. Appraisal theories predict that discrete emotions arise from a combination of components. Specifically, the Component Process Model (CPM) highlights the prime role of appraisal, followed by motivation, expression, physiology, and feeling. We include annotations from all these domains and reveal a hierarchical organisation of discrete emotions by appraisal of valence and self-relevance. Furthermore, we apply predictive models to understand the contribution of emotion components to discrete emotions, and find that all 13 discrete emotions in our dataset can be significantly predicted as a function of emotion components. Using machine learning, our study contributes key insights to the longstanding question of what an emotion is, and underscores the centrality of appraisal in the generation of emotion. This has important implications for the complexity and function of emotion as an adaptive process.
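The predictive-modelling step described above (predicting each discrete emotion as a function of component annotations) can be sketched as follows. The data are synthetic and the Ridge regressor is an illustrative assumption; the study's actual features, models, and significance testing are not reproduced here:

```python
# Minimal sketch: predict a discrete-emotion rating from component-level
# annotations (appraisal, motivation, expression, physiology, feeling).
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_samples, n_components = 300, 10  # e.g. 10 component-level variables

X = rng.normal(size=(n_samples, n_components))          # component annotations
true_w = rng.normal(size=n_components)                  # synthetic ground truth
y = X @ true_w + rng.normal(scale=0.5, size=n_samples)  # e.g. "joy" intensity

# Cross-validated R^2: how well the components predict this emotion.
scores = cross_val_score(Ridge(alpha=1.0), X, y, cv=5, scoring="r2")
print(f"mean CV R^2 = {scores.mean():.2f}")
```

In the study's setting, one such model would be fitted per discrete emotion (13 in total), with significance assessed against a chance baseline, e.g. by permutation.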