
    Affective games: a multimodal classification system

    Affective gaming is a relatively new field of research that exploits human emotions to influence gameplay for an enhanced player experience. Changes in a player's psychological state are reflected in their behaviour and physiology, so recognising such variation is a core element of affective games. Complementary sources of affect offer more reliable recognition, especially in contexts where one modality is partial or unavailable. As multimodal recognition systems, affect-aware games are subject to the practical difficulties faced by traditional trained classifiers. In addition, inherent game-related challenges in data collection and performance arise when attempting to sustain an acceptable level of immersion. Most existing scenarios employ sensors that offer limited freedom of movement, resulting in less realistic experiences. Recent advances now offer technology that allows players to communicate more freely and naturally with the game and, furthermore, to control it without input devices. However, the affective game industry is still in its infancy and needs to catch up with the life-like level of adaptation already provided by graphics and animation.
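    The point above about complementary modalities compensating for a partial or unavailable signal can be pictured with a minimal late-fusion sketch. The modality names, feature dimensions, and classifier choice below are illustrative assumptions, not taken from the paper: each modality gets its own classifier, and the prediction averages the class probabilities of whichever modalities happen to be present.

```python
# Minimal late-fusion sketch: one classifier per modality, probabilities
# averaged over whichever modalities are actually available at test time.
# Modality names and synthetic data are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
modalities = {"facial": 32, "physiological": 8}   # feature dims (assumed)
n = 200
y = rng.choice([0, 1, 2, 3], size=n)              # e.g. valence-arousal quadrants

# Train one per-modality classifier on synthetic features.
X = {m: rng.normal(size=(n, d)) + y[:, None] * 0.3 for m, d in modalities.items()}
clf = {m: LogisticRegression(max_iter=1000).fit(X[m], y) for m in modalities}

def predict(sample: dict) -> int:
    """Average class probabilities over the modalities present in `sample`."""
    probs = [clf[m].predict_proba(feat.reshape(1, -1))[0]
             for m, feat in sample.items() if m in clf]
    return int(np.argmax(np.mean(probs, axis=0)))

# Still works when one modality is missing, e.g. no physiological sensor:
print(predict({"facial": rng.normal(size=32)}))
```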

    An Analysis of Physiological and Psychological Responses in Virtual Reality and Flat Screen Gaming

    Recent research has focused on the effectiveness of Virtual Reality (VR) in games as a more immersive method of interaction. However, there is a lack of robust analysis of the physiological effects of VR versus flat-screen (FS) gaming. This paper introduces the first systematic comparison and analysis of emotional and physiological responses to commercially available games in VR and FS environments. To elicit these responses, we first selected four games through a pilot study with 6 participants to cover all four quadrants of the valence-arousal space. Using these games, we recorded the physiological activity, including Blood Volume Pulse and Electrodermal Activity, and self-reported emotions of 33 participants in a user study. Our data analysis revealed that VR gaming elicited more pronounced emotions, higher arousal, increased cognitive load and stress, and lower dominance than FS gaming. The Virtual Reality and Flat Screen (VRFS) dataset, containing over 15 hours of multimodal data comparing FS and VR gaming across different games, is also made publicly available for research purposes. Our analysis provides valuable insights for further investigations into the physiological and emotional effects of VR and FS gaming. Comment: This work has been submitted to the IEEE Transactions on Affective Computing for possible publication. Copyright may be transferred without notice, after which this version may no longer be accessible.
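    The per-participant comparison of physiological responses between the two display conditions is, at its core, a paired analysis. The sketch below uses synthetic data and an assumed electrodermal feature (mean tonic level) to show the shape of such a test; it is not the authors' actual pipeline.

```python
# Paired comparison of a single electrodermal-activity feature (mean tonic
# level) between VR and flat-screen sessions of the same participants.
# The synthetic data and the choice of feature are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_participants = 33
eda_fs = rng.normal(loc=4.0, scale=1.0, size=n_participants)            # microsiemens, flat screen
eda_vr = eda_fs + rng.normal(loc=0.6, scale=0.5, size=n_participants)   # VR shifted upward

t, p = stats.ttest_rel(eda_vr, eda_fs)                                  # paired t-test
d = (eda_vr - eda_fs).mean() / (eda_vr - eda_fs).std(ddof=1)            # paired Cohen's d
print(f"t = {t:.2f}, p = {p:.4f}, d = {d:.2f}")
```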

    Autonomous Assessment of Videogame Difficulty Using Physiological Signals

    Given the well-explored relation between challenge and involvement in a task (e.g., as described in Csikszentmihalyi's theory of flow), it could be argued that the presence of challenge in videogames is a core element that shapes player experience and should, therefore, be matched to the player's skills and attitude towards the game. However, handling videogame difficulty is a challenging problem in game design, as too easy a task can lead to boredom and too hard a task can lead to frustration. Thus, by exploring the relationship between difficulty and emotion, the current work proposes an artificial intelligence model that autonomously predicts difficulty according to the set of emotions elicited in the player. To test the validity of this approach, we developed a simple puzzle-based Virtual Reality (VR) videogame, based on the Trail Making Test (TMT), whose objective was to elicit different emotions according to three levels of difficulty. A study was carried out in which physiological responses as well as player self-reports were collected during gameplay. Statistical analysis of the self-reports showed that different levels of experience with either VR or videogames did not have a measurable impact on how players performed across the three levels. Additionally, the self-assessed emotional ratings indicated that playing the game at different difficulty levels gave rise to different emotional states. Next, classification using a Support Vector Machine (SVM) was performed to verify whether it was possible to detect difficulty from the physiological responses associated with the elicited emotions. Results report an overall F1-score of 68% in detecting the three levels of difficulty, which verifies the effectiveness of the adopted methodology and encourages further research with a larger dataset.
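    As a rough illustration of the classification step described above, the sketch below trains a Support Vector Machine on physiological feature vectors labelled with three difficulty levels and reports a macro F1-score. The feature dimensionality, kernel, and cross-validation scheme are assumptions rather than the thesis' exact setup.

```python
# SVM classification of three difficulty levels from physiological features,
# evaluated with a macro F1-score. Feature dimensions, kernel, and CV scheme
# are illustrative assumptions; the data here is synthetic.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import f1_score

rng = np.random.default_rng(2)
n_trials, n_features = 150, 12            # e.g. EDA/BVP statistics per gameplay segment
y = rng.integers(0, 3, size=n_trials)     # difficulty: 0 = easy, 1 = medium, 2 = hard
X = rng.normal(size=(n_trials, n_features)) + y[:, None] * 0.4

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
y_pred = cross_val_predict(model, X, y, cv=5)
print("macro F1:", round(f1_score(y, y_pred, average="macro"), 2))
```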

    Designing interactive virtual environments with feedback in health applications.

    One of the most important factors influencing user experience in human-computer interaction is the user's emotional reaction. Interactive environments, including serious games, that are responsive to user emotions improve their effectiveness and user satisfaction. Testing and training user emotional competence is meaningful in the healthcare field, which motivated us to analyze immersive affective games using emotional feedback. In this dissertation, a systematic model for designing interactive environments is presented, consisting of three essential modules: affect modeling, affect recognition, and affect control. In order to collect data for analysis and construct these modules, a series of experiments was conducted using virtual reality (VR) to evoke user emotional reactions and monitor those reactions through physiological data. The analysis results led to a novel framework for designing affective games in virtual reality, including descriptions of the interaction mechanism, graph-based structure, and user modeling. An Oculus Rift was used in the experiments to provide immersive virtual reality with affective scenarios, and a sample application was implemented as a cross-platform VR physical-training serious game for elderly people to demonstrate the essential parts of the framework. Measurements of playability and effectiveness are discussed. The introduced framework should serve as a guiding principle for designing affective VR serious games. Possible healthcare applications include emotion competence training, educational software, and therapy methods.
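    The three modules named above (affect modeling, affect recognition, affect control) suggest a simple sense-estimate-adapt loop. The skeleton below is a hypothetical illustration of how such a loop could be organised; the class names, thresholds, and adaptation rules are invented for the example and are not the dissertation's architecture.

```python
# Hypothetical skeleton of an affect recognition -> affect control loop for an
# adaptive VR exercise game; all names and rules are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class AffectState:
    valence: float   # -1 (negative) .. +1 (positive)
    arousal: float   #  0 (calm)     .. 1 (excited)

class AffectRecognizer:
    def estimate(self, heart_rate: float, skin_conductance: float) -> AffectState:
        # Placeholder mapping from physiological readings to an affect estimate.
        arousal = min(1.0, max(0.0, (heart_rate - 60.0) / 60.0))
        valence = 1.0 - min(1.0, skin_conductance / 10.0)
        return AffectState(valence, arousal)

class AffectController:
    def adapt(self, state: AffectState) -> dict:
        # Simple rule: ease the exercise when arousal is high, intensify when low.
        if state.arousal > 0.7:
            return {"exercise_pace": "slower", "feedback": "encouraging"}
        if state.arousal < 0.3:
            return {"exercise_pace": "faster", "feedback": "challenging"}
        return {"exercise_pace": "unchanged", "feedback": "neutral"}

recognizer, controller = AffectRecognizer(), AffectController()
state = recognizer.estimate(heart_rate=95.0, skin_conductance=6.5)
print(controller.adapt(state))
```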

    Resonating Experiences of Self and Others enabled by a Tangible Somaesthetic Design

    Digitalization is penetrating every aspect of everyday life, including a human's heartbeat, which can easily be sensed by wearable sensors and displayed for others to see, feel, and potentially "bodily resonate" with. Previous work studying human interactions and interaction designs with physiological data, such as a heart's pulse rate, has argued that feeding it back to users may, for example, support users' mindfulness and self-awareness during various everyday activities and ultimately support their wellbeing. Inspired by Somaesthetics as a discipline, which focuses on an appreciation of the living body's role in all our experiences, we designed and explored mobile tangible heart beat displays, which enable rich forms of bodily experiencing oneself and others in social proximity. In this paper, we first report on the design process of the tangible heart displays and then present the results of a field study with 30 pairs of participants. Participants were asked to use the tangible heart displays while watching movies together and to report their experience in three different heart display conditions (i.e., displaying their own heart beat, their partner's heart beat, and watching a movie without a heart display). We found, for example, that participants reported significant effects on experienced sensory immersion when they felt their own heart beats compared to the condition without any heart beat display, and that feeling their partner's heart beats resulted in significant effects on social experience. We refer to resonance theory to discuss the results, highlighting the potential of how ubiquitous technology could utilize physiological data to provide resonance in a modern society facing social acceleration. Comment: 18 pages.
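    One way to picture how a tangible heart display could turn a sensed pulse into a felt rhythm is sketched below. The actuator interface and timing logic are assumptions for illustration only, not the paper's hardware design.

```python
# Illustrative sketch: convert a sensed heart rate into a pulse schedule for a
# tangible (e.g. vibrotactile) display. The actuator call is hypothetical and
# replaced here by a print statement.
import time

def beat_interval(heart_rate_bpm: float) -> float:
    """Seconds between beats for a given heart rate in beats per minute."""
    return 60.0 / heart_rate_bpm

def render_heartbeat(heart_rate_bpm: float, duration_s: float, pulse_s: float = 0.1):
    """Drive a hypothetical actuator with one short pulse per heartbeat."""
    interval = beat_interval(heart_rate_bpm)
    elapsed = 0.0
    while elapsed < duration_s:
        print("pulse")                    # stand-in for actuator.vibrate(pulse_s)
        time.sleep(pulse_s)
        time.sleep(interval - pulse_s)    # wait out the rest of the beat
        elapsed += interval

render_heartbeat(heart_rate_bpm=72.0, duration_s=5.0)
```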

    PhysioVR: a novel mobile virtual reality framework for physiological computing

    Virtual Reality (VR) is morphing into a ubiquitous technology by leveraging smartphones and screenless cases to provide highly immersive experiences at a low price point. The result of this paradigm shift is now known as mobile VR (mVR). Although mVR offers numerous advantages over conventional immersive VR methods, one of its biggest limitations concerns the interaction pathways available for mVR experiences. Using physiological computing principles, we created the PhysioVR framework, an open-source software tool developed to facilitate the integration of physiological signals measured through wearable devices into mVR applications. PhysioVR includes heart rate (HR) signals from Android wearables, electroencephalography (EEG) signals from a low-cost brain-computer interface, and electromyography (EMG) signals from a wireless armband. The physiological sensors are connected to a smartphone via Bluetooth, and PhysioVR streams the data using the UDP communication protocol, thus allowing multicast transmission to a third-party application such as the Unity3D game engine. Furthermore, the framework provides bidirectional communication with the VR content, allowing external event triggering through real-time control as well as data recording. We developed a demo game project called EmoCat Rescue, which encourages players to modulate their HR levels in order to successfully complete the in-game mission. EmoCat Rescue is included in the PhysioVR project, which can be freely downloaded. This framework simplifies the acquisition, streaming and recording of multiple physiological signals and parameters from wearable consumer devices, providing a single, efficient interface for creating novel physiologically-responsive mVR applications.
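    Since the framework streams sensor samples over UDP for consumption by a third-party application, a minimal receiver gives a feel for that pathway. The port number and the plain-text "signal,value" packet format below are assumptions, not the documented PhysioVR protocol.

```python
# Minimal UDP receiver for streamed physiological samples, in the spirit of a
# third-party consumer of the stream. The port and the "signal,value" packet
# format are assumptions, not the PhysioVR specification.
import socket

PORT = 5005  # hypothetical port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", PORT))
print(f"listening for physiological samples on UDP port {PORT}...")

while True:
    data, addr = sock.recvfrom(1024)
    signal_name, value = data.decode("utf-8").strip().split(",")
    print(f"{addr[0]}: {signal_name} = {float(value):.1f}")
```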

    Evoking Physiological Synchrony and Empathy Using Social VR with Biofeedback

    With the advent of consumer-grade virtual reality (VR) headsets and physiological measurement devices, new possibilities for mediated social interaction emerge, enabling immersion in environments whose visual features react to the users' physiological activation. In this study, we investigated whether and how individual and interpersonally shared biofeedback (visualised respiration rate and frontal asymmetry of electroencephalography, EEG) enhance synchrony between the users' physiological activity and perceived empathy towards the other during a compassion meditation exercise carried out in a social VR setting. The study was conducted as a laboratory experiment (N = 72) employing a Unity3D-based Dynecom immersive social meditation environment and two amplifiers to collect the psychophysiological signals for the biofeedback. The biofeedback on empathy-related EEG frontal asymmetry evoked higher self-reported empathy towards the other user than the biofeedback on respiratory activation, but perceived empathy was highest when both types of feedback were presented simultaneously. In addition, the participants reported more empathy when there was stronger EEG frontal asymmetry synchronization between the users. The presented results inform the field of affective computing about the possibilities that VR offers for different applications of empathic technologies.
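    Frontal EEG asymmetry, used here as the empathy-related feedback signal, is commonly computed as the difference of log alpha-band power between homologous frontal electrodes. The sketch below applies that standard computation to synthetic F3/F4 signals using Welch's method; the electrode choice, sampling rate, and band limits are assumptions, not the study's exact pipeline.

```python
# Frontal alpha asymmetry: ln(alpha power at F4) - ln(alpha power at F3),
# a standard index; electrode choice, sampling rate, and band are assumptions.
import numpy as np
from scipy.signal import welch

fs = 256                                   # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(3)
f3 = np.sin(2 * np.pi * 10 * t) + rng.normal(scale=0.5, size=t.size)        # synthetic F3
f4 = 1.3 * np.sin(2 * np.pi * 10 * t) + rng.normal(scale=0.5, size=t.size)  # synthetic F4

def alpha_power(x):
    freqs, psd = welch(x, fs=fs, nperseg=fs * 2)
    band = (freqs >= 8) & (freqs <= 13)    # alpha band, 8-13 Hz
    return np.trapz(psd[band], freqs[band])

faa = np.log(alpha_power(f4)) - np.log(alpha_power(f3))
print(f"frontal alpha asymmetry: {faa:.3f}")
```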

    Subjective Fear in Virtual Reality: A Linear Mixed-Effects Analysis of Skin Conductance

    The investigation of the physiological and pathological processes involved in fear perception is complicated by the difficulty of reliably eliciting and measuring the complex construct of fear. This study proposes a novel approach to inducing and measuring subjective fear and its physiological correlates, combining virtual reality (VR) with a mixed-effects model based on skin conductance (SC). Specifically, we developed a new VR scenario applying specific guidelines derived from horror movies and video games. This VR environment was used to induce fear in eighteen volunteers in an experimental protocol that also included two relaxation scenarios and a neutral virtual environment. The SC signal was acquired throughout the experiment, and after each virtual scenario the emotional state and perceived fear level were assessed using psychometric scales. We statistically confirmed that the fearful scenario induced greater sympathetic activation than the others, with significant results for most SC-derived features. Finally, we developed a rigorous mixed-effects model to explain perceived fear as a function of the SC features. Model-fitting results showed a significant relationship between the fear perception scores and a combination of features extracted from both the fast- and slow-varying SC components, proposing a novel solution for a more objective fear assessment.
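    The mixed-effects step can be pictured as regressing each scenario's fear rating on SC-derived features with a random intercept per participant. The sketch below fits such a model on synthetic data with statsmodels; the feature names (phasic peak rate, tonic level) and the data are assumptions for illustration.

```python
# Linear mixed-effects sketch: perceived fear regressed on skin-conductance
# features with a random intercept per participant. Data and feature names
# are illustrative assumptions, not the study's dataset.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n_subjects, n_scenarios = 18, 4
rows = []
for subj in range(n_subjects):
    subj_offset = rng.normal(scale=0.5)          # per-participant random intercept
    for scen in range(n_scenarios):
        phasic_rate = rng.uniform(0, 6)          # SCR peaks per minute (assumed feature)
        tonic_level = rng.uniform(2, 10)         # tonic SC level in microsiemens
        fear = (1.0 + 0.6 * phasic_rate + 0.2 * tonic_level
                + subj_offset + rng.normal(scale=0.8))
        rows.append({"subject": subj, "phasic_rate": phasic_rate,
                     "tonic_level": tonic_level, "fear": fear})

df = pd.DataFrame(rows)
model = smf.mixedlm("fear ~ phasic_rate + tonic_level", df, groups=df["subject"])
print(model.fit().summary())
```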

    Emotion Recognition in Immersive Virtual Reality: From Statistics to Affective Computing

    Emotions play a critical role in our daily lives, so the understanding and recognition of emotional responses is crucial for human research. Affective computing research has mostly used non-immersive two-dimensional (2D) images or videos to elicit emotional states. However, immersive virtual reality, which allows researchers to simulate environments under controlled laboratory conditions with high levels of sense of presence and interactivity, is becoming more popular in emotion research. Moreover, its synergy with implicit measurements and machine-learning techniques has the potential to impact many research areas transversally, opening new opportunities for the scientific community. This paper presents a systematic review of the emotion recognition research undertaken with physiological and behavioural measures using head-mounted displays as elicitation devices. The results highlight the evolution of the field, give a clear perspective using aggregated analysis, reveal the current open issues and provide guidelines for future research. This research was funded by the European Commission, grant number H2020-825585 HELIOS. Marín-Morales, J.; Llinares Millán, MDC.; Guixeres Provinciale, J.; Alcañiz Raya, ML. (2020). Emotion Recognition in Immersive Virtual Reality: From Statistics to Affective Computing. Sensors. 20(18):1-26. https://doi.org/10.3390/s20185163