25 research outputs found

    Affect Recognition using Psychophysiological Correlates in High Intensity VR Exergaming

    User experience estimation of VR exergame players by recognising their affective state could enable us to personalise and optimise their experience. Affect recognition based on psychophysiological measurements has been successful for moderate intensity activities. High intensity VR exergames pose challenges as the effects of exercise and VR headsets interfere with those measurements. We present two experiments that investigate the use of different sensors for affect recognition in a VR exergame. The first experiment compares the impact of physical exertion and gamification on psychophysiological measurements during rest, conventional exercise, VR exergaming, and sedentary VR gaming. The second experiment compares underwhelming, overwhelming and optimal VR exergaming scenarios. We identify gaze fixations, eye blinks, pupil diameter and skin conductivity as psychophysiological measures suitable for affect recognition in VR exergaming and analyse their utility in determining affective valence and arousal. Our findings provide guidelines for researchers of affective VR exergames. This project has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No 665992.
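    As a rough illustration of the kind of pipeline this abstract implies, the sketch below trains a classifier on per-window features derived from the four measures named above (gaze fixations, eye blinks, pupil diameter, skin conductivity) to predict an arousal label. The windowing, feature definitions, and choice of classifier are assumptions for illustration only and are not taken from the paper.

    ```python
    # Hypothetical sketch: affect recognition from the psychophysiological
    # features named in the abstract. Feature extraction, window length,
    # labels, and the classifier are assumptions, not the paper's method.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)

    # Placeholder feature matrix: one row per gameplay window.
    # Columns: fixation count, blink rate (1/min),
    # mean pupil diameter (mm), mean skin conductance level (uS).
    X = rng.normal(size=(200, 4))
    # Placeholder labels: 0 = low arousal, 1 = high arousal (self-reported).
    y = rng.integers(0, 2, size=200)

    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"5-fold accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
    ```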

    RCEA: Real-time, Continuous Emotion Annotation for collecting precise mobile video ground truth labels

    Collecting accurate and precise emotion ground truth labels for mobile video watching is essential for ensuring meaningful predictions. However, video-based emotion annotation techniques either rely on post-stimulus discrete self-reports, or allow real-time, continuous emotion annotation (RCEA) only in desktop settings. Following a user-centric approach, we designed an RCEA technique for mobile video watching, and validated its usability and reliability in a controlled, indoor study (N=12) and a later outdoor study (N=20). Drawing on physiological measures, interaction logs, and subjective workload reports, we show that (1) RCEA is perceived to be usable for annotating emotions while watching mobile videos, without increasing users' mental workload, and (2) the resulting time-variant annotations are comparable with the intended emotion attributes of the video stimuli (classification error for valence: 8.3%; arousal: 25%). We contribute a validated annotation technique and an associated annotation fusion method that are suitable for collecting fine-grained emotion annotations while users watch mobile videos.
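    A minimal sketch of the validation step described above, comparing time-variant annotations against the intended emotion attribute of each video: the aggregation (mean rating, sign agreement) is an assumption for illustration and is not the paper's annotation fusion method.

    ```python
    # Hypothetical sketch: compare continuous valence/arousal traces
    # against intended per-video labels and report a classification error.
    import numpy as np

    def classification_error(annotations, intended_labels):
        """annotations: list of 1-D arrays of continuous ratings in [-1, 1],
        one per video; intended_labels: +1 (high) or -1 (low) per video."""
        predicted = np.sign([np.mean(a) for a in annotations])
        intended = np.asarray(intended_labels)
        return float(np.mean(predicted != intended))

    # Toy example: three videos, two matching their intended valence, one not.
    valence_traces = [np.full(100, 0.4), np.full(100, -0.3), np.full(100, 0.2)]
    intended_valence = [1, -1, -1]
    err = classification_error(valence_traces, intended_valence)
    print(f"valence classification error: {err:.1%}")
    ```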

    Imprinting: Effects of painful stimulation upon the following response.

    No full text

    Brightness contrast effects in a pupillometric experiment

    No full text