
    Extracting Relevance and Affect Information from Physiological Text Annotation

    We present physiological text annotation, which refers to the practice of associating physiological responses with text content in order to infer characteristics of the user's information needs and affective responses. Text annotation is a laborious task, and implicit feedback has been studied as a way to collect annotations without requiring any explicit action from the user. Previous work has explored behavioral signals, such as clicks or dwell time, to automatically infer annotations, while physiological signals have mostly been explored for image or video content. We report on two experiments in which physiological text annotation is studied, first to 1) indicate perceived relevance and then to 2) indicate affective responses of the users. The first experiment tackles the user's perception of the relevance of an information item, which is fundamental to revealing the user's information needs. The second experiment then aims to reveal the user's affective responses towards a relevant text document. Results show that physiological user signals are associated with relevance and affect. In particular, electrodermal activity (EDA) differed when users read relevant content versus irrelevant content, and was lower when users read texts with negative emotional content than texts with neutral content. Together, the experiments show that physiological text annotation can provide valuable implicit inputs for personalized systems. We discuss how our findings help design personalized systems that can annotate digital content using human physiology without the need for any explicit user interaction.
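    To make the annotation idea concrete, here is a minimal Python sketch of the kind of analysis the abstract describes: comparing mean electrodermal activity across reading episodes labelled relevant or irrelevant. The episode segmentation, the mean-EDA feature, and Welch's t-test are illustrative assumptions, not the authors' actual pipeline.

    ```python
    # Minimal sketch: does mean EDA differ between relevant and irrelevant
    # reading episodes? All data below are synthetic placeholders.
    import numpy as np
    from scipy import stats

    def mean_eda_per_episode(eda_signal, episodes):
        """Average the raw EDA samples falling inside each (start, end) episode."""
        return np.array([eda_signal[s:e].mean() for s, e in episodes])

    # Hypothetical EDA trace (microsiemens) plus episode boundaries by relevance.
    rng = np.random.default_rng(0)
    eda = rng.normal(loc=2.0, scale=0.3, size=10_000)
    relevant_eps = [(0, 500), (1200, 1700), (3000, 3500)]
    irrelevant_eps = [(600, 1100), (2000, 2500), (4000, 4500)]
    for s, e in relevant_eps:
        eda[s:e] += 0.4  # assumption: higher EDA while reading relevant text

    rel = mean_eda_per_episode(eda, relevant_eps)
    irr = mean_eda_per_episode(eda, irrelevant_eps)

    # Welch's t-test between the two relevance conditions.
    t, p = stats.ttest_ind(rel, irr, equal_var=False)
    print(f"t = {t:.2f}, p = {p:.3f}")
    ```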

    Exploring eye-gaze behaviors on emotional states for human-computer interaction (HCI) using eye tracking technique

    Human-computer interaction (HCI) is becoming an essential area of study. However, most contemporary HCI systems are unable to identify human emotional states and use this information to decide which actions to execute. To address this problem, eye tracking has been introduced to record eye-gaze behaviours, which can signify underlying emotions. This manuscript empirically applies an eye-tracking device and method to investigate the relationship between human emotions and eye-gaze behaviour. We examined pupil size and fixation duration stimulated by film clips of differing arousal content. Fifteen students from the Malaysia-Japan International Institute of Technology (MJIIT) were recruited. Emotions were analysed by studying gaze behaviours under five emotional video stimuli: amusement, joy, neutral, sadness and fear. These stimuli were displayed to the subjects while their gaze behaviour was recorded with a Tobii TX300 eye tracker. The recorded data were analysed using one-way ANOVA, which yields p < 0.05 for both fixation duration and pupil dilation, indicating a significant relationship between eye-gaze behaviour and human emotions.
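    The reported statistical test is straightforward to reproduce in outline. The sketch below runs a one-way ANOVA on hypothetical per-participant fixation durations across the five emotion conditions; all numbers are fabricated placeholders, and only the shape of the test mirrors the abstract.

    ```python
    # Illustrative one-way ANOVA: does fixation duration differ across the
    # five emotion conditions? Values are synthetic, 15 subjects per condition.
    import numpy as np
    from scipy.stats import f_oneway

    rng = np.random.default_rng(1)
    conditions = ["amusement", "joy", "neutral", "sad", "fear"]
    # Hypothetical per-participant mean fixation durations in milliseconds.
    fixation = {c: rng.normal(loc=m, scale=25, size=15)
                for c, m in zip(conditions, [240, 250, 220, 270, 290])}

    f_stat, p_value = f_oneway(*fixation.values())
    print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # p < 0.05 -> condition effect
    ```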

    Detecting and influencing driver emotions using psycho-physiological sensors and ambient light

    Driving is a sensitive task that is strongly affected by the driver's emotions. Negative emotions, such as anger, can lead to more driving errors. In this work, we introduce a concept for detecting and influencing driver emotions, using psycho-physiological sensing for emotion classification and ambient light for feedback. We detect the arousal and valence of emotional responses from wearable bio-electric sensors, namely brain-computer interfaces and heart-rate sensors. We evaluated our concept with 12 participants in a static driving simulator containing a fully equipped car. Before the rides, we elicited negative emotions; we then evaluated driving performance and physiological data while participants drove under stressful conditions, using three ambient lighting conditions (no light, blue, orange). With a subject-dependent random forest classifier using 40 features collected from the physiological data, we achieve an average accuracy of 78.9% for classifying valence and 68.7% for arousal. Driving performance was enhanced in conditions where ambient lighting was introduced: both blue and orange light helped drivers improve lane keeping. We discuss insights from our study and provide design recommendations for emotion sensing and feedback systems in the car.
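    As an illustration of the classification setup, the sketch below trains one random forest per participant (a subject-dependent classifier) on synthetic 40-dimensional physiological feature vectors and scores it with within-subject cross-validation. The feature values, window counts, and labels are invented; only the subject-dependent random forest design follows the abstract.

    ```python
    # Subject-dependent classification sketch: one model per participant,
    # trained and evaluated only on that participant's own data.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(2)
    n_subjects, n_windows, n_features = 12, 60, 40   # 40 features, as reported

    accuracies = []
    for subject in range(n_subjects):
        X = rng.normal(size=(n_windows, n_features))   # e.g. EEG/heart-rate features
        y = rng.integers(0, 2, size=n_windows)         # binary valence label
        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        # 5-fold cross-validation within this subject's data only.
        accuracies.append(cross_val_score(clf, X, y, cv=5).mean())

    print(f"mean subject-dependent accuracy: {np.mean(accuracies):.1%}")
    ```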