Implicit Human-Computer Interaction: Two Complementary Approaches
One of the main goals of Human-Computer Interaction (HCI) is to improve the interface between users and computers: interaction should be effortless and easy to learn. In this thesis, we pursue this goal, aiming to reduce users' stress and increase their wellbeing. We work on two distinct but complementary approaches: (i) automatic assessment of a user's inner psychological state, so as to enhance human-computer interaction; and (ii) presentation of information in a comprehensible manner, so that devices and applications add no stress when delivering it. Not only should computers understand their users, but users should also easily understand the information computers give them.
For the first approach, we collected physiological and psychological data from people exposed to emotional stimuli. We created a database and made it freely available to the community for further research on automated detection of differences in users' inner states. We used the data to predict both users' emotional states and their personality traits. For the second approach, we investigated two devices intended to provide easily comprehensible feedback. First, we discuss a breathing sensor that informs users about their current physiological state and shows them how to reduce everyday stress by adapting their breathing patterns; here we investigated general criteria for developing systems that are easy to understand. The second device is a tactile belt. We analyze the belt as a solution that provides comprehensible guidance in navigation contexts without requiring cognitive effort. The belt uses localized tactile stimulation to transmit directional information; by employing the tactile sense, it can augment or even replace information normally received through the eyes and ears. Finally, we discuss opportunities for future applications of our research and conclude with a summary of our contributions to HCI: transmitting information from humans to machines and vice versa.
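To make the belt's directional encoding concrete, the following Python sketch maps a target bearing to one of several evenly spaced vibration motors worn around the waist. The motor count, layout, and function name are illustrative assumptions, not the thesis's actual hardware or code.

```python
# Minimal sketch of direction-to-vibration encoding for a tactile belt.
# NUM_MOTORS and the "index 0 = front" layout are assumptions for illustration.
NUM_MOTORS = 8  # assumed: motors evenly spaced around the waist

def motor_for_bearing(target_bearing_deg: float, user_heading_deg: float) -> int:
    """Return the index of the motor closest to the target direction,
    relative to where the user is currently facing."""
    relative = (target_bearing_deg - user_heading_deg) % 360.0
    sector = 360.0 / NUM_MOTORS
    return int((relative + sector / 2) // sector) % NUM_MOTORS

# Example: destination due east (90 degrees) while the user faces north (0 degrees):
# the motor on the right side of the belt vibrates.
print(motor_for_bearing(90.0, 0.0))  # -> 2 (right side, with 8 motors and index 0 at front)
```

Because the stimulus location itself carries the direction, the wearer needs no visual or auditory decoding step, which is what allows the belt to offload the eyes and ears during navigation.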
ASCERTAIN: Emotion and Personality Recognition Using Commercial Sensors
We present ASCERTAIN, a multimodal databASe for impliCit pERsonaliTy and Affect recognitIoN using commercial physiological sensors. To our knowledge, ASCERTAIN is the first database to connect personality traits and emotional states via physiological responses. ASCERTAIN contains big-five personality scales and emotional self-ratings of 58 users along with their Electroencephalogram (EEG), Electrocardiogram (ECG), Galvanic Skin Response (GSR) and facial activity data, recorded using off-the-shelf sensors while viewing affective movie clips. We first examine relationships between users' affective ratings and personality scales in the context of prior observations, and then study linear and non-linear physiological correlates of emotion and personality. Our analysis suggests that the emotion-personality relationship is better captured by non-linear rather than linear statistics. We finally attempt binary emotion and personality trait recognition using physiological features. Experimental results cumulatively confirm that personality differences are better revealed when comparing user responses to emotionally homogeneous videos, and that above-chance recognition is achieved for both affective and personality dimensions.
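The abstract's point that non-linear statistics capture the emotion-personality link better than linear ones can be illustrated with a toy comparison (not the authors' code): a Pearson correlation misses a purely quadratic dependence that mutual information detects. The data below are synthetic and the variable names are assumptions.

```python
import numpy as np
from scipy.stats import pearsonr
from sklearn.feature_selection import mutual_info_regression

# Synthetic example: a physiological feature depends quadratically on a
# personality score, so there is a strong non-linear (but no linear) relationship.
rng = np.random.default_rng(0)
personality_score = rng.normal(size=200)                 # e.g., an Extraversion scale
physio_feature = personality_score**2 + 0.1 * rng.normal(size=200)

r, _ = pearsonr(personality_score, physio_feature)       # linear statistic: near zero
mi = mutual_info_regression(personality_score.reshape(-1, 1), physio_feature)[0]

print(f"Pearson r = {r:.2f} (misses the quadratic relationship)")
print(f"Mutual information = {mi:.2f} (captures the dependence)")
```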
Implicit user-centric personality recognition based on physiological responses to emotional videos
We present a novel framework for recognizing personality traits based on users' physiological responses to affective movie clips. Extending studies that have correlated explicit/implicit affective user responses with Extraversion and Neuroticism traits, we perform single-trial recognition of the big-five traits from Electrocardiogram (ECG), Galvanic Skin Response (GSR), Electroencephalogram (EEG) and facial emotional responses compiled from 36 users using off-the-shelf sensors. First, we examine relationships among personality scales and (explicit) affective user ratings acquired in the context of prior observations. Second, we isolate physiological correlates of personality traits. Finally, unimodal and multimodal personality recognition results are presented. Personality differences are better revealed when analyzing responses to emotionally homogeneous (e.g., high-valence, high-arousal) clips, and significantly above-chance recognition is achieved for all five traits.
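As a rough sketch of what single-trial, multimodal trait recognition looks like in practice, the snippet below fuses per-trial features from several modalities and cross-validates a binary high/low classifier. Feature dimensions, trial counts, and the Gaussian naive Bayes classifier are assumptions for illustration; the paper's actual features and classifiers may differ.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

# Assumed shapes: one feature vector per (user, clip) trial and a binary
# label splitting each trait at the median, as is common for trait recognition.
rng = np.random.default_rng(0)
n_trials = 360                              # assumed: e.g., 36 users x 10 clips
ecg = rng.standard_normal((n_trials, 32))   # placeholder ECG features
gsr = rng.standard_normal((n_trials, 16))   # placeholder GSR features
eeg = rng.standard_normal((n_trials, 64))   # placeholder EEG features
trait_high = rng.integers(0, 2, n_trials)   # high/low label for one trait

fused = np.hstack([ecg, gsr, eeg])          # multimodal feature-level fusion
scores = cross_val_score(GaussianNB(), fused, trait_high, cv=5)
print(f"Mean accuracy: {scores.mean():.2f} (chance = 0.50 for balanced labels)")
```

With real features, accuracy meaningfully above the 0.50 chance level is what "significantly above-chance recognition" refers to; the random placeholders here will hover near chance by construction.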