3 research outputs found

    Classification of Affective States in the Electroencephalogram

    The goal of the present work is to investigate the feasibility of automatic affect recognition in the electroencephalogram (EEG) in different populations, with a focus on feature validation and machine learning, in order to augment brain-computer interface (BCI) systems with the ability to identify and communicate the users’ inner affective state. Two in-depth studies on affect induction and classification are presented. In the first study, an auditory emotion induction paradigm that easily translates to a clinical population is introduced. Significant above-chance group classification is achieved using time-domain features for unpleasant vs. pleasant conditions. In the second study, data from an emotion induction paradigm for preverbal infants are investigated. Employing the machine learning framework, cross-participant classification of pleasant vs. neutral conditions is significantly above chance with balanced training data. Furthermore, the machine learning framework is applied to the publicly available physiological affect dataset DEAP for comparison of results. Based on spectral frequency features, the introduced framework outperforms the results published by the authors of DEAP. The results strengthen the vision of a BCI that is able to identify and communicate its users’ affective state.
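    The abstract only sketches the classification setup (cross-participant evaluation, balanced training data, spectral or time-domain features, a machine learning framework). As an illustration of what such a pipeline can look like, here is a minimal, hedged sketch in Python using scikit-learn on synthetic placeholder data; the feature dimensions, labels, and the choice of a linear SVM with balanced class weights are assumptions for demonstration, not the thesis' actual implementation.

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-in for per-trial EEG features (e.g. band powers per channel).
n_participants, trials_per_participant, n_features = 10, 40, 32
X = rng.normal(size=(n_participants * trials_per_participant, n_features))
y = rng.integers(0, 2, size=X.shape[0])   # 0 = neutral, 1 = pleasant (placeholder labels)
groups = np.repeat(np.arange(n_participants), trials_per_participant)

# Leave-one-participant-out cross-validation approximates cross-participant
# evaluation; class_weight="balanced" stands in for balanced training data.
clf = make_pipeline(StandardScaler(), SVC(kernel="linear", class_weight="balanced"))
scores = cross_val_score(clf, X, y, groups=groups,
                         cv=LeaveOneGroupOut(), scoring="balanced_accuracy")
print("balanced accuracy per held-out participant:", scores.round(2))
print("mean:", round(scores.mean(), 2))
```

    With real recordings, X would hold features extracted from preprocessed EEG epochs and y the valence condition of each trial; on the random data above the mean score hovers around chance (0.5), which is also the baseline the abstract's "significantly above chance" claims are tested against.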

    Augmenting communication, emotion expression and interaction capabilities of individuals with cerebral palsy

    © 2014 Verlag der Technischen Universität Graz. Providing individuals with cerebral palsy (CP) with tools to communicate and interact with their environment independently and reliably from childhood onward would allow for more active participation in education and social life. We outline first steps towards the development of such a hybrid brain-computer interface (BCI) based communication tool. This work was supported by the FP7 Framework EU Research Project ABC (No. 287774). This paper only reflects the authors' views, and the funding agencies are not liable for any use that may be made of the information contained herein. Peer Reviewed

    EEG responses to auditory stimuli for automatic affect recognition

    Brain state classification for communication and control has been well established in the area of brain-computer interfaces over the last decades. Recently, the passive and automatic extraction of additional information regarding the psychological state of users from neurophysiological signals has gained increased attention in the interdisciplinary field of affective computing. We investigated how well specific emotional reactions, induced by auditory stimuli, can be detected in electroencephalogram (EEG) recordings. We introduce an auditory emotion induction paradigm based on the International Affective Digitized Sounds 2nd Edition (IADS-2) database that is also suitable for disabled individuals. Stimuli are grouped into three valence categories: unpleasant, neutral, and pleasant. Significant differences in time-domain event-related potentials are found in the EEG between unpleasant and neutral, as well as pleasant and neutral conditions over midline electrodes. Time-domain data were classified in three binary classification problems using a linear support vector machine (SVM) classifier. We discuss three classification performance measures in the context of affective computing and outline some strategies for conducting and reporting affect classification studies.
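    The abstract names the main ingredients (time-domain ERP features over midline electrodes, binary classification problems, a linear SVM, several performance measures) without giving details. The following Python sketch, built on scikit-learn and synthetic data, illustrates one plausible version of such a pipeline; the sampling rate, electrode count, 125 ms averaging windows, and the specific metrics reported are illustrative assumptions rather than the paper's exact configuration.

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold, cross_validate
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
fs = 256                                 # sampling rate in Hz (assumed)
epochs = rng.normal(size=(120, 3, fs))   # trials x midline channels (e.g. Fz, Cz, Pz) x 1 s epoch
labels = rng.integers(0, 2, size=120)    # 0 = neutral, 1 = unpleasant (placeholder labels)

# Time-domain features: mean amplitude in successive 125 ms windows per channel,
# a common way to summarise ERP components.
win = 32                                 # 32 samples = 125 ms at 256 Hz
features = epochs.reshape(len(epochs), 3, -1, win).mean(axis=-1).reshape(len(epochs), -1)

# One of the three binary problems (here unpleasant vs. neutral) with a linear
# SVM; the three metrics reported below are illustrative choices.
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_validate(clf, features, labels, cv=cv,
                        scoring=("accuracy", "balanced_accuracy", "roc_auc"))
for name in ("test_accuracy", "test_balanced_accuracy", "test_roc_auc"):
    print(name, round(scores[name].mean(), 2))
```

    In a real study the epochs would come from preprocessed, baseline-corrected EEG segments time-locked to stimulus onset, and the same pipeline would be run separately for each of the three pairwise valence contrasts.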