2,571 research outputs found

    Emotions Detection based on a Single-electrode EEG Device

    The study of emotions using multiple EEG channels is widespread in research on brain-computer interfaces (BCIs). To date, few studies using a reduced number of channels have been reported in the literature, and those that exist achieve less accurate emotion detection than multichannel approaches. Of particular interest in this research framework is whether emotions can be detected from a single EEG channel with an accuracy comparable to studies that use a large number of channels. This article uses the NeuroSky MindWave device, which has a single electrode, to acquire the EEG signal, together with MATLAB and IBM SPSS Modeler to process and classify the signals, respectively. The emotion detection accuracy obtained is remarkable relative to the cost of the hardware dedicated to EEG signal acquisition.
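
    As an illustration of the kind of single-channel pipeline the abstract describes, the Python sketch below extracts band-power features from one EEG channel and trains a classifier. It is a minimal sketch, not the authors' MATLAB/SPSS Modeler pipeline; the sampling rate, frequency bands, and choice of classifier are assumptions.

        # Minimal single-channel sketch; device rate, bands and classifier are assumptions.
        import numpy as np
        from scipy.signal import welch
        from sklearn.svm import SVC

        FS = 512  # assumed raw sampling rate of the single-electrode headset
        BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # assumed bands

        def band_powers(epoch):
            """Mean spectral power per band for one single-channel epoch (1-D array)."""
            freqs, psd = welch(epoch, fs=FS, nperseg=min(len(epoch), FS))
            return [psd[(freqs >= lo) & (freqs < hi)].mean() for lo, hi in BANDS.values()]

        def train_classifier(epochs, labels):
            """epochs: (n_epochs, n_samples) raw signal; labels: one emotion class per epoch."""
            X = np.array([band_powers(e) for e in epochs])
            return SVC(kernel="rbf").fit(X, labels)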

    Emotion Detection Using Noninvasive Low Cost Sensors

    Emotion recognition from biometrics is relevant to a wide range of application domains, including healthcare. Existing approaches usually adopt multi-electrode sensors that can be expensive or uncomfortable to use in real-life situations. In this study, we investigate whether we can reliably recognize high vs. low emotional valence and arousal by relying on noninvasive, low cost EEG, EMG, and GSR sensors. We report the results of an empirical study involving 19 subjects. We achieve state-of-the-art classification performance for both valence and arousal even in a cross-subject classification setting, which eliminates the need for individual training and tuning of classification models. (To appear in Proceedings of ACII 2017, the Seventh International Conference on Affective Computing and Intelligent Interaction, San Antonio, TX, USA, Oct. 23-26, 2017.)
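
    The cross-subject setting mentioned above is commonly evaluated with leave-one-subject-out cross-validation. The Python sketch below shows that evaluation scheme on fused sensor features; the feature representation and the SVM classifier are placeholders, not the authors' exact pipeline.

        # Leave-one-subject-out evaluation sketch; features and classifier are assumptions.
        import numpy as np
        from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        def cross_subject_accuracy(X, y, subject_ids):
            """X: (n_trials, n_features) fused EEG/EMG/GSR features,
            y: binary high/low valence or arousal labels,
            subject_ids: subject index per trial, used as cross-validation groups."""
            clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
            scores = cross_val_score(clf, X, y, groups=subject_ids, cv=LeaveOneGroupOut())
            return scores.mean()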

    Emotional Brain-Computer Interfaces

    Research in brain-computer interfaces (BCIs) has increased significantly during the last few years. In addition to their initial role as assistive devices for the physically challenged, BCIs are now proposed for a wider range of applications. As in any HCI application, BCIs can also benefit from adapting their operation to the emotional state of the user. BCIs have the advantage of having access to brain activity, which can provide significant insight into the user's emotional state. This information can be utilized in two manners. 1) Knowledge of the influence of the emotional state on brain activity patterns can allow the BCI to adapt its recognition algorithms, so that the intention of the user is still correctly interpreted in spite of signal deviations induced by the subject's emotional state. 2) The ability to recognize emotions can be used in BCIs to provide the user with more natural ways of controlling the BCI through affective modulation. Thus, controlling a BCI by recollecting a pleasant memory becomes possible and can potentially lead to higher information transfer rates. These two approaches to utilizing emotion in BCIs are elaborated in detail in this paper in the framework of noninvasive EEG-based BCIs.

    A real time classification algorithm for EEG-based BCI driven by self-induced emotions

    Background and objective: The aim of this paper is to provide an efficient, parametric, general, and completely automatic real-time classification method for electroencephalography (EEG) signals obtained from self-induced emotions. The particular characteristics of the low-amplitude signals considered (a self-induced emotion produces a signal whose amplitude is about 15% of that of a genuinely experienced emotion) require exploring and adapting strategies such as the Wavelet Transform, Principal Component Analysis (PCA), and the Support Vector Machine (SVM) for signal processing, analysis, and classification. Moreover, the method is intended for use in a multi-emotion Brain-Computer Interface (BCI) and, for this reason, some ad hoc precautions are taken.

    Method: The peculiarity of the brain activation requires ad hoc signal processing by wavelet decomposition, and the definition of a set of features for signal characterization in order to discriminate different self-induced emotions. The proposed method is a completely parameterized two-stage algorithm aimed at multi-class classification and may be considered within the framework of machine learning. The first stage, the calibration, is performed off-line and is devoted to signal processing, the determination of the features, and the training of a classifier. The second stage, performed in real time, is the test on new data. PCA is applied to avoid redundancy in the set of features, whereas the classification of the selected features, and therefore of the signals, is obtained by the SVM.

    Results: Experimental tests have been conducted on EEG signals for a binary BCI based on self-induced disgust produced by remembering an unpleasant odor. Since the literature shows that this emotion mainly involves the right hemisphere, and in particular the T8 channel, the classification procedure is tested using just T8, although the average accuracy is also calculated and reported for the whole set of measured channels.

    Conclusions: The classification results are encouraging, with an average success rate above 90% across the examined subjects. Ongoing work includes applying the proposed procedure to map a larger set of emotions with EEG and establishing the EEG headset with the minimal number of channels that allows the recognition of a significant range of emotions, both in the field of affective computing and in the development of auxiliary communication tools for subjects affected by severe disabilities.
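
    A minimal Python sketch of the two-stage structure described above (off-line calibration with wavelet features, PCA, and SVM training, followed by real-time classification of new epochs) is given below. The wavelet family, decomposition level, and retained variance are assumptions, not the paper's exact parameters.

        # Two-stage sketch: wavelet features + PCA + SVM; parameters are assumptions.
        import numpy as np
        import pywt
        from sklearn.decomposition import PCA
        from sklearn.pipeline import make_pipeline
        from sklearn.svm import SVC

        def wavelet_features(epoch, wavelet="db4", level=4):
            """Energy of each wavelet sub-band of one EEG epoch (e.g. channel T8)."""
            coeffs = pywt.wavedec(epoch, wavelet, level=level)
            return [np.sum(c ** 2) for c in coeffs]

        def calibrate(epochs, labels, retained_variance=0.95):
            """Off-line stage: extract features, reduce redundancy with PCA, train the SVM."""
            X = np.array([wavelet_features(e) for e in epochs])
            model = make_pipeline(PCA(n_components=retained_variance), SVC(kernel="rbf"))
            return model.fit(X, labels)

        def classify_online(model, epoch):
            """Real-time stage: classify one newly acquired epoch."""
            return model.predict([wavelet_features(epoch)])[0]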

    Data-driven multivariate and multiscale methods for brain computer interface

    This thesis focuses on the development of data-driven multivariate and multiscale methods for brain computer interface (BCI) systems. The electroencephalogram (EEG), the most convenient means to measure neurophysiological activity due to its noninvasive nature, is mainly considered. The nonlinearity and nonstationarity inherent in EEG and its multichannel recording nature require a new set of data-driven multivariate techniques to estimate features more accurately for enhanced BCI operation. A longer-term goal is to enable an alternative EEG recording strategy that supports long-term and portable monitoring.

    Empirical mode decomposition (EMD) and local mean decomposition (LMD), fully data-driven adaptive tools, are considered to decompose the nonlinear and nonstationary EEG signal into a set of components which are highly localised in time and frequency. It is shown that the complex and multivariate extensions of EMD, which can exploit common oscillatory modes within multivariate (multichannel) data, can be used to accurately estimate and compare the amplitude and phase information among multiple sources, a key requirement for feature extraction in BCI systems. A complex extension of local mean decomposition is also introduced and its operation is illustrated on two-channel neuronal spike streams.

    Common spatial pattern (CSP), a standard feature extraction technique for BCI applications, is also extended to the complex domain using the augmented complex statistics. Depending on the circularity or noncircularity of a complex signal, one of the complex CSP algorithms can be chosen to produce the best classification performance between two different EEG classes.

    Using these complex and multivariate algorithms, two cognitive brain studies are investigated for more natural and intuitive design of advanced BCI systems. Firstly, a Yarbus-style auditory selective attention experiment is introduced to measure the user's attention to a sound source among a mixture of sound stimuli, aimed at improving the usefulness of hearing instruments such as hearing aids. Secondly, emotion experiments elicited by taste and taste recall are examined to determine the pleasantness or unpleasantness of a food for the implementation of affective computing. The separation between the two emotional responses is examined using real- and complex-valued common spatial pattern methods.

    Finally, we introduce a novel approach to brain monitoring based on EEG recordings from within the ear canal, embedded in a custom-made hearing aid earplug. The new platform promises the possibility of both short- and long-term continuous use for standard brain monitoring and interfacing applications.
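
    For reference, the Python sketch below implements the standard real-valued common spatial pattern computation that the thesis extends to the complex domain: spatial filters are obtained from a generalised eigendecomposition of the two class covariance matrices, and the log-variance of the filtered signal serves as the feature. Regularisation and filter-selection details are simplified assumptions.

        # Standard real-valued CSP baseline; regularisation and filter count are assumptions.
        import numpy as np
        from scipy.linalg import eigh

        def csp_filters(trials_a, trials_b, n_filters=3):
            """trials_*: (n_trials, n_channels, n_samples) EEG for the two classes.
            Returns spatial filters contrasting the variance of class A against class B."""
            def mean_cov(trials):
                return np.mean([np.cov(t) for t in trials], axis=0)

            Ca, Cb = mean_cov(trials_a), mean_cov(trials_b)
            # Generalised eigenvalue problem: Ca w = lambda (Ca + Cb) w
            eigvals, eigvecs = eigh(Ca, Ca + Cb)
            W = eigvecs[:, np.argsort(eigvals)[::-1]]
            # Keep filters from both ends of the eigenvalue spectrum
            return np.hstack([W[:, :n_filters], W[:, -n_filters:]])

        def csp_features(trial, W):
            """Normalised log-variance of the CSP-filtered trial, a common BCI feature."""
            Z = W.T @ trial
            var = Z.var(axis=1)
            return np.log(var / var.sum())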