Generalized Sparse Discriminant Analysis for Event-Related Potential Classification
A brain-computer interface (BCI) is a system that provides direct communication between a person's mind and the outside world using only brain activity (EEG). The event-related potential (ERP)-based BCI problem is a binary pattern recognition task. Linear discriminant analysis (LDA) is widely used to solve this type of classification problem, but it fails when the number of features is large relative to the number of observations. In this work we propose a penalized version of sparse discriminant analysis (SDA), called generalized sparse discriminant analysis (GSDA), for binary classification. This method inherits both the discriminative feature selection and classification properties of SDA, and it also improves on SDA's performance through the addition of Kullback-Leibler class discrepancy information. The GSDA method is designed to automatically select the optimal regularization parameters. Numerical experiments with two real ERP-EEG datasets show that, on the one hand, GSDA outperforms standard SDA in terms of classification performance, sparsity and required computing time, and, on the other hand, it also yields better overall performance than well-known ERP classification algorithms for single-trial ERP classification when insufficient training samples are available. Hence, GSDA constitutes a potentially useful method for reducing calibration times in ERP-based BCI systems.
Authors: Victoria Peterson, Hugo Leonardo Rufiner and Ruben Daniel Spies (CONICET; Instituto de Investigación en Señales, Sistemas e Inteligencia Computacional, Universidad Nacional del Litoral; Instituto de Matemática Aplicada del Litoral; Universidad Nacional de Entre Ríos; Argentina)
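As context for why regularization matters here, the sketch below illustrates the failure mode the abstract mentions: when features outnumber observations, the pooled covariance is singular and plain LDA breaks down, while a shrinkage-regularized covariance keeps the discriminant solvable. This is a minimal NumPy sketch of generic shrinkage LDA, not the GSDA algorithm itself; the function name, shrinkage weight, and synthetic data are all illustrative.

```python
import numpy as np

def shrinkage_lda(X, y, lam=0.5):
    """Fisher LDA direction with a ridge-type shrinkage covariance.

    Sigma_hat = (1 - lam) * S_pooled + lam * I stays invertible
    even when n_features > n_samples.
    """
    X0, X1 = X[y == 0], X[y == 1]
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    # Pooled within-class scatter (singular when p > n).
    S = (np.cov(X0, rowvar=False) * (len(X0) - 1)
         + np.cov(X1, rowvar=False) * (len(X1) - 1)) / (len(X) - 2)
    Sigma = (1 - lam) * S + lam * np.eye(X.shape[1])
    w = np.linalg.solve(Sigma, mu1 - mu0)   # discriminant direction
    b = -0.5 * w @ (mu0 + mu1)              # threshold at the class midpoint
    return w, b

rng = np.random.default_rng(0)
n, p = 40, 100                              # fewer "trials" than features
X = rng.normal(size=(n, p))
y = (rng.random(n) < 0.5).astype(int)
X[y == 1, 0] += 2.0                         # one informative feature
w, b = shrinkage_lda(X, y, lam=0.5)
pred = (X @ w + b > 0).astype(int)
print((pred == y).mean())
```

Without the shrinkage term (`lam=0`), `np.linalg.solve` would fail here because the 100x100 pooled covariance built from 40 samples is rank-deficient.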
Discovering Gender Differences in Facial Emotion Recognition via Implicit Behavioral Cues
We examine the utility of implicit behavioral cues in the form of EEG brain
signals and eye movements for gender recognition (GR) and emotion recognition
(ER). Specifically, the examined cues are acquired via low-cost, off-the-shelf
sensors. We asked 28 viewers (14 female) to recognize emotions from unoccluded
(no mask) as well as partially occluded (eye and mouth masked) emotive faces.
The experimental results reveal that (a) reliable GR and ER are achievable
with EEG and eye features, (b) differential cognitive processing, especially
for negative emotions, is observed for males and females, and (c) some of
these cognitive differences manifest under partial face occlusion, as typified
by the eye- and mouth-mask conditions.
Comment: To be published in the Proceedings of the Seventh International
Conference on Affective Computing and Intelligent Interaction.
ICLabel: An automated electroencephalographic independent component classifier, dataset, and website
The electroencephalogram (EEG) provides a non-invasive, minimally
restrictive, and relatively low cost measure of mesoscale brain dynamics with
high temporal resolution. Although signals recorded in parallel by multiple,
near-adjacent EEG scalp electrode channels are highly correlated and combine
signals from many different sources, biological and non-biological, independent
component analysis (ICA) has been shown to isolate the various source generator
processes underlying those recordings. Independent components (IC) found by ICA
decomposition can be manually inspected, selected, and interpreted, but doing
so requires both time and practice as ICs have no particular order or intrinsic
interpretations and therefore require further study of their properties.
Alternatively, sufficiently accurate automated IC classifiers can be used to
classify ICs into broad source categories, speeding the analysis of EEG studies
with many subjects and enabling the use of ICA decomposition in near-real-time
applications. While many such classifiers have been proposed recently, this
work presents the ICLabel project, comprising (1) an IC dataset containing
spatiotemporal measures for over 200,000 ICs from more than 6,000 EEG
recordings, (2) a website for collecting crowdsourced IC labels and educating
EEG researchers and practitioners about IC interpretation, and (3) the
automated ICLabel classifier. The classifier improves upon existing methods in
two ways: by improving the accuracy of the computed label estimates and by
enhancing its computational efficiency. The ICLabel classifier outperforms or
performs comparably to the previous best publicly available method for all
measured IC categories while computing those labels ten times faster than that
classifier, as shown in a rigorous comparison against all other publicly
available EEG IC classifiers.
Comment: Intended for NeuroImage. Updated from version one with minor
editorial and figure changes.
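The core operation ICLabel builds on, ICA unmixing of correlated channel recordings into independent source processes, can be sketched on toy data. The snippet below is a bare-bones FastICA (tanh nonlinearity, deflation) in NumPy applied to two simulated "electrode channels"; it is an illustration of the decomposition idea only, not the ICLabel pipeline, and the mixing matrix and sources are made up.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two toy "source generator processes": a cosine and a sawtooth.
t = np.linspace(0, 8, 2000)
S = np.c_[np.cos(2 * np.pi * 5 * t), 2 * (t % 1) - 1]
S -= S.mean(axis=0)

A = np.array([[0.8, 0.3], [0.4, 0.7]])    # mixing, like volume conduction
X = S @ A.T                                # two correlated "channels"

# Whiten the mixed channels.
X -= X.mean(axis=0)
d, E = np.linalg.eigh(np.cov(X, rowvar=False))
Z = X @ E @ np.diag(1 / np.sqrt(d)) @ E.T

# FastICA with tanh nonlinearity, deflation scheme.
W = np.zeros((2, 2))
for i in range(2):
    w = rng.normal(size=2)
    for _ in range(200):
        g = np.tanh(Z @ w)
        w_new = (Z * g[:, None]).mean(axis=0) - (1 - g ** 2).mean() * w
        w_new -= W[:i].T @ (W[:i] @ w_new)  # deflate against found rows
        w_new /= np.linalg.norm(w_new)
        converged = abs(abs(w_new @ w) - 1) < 1e-8
        w = w_new
        if converged:
            break
    W[i] = w

recovered = Z @ W.T   # estimated independent components
```

The recovered components match the original sources up to sign and permutation, which is exactly why automated labeling is needed downstream: ICA itself assigns no order or interpretation to its components.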
Converting Your Thoughts to Texts: Enabling Brain Typing via Deep Feature Learning of EEG Signals
An electroencephalography (EEG) based Brain Computer Interface (BCI) enables
people to communicate with the outside world by interpreting the EEG signals of
their brains to interact with devices such as wheelchairs and intelligent
robots. More specifically, motor imagery EEG (MI-EEG), which reflects a
subject's active intent, is attracting increasing attention for a variety of
BCI applications. Accurate classification of MI-EEG signals, while essential
for the effective operation of BCI systems, is challenging due to the
significant noise inherent in the signals and the lack of informative
correlation between the signals and brain activities. In this paper, we
propose a novel deep neural network based learning framework that affords
perceptive insights into the relationship between the MI-EEG data and brain
activities. We design a joint convolutional recurrent neural network that
simultaneously learns robust high-level feature representations through
low-dimensional dense embeddings from raw MI-EEG signals. We also employ an
autoencoder layer to eliminate various artifacts such as background
activities. The proposed approach has been evaluated extensively on a
large-scale public MI-EEG dataset and a limited but easy-to-deploy dataset
collected in our lab. The results show that our approach outperforms a series
of baselines and competitive state-of-the-art methods, yielding a
classification accuracy of 95.53%. The applicability of our proposed approach
is further demonstrated with a practical BCI system for typing.
Comment: 10 pages
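The joint convolutional-recurrent idea can be illustrated schematically: temporal convolution extracts local features from each channel, and a recurrent pass over the resulting per-timestep features yields a fixed-length trial embedding. The NumPy sketch below uses random, untrained weights and invented shapes purely to show the data flow; it is not the paper's network or its autoencoder stage.

```python
import numpy as np

rng = np.random.default_rng(0)
n_channels, n_samples = 8, 250       # one simulated MI-EEG trial
x = rng.normal(size=(n_channels, n_samples))

# --- Convolutional stage: temporal filters shared across channels ---
n_filters, k = 4, 9
kernels = rng.normal(size=(n_filters, k)) / np.sqrt(k)

def conv1d_valid(signal, kernel):
    """Valid-mode 1-D convolution via a sliding-window view."""
    win = np.lib.stride_tricks.sliding_window_view(signal, len(kernel))
    return win @ kernel[::-1]

# Feature maps: (n_filters, n_channels, n_samples - k + 1), then ReLU.
fmap = np.stack([[conv1d_valid(ch, kern) for ch in x] for kern in kernels])
fmap = np.maximum(fmap, 0.0)

# Collapse channels into per-timestep feature vectors for the RNN.
seq = fmap.mean(axis=1).T            # (timesteps, n_filters)

# --- Recurrent stage: vanilla RNN, h_t = tanh(W x_t + U h_{t-1}) ---
hidden = 16
W = rng.normal(size=(hidden, n_filters)) * 0.1
U = rng.normal(size=(hidden, hidden)) * 0.1
h = np.zeros(hidden)
for x_t in seq:
    h = np.tanh(W @ x_t + U @ h)

embedding = h                        # fixed-length trial embedding
print(embedding.shape)
```

In a trained system this embedding would feed a softmax over motor imagery classes; here it only demonstrates how a variable-length multichannel recording becomes a low-dimensional dense vector.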
Temporal precedence of emotion over attention modulations in the lateral amygdala: Intracranial ERP evidence from a patient with temporal lobe epilepsy
Previous fMRI studies have reported mixed evidence for the influence of selective attention on amygdala responses to emotional stimuli, with some studies showing "automatic" emotional effects for threat-related stimuli without attention (or even without awareness), but others showing a gating of amygdala activity by selective attention, with no response to unattended stimuli. We recorded intracranial local field potentials from the intact left lateral amygdala of a human patient prior to surgery for epilepsy and tested, with millisecond time resolution, for neural responses to fearful faces appearing at either task-relevant or task-irrelevant locations. Our results revealed an early emotional effect in the amygdala arising prior to, and independently of, attentional modulation. However, at a later latency, we found a significant modulation of the differential emotional response when attention was directed toward or away from fearful faces. These results suggest separate influences of emotion and attention on amygdala activation and may help reconcile previous discrepancies concerning the relative responsiveness of the human amygdala to emotional and attentional factors.
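The analysis logic underlying this kind of result is the standard ERP approach: average event-locked epochs per condition, form a difference wave, and compare it across time windows. The NumPy sketch below runs that logic on simulated data with an early effect built in; the window bounds, sampling rate, and effect latency are illustrative, not the patient data.

```python
import numpy as np

rng = np.random.default_rng(2)
fs = 1000                                    # 1 kHz -> millisecond resolution
t = np.arange(0, 0.6, 1 / fs)                # 0-600 ms post-stimulus

def simulate_epochs(n_trials, effect_ms, amp):
    """Noisy single-trial epochs with a Gaussian deflection at effect_ms."""
    bump = amp * np.exp(-((t - effect_ms / 1000) ** 2) / (2 * 0.02 ** 2))
    return bump + rng.normal(0, 1.0, size=(n_trials, t.size))

# Fearful vs neutral faces: build in an early "emotional" deflection only.
erp_fear = simulate_epochs(60, effect_ms=150, amp=3.0).mean(axis=0)
erp_neut = simulate_epochs(60, effect_ms=150, amp=0.0).mean(axis=0)

diff = erp_fear - erp_neut                   # differential emotional response

def window_mean(x, lo_ms, hi_ms):
    m = (t >= lo_ms / 1000) & (t < hi_ms / 1000)
    return x[m].mean()

early = window_mean(diff, 100, 200)          # early window: emotion effect
late = window_mean(diff, 400, 500)           # later window: nothing built in
print(early, late)
```

Averaging 60 trials shrinks the trial-level noise enough that the early-window effect stands out while the late window stays near zero, mirroring how latency-resolved condition contrasts are quantified.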
Feature selection for EEG-based biometrics
Department of Human Factors Engineering.
EEG-based biometrics identify individuals by using personal and distinctive information in the human brain. This thesis aims to evaluate electroencephalography (EEG) features and channels for biometrics and to propose a methodology that identifies individuals. In my research, I recorded fourteen EEG channel signals from thirty subjects. While recording the EEG signals, subjects were asked to relax and keep their eyes closed for 2 minutes. In addition, to evaluate intra-individual variability, I recorded EEG ten times for each subject, and every recording was conducted on a different day to reduce within-day effects. After acquisition of the data, for each channel I calculated eight features: alpha/beta power ratio, alpha/theta power ratio, beta/theta power ratio, median frequency, PSD entropy, permutation entropy, sample entropy, and maximum Lyapunov exponent. Then, I scored the 112 features with three feature selection algorithms: Fisher score, ReliefF, and information gain. Finally, I classified the EEG data using linear discriminant analysis (LDA) with a leave-one-out cross-validation method. As a result, the best feature set was composed of the 23 features ranked highest by Fisher score and yielded an 18.56% half total error rate. In addition, according to the feature selection scores, the EEG channels located over the occipital and right temporal areas contributed most to identifying individuals. Thus, with the suggested methodologies and channels, implementation of efficient EEG-based biometrics is possible.
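Fisher-score ranking, one of the three selectors used in the thesis, scores each feature by between-class variance over within-class variance. A generic NumPy sketch on synthetic data (the variable names and the 3-subject setup are invented for illustration):

```python
import numpy as np

def fisher_scores(X, y):
    """Fisher score per feature:
    sum_c n_c (mu_c - mu)^2 / sum_c n_c var_c."""
    mu = X.mean(axis=0)
    num = np.zeros(X.shape[1])
    den = np.zeros(X.shape[1])
    for c in np.unique(y):
        Xc = X[y == c]
        num += len(Xc) * (Xc.mean(axis=0) - mu) ** 2
        den += len(Xc) * Xc.var(axis=0)
    return num / den

# Synthetic "recordings": 3 subjects x 10 sessions, 6 features each.
rng = np.random.default_rng(3)
y = np.repeat([0, 1, 2], 10)
X = rng.normal(size=(30, 6))
X[:, 0] += y * 2.0          # feature 0 differs strongly across subjects
X[:, 3] += y * 0.5          # feature 3 differs weakly

scores = fisher_scores(X, y)
ranking = np.argsort(scores)[::-1]   # best feature first
print(ranking)
```

Keeping the top-ranked features (the thesis kept 23 of 112) reduces the dimensionality before the LDA classifier, which, as noted in the GSDA abstract above, degrades when features outnumber observations.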
Towards emotion recognition for virtual environments: an evaluation of EEG features on a benchmark dataset
One of the challenges in virtual environments is the difficulty users have in interacting with these increasingly complex systems. Ultimately, endowing machines with the ability to perceive users' emotions will enable a more intuitive and reliable interaction. Consequently, using the electroencephalogram as a bio-signal sensor, the affective state of a user can be modelled and subsequently utilised in order to achieve a system that can recognise and react to the user's emotions. This paper investigates features extracted from electroencephalogram signals for the purpose of affective state modelling based on Russell's Circumplex Model. Investigations are presented that aim to provide the foundation for future work in modelling user affect to enhance the interaction experience in virtual environments. The DEAP dataset was used within this work, along with a Support Vector Machine and Random Forest, which yielded reasonable classification accuracies for valence and arousal using feature vectors based on statistical measurements, band power of the EEG frequency bands, and Higher Order Crossings of the EEG signal.
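Band-power features of the kind evaluated here are computed from the signal's spectrum. Below is a minimal NumPy sketch on a simulated one-channel window; the band edges, window length, and the simulated 10 Hz rhythm are illustrative assumptions, not the paper's exact pipeline (the DEAP preprocessed sampling rate of 128 Hz is the only detail taken from the dataset).

```python
import numpy as np

fs = 128                                   # DEAP's downsampled rate
t = np.arange(0, 4, 1 / fs)                # one 4-second window
rng = np.random.default_rng(4)
# Simulated channel: strong 10 Hz (alpha-band) rhythm plus noise.
x = 2.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.5, t.size)

def band_power(x, fs, lo, hi):
    """Mean power in [lo, hi) Hz from the FFT periodogram."""
    freqs = np.fft.rfftfreq(x.size, 1 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / x.size
    m = (freqs >= lo) & (freqs < hi)
    return psd[m].mean()

bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
features = {name: band_power(x, fs, lo, hi)
            for name, (lo, hi) in bands.items()}
print(max(features, key=features.get))
```

One such vector per channel and window, optionally alongside statistical and Higher Order Crossings features, is what a Support Vector Machine or Random Forest would then classify into valence/arousal labels.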