Optimal set of EEG features for emotional state classification and trajectory visualization in Parkinson's disease
In addition to classic motor signs and symptoms, individuals with Parkinson's disease (PD) are characterized by emotional deficits. Ongoing brain activity can be recorded with electroencephalography (EEG) to discover the links between emotional states and brain activity. This study used machine-learning algorithms to categorize emotional states in PD patients compared with healthy controls (HC) using EEG. Twenty non-demented PD patients and 20 healthy age-, gender-, and education-level-matched controls viewed happiness, sadness, fear, anger, surprise, and disgust emotional stimuli while fourteen-channel EEG was recorded. Multimodal stimuli (a combination of audio and visual material) were used to evoke the emotions. To classify the EEG-based emotional states and visualize the changes in emotional state over time, this paper compares four kinds of EEG features for emotional state classification and proposes an approach to track the trajectory of emotion changes with manifold learning. From the experimental results on our EEG data set, we found that (a) the bispectrum feature is superior to the other three kinds of features, namely power spectrum, wavelet packet, and nonlinear dynamical analysis; (b) higher frequency bands (alpha, beta, and gamma) play a more important role in emotion activities than lower frequency bands (delta and theta) in both groups; and (c) the trajectory of emotion changes can be visualized by reducing subject-independent features with manifold learning. This provides a promising way of implementing real-time visualization of a patient's emotional state and leads to a practical system for noninvasive assessment of the emotional impairments associated with neurological disorders.
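The trajectory-visualization step described above, reducing per-window EEG feature vectors to a low-dimensional path, can be sketched as follows. This is a minimal illustration only: PCA via NumPy's SVD is used as a simple linear stand-in for the paper's manifold-learning step, and the function name, feature dimensions, and data are hypothetical, not from the paper.

```python
import numpy as np

def emotion_trajectory(features, n_components=2):
    """Project per-window EEG feature vectors down to n_components
    dimensions for trajectory visualization. PCA (via SVD) is a
    linear stand-in for the manifold-learning step in the paper."""
    X = features - features.mean(axis=0)       # center the features
    # SVD of the centered matrix; rows of vt are principal directions
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    return X @ vt[:n_components].T             # (n_windows, n_components)

# Hypothetical example: 100 one-second windows of 56 EEG features
rng = np.random.default_rng(0)
feats = rng.normal(size=(100, 56))
traj = emotion_trajectory(feats)
print(traj.shape)  # (100, 2)
```

Plotting `traj` as a connected line then shows how the emotional state drifts over time in the reduced space.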
Emotions in context: examining pervasive affective sensing systems, applications, and analyses
Pervasive sensing has opened up new opportunities for measuring our feelings and understanding our behavior by monitoring our affective states while mobile. This review paper surveys pervasive affect sensing by examining three major elements of affective pervasive systems, namely sensing, analysis, and application. Sensing investigates the different sensing modalities used in existing real-time affective applications; Analysis explores different approaches to emotion recognition and visualization based on different types of collected data; and Application investigates the leading areas of affective applications. For each of the three aspects, the paper includes an extensive survey of the literature and finally outlines some of the challenges and future research opportunities of affective sensing in the context of pervasive computing.
Multi-modal Approach for Affective Computing
Throughout the past decade, many studies have classified human emotions using only a single sensing modality, such as face video, electroencephalogram (EEG), electrocardiogram (ECG), or galvanic skin response (GSR). The results of these studies are constrained by the limitations of these modalities, such as the absence of physiological biomarkers in face-video analysis, poor spatial resolution in EEG, and poor temporal resolution in GSR. Scant research has been conducted to compare the merits of these modalities and to understand how best to use them individually and jointly. Using the multi-modal AMIGOS dataset, this study compares the performance of human emotion classification using multiple computational approaches applied to face videos and various bio-sensing modalities. Using a novel method for compensating for the physiological baseline, we show an increase in the classification accuracy of the various approaches we use. Finally, we present a multi-modal emotion-classification approach in the domain of affective computing research.
Comment: Published in IEEE 40th International Engineering in Medicine and Biology Conference (EMBC) 2018
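One common form of physiological baseline compensation, subtracting a subject's averaged resting-state features from each trial's features, can be sketched as below. The paper's exact method is not described here; the function name and data are illustrative assumptions only.

```python
import numpy as np

def compensate_baseline(trial_feats, baseline_feats):
    """Subtract a subject's resting-baseline feature average from each
    trial's feature vector. This is one simple baseline-compensation
    scheme; the paper's actual method may differ."""
    baseline = baseline_feats.mean(axis=0)  # average resting-state features
    return trial_feats - baseline           # baseline-referenced features

# Illustrative data: 2 trials and 2 resting windows, 2 features each
trials = np.array([[2.0, 3.0], [4.0, 5.0]])
rest = np.array([[1.0, 1.0], [1.0, 3.0]])
print(compensate_baseline(trials, rest))  # -> [[1., 1.], [3., 3.]]
```

Referencing features to a per-subject baseline removes resting-level offsets between subjects, which is one way such a step can raise cross-subject classification accuracy.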
Teegi: Tangible EEG Interface
We introduce Teegi, a Tangible ElectroEncephaloGraphy (EEG) Interface that enables novice users to learn more about something as complex as brain signals in an easy, engaging, and informative way. To this end, we have designed a new system based on a unique combination of spatial augmented reality, tangible interaction, and real-time neurotechnologies. With Teegi, a user can visualize and analyze his or her own brain activity in real time, on a tangible character that can be easily manipulated and interacted with. An exploration study has shown that interacting with Teegi seems to be easy, motivating, reliable, and informative. Overall, this suggests that Teegi is a promising and relevant training and mediation tool for the general public.
Comment: To appear in UIST, the ACM User Interface Software and Technology Symposium, Oct 2014, Honolulu, United States
Emotion Detection Using Noninvasive Low Cost Sensors
Emotion recognition from biometrics is relevant to a wide range of application domains, including healthcare. Existing approaches usually adopt multi-electrode sensors that can be expensive or uncomfortable to use in real-life situations. In this study, we investigate whether we can reliably recognize high vs. low emotional valence and arousal by relying on noninvasive, low-cost EEG, EMG, and GSR sensors. We report the results of an empirical study involving 19 subjects. We achieve state-of-the-art classification performance for both valence and arousal, even in a cross-subject classification setting, which eliminates the need for individual training and tuning of classification models.
Comment: To appear in Proceedings of ACII 2017, the Seventh International Conference on Affective Computing and Intelligent Interaction, San Antonio, TX, USA, Oct. 23-26, 2017
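The cross-subject setting described above is typically evaluated with leave-one-subject-out cross-validation, where each subject's data is held out in turn. A minimal sketch follows; the function name and example data are illustrative assumptions, not from the paper.

```python
import numpy as np

def loso_splits(subject_ids):
    """Yield (train_idx, test_idx) index pairs for leave-one-subject-out
    cross-validation: each fold tests on one subject's samples and
    trains on everyone else's."""
    subject_ids = np.asarray(subject_ids)
    for s in np.unique(subject_ids):
        test = np.where(subject_ids == s)[0]   # held-out subject's samples
        train = np.where(subject_ids != s)[0]  # all remaining samples
        yield train, test

# Illustrative example: five samples drawn from three subjects
ids = [0, 0, 1, 1, 2]
splits = list(loso_splits(ids))
print(len(splits))  # 3 folds, one per subject
```

Because no sample from the held-out subject appears in training, accuracy under these splits measures exactly the cross-subject generalization the abstract claims.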