
    On the analysis of EEG power, frequency and asymmetry in Parkinson's disease during emotion processing

    Objective: While Parkinson’s disease (PD) has traditionally been described as a movement disorder, there is growing evidence of disruption in emotion information processing associated with the disease. The aim of this study was to investigate whether there are specific electroencephalographic (EEG) characteristics that discriminate PD patients from normal controls during emotion information processing. Method: EEG recordings from 14 scalp sites were collected from 20 PD patients and 30 age-matched normal controls. Multimodal (audio-visual) stimuli were presented to evoke specific targeted emotional states such as happiness, sadness, fear, anger, surprise and disgust. Absolute and relative power, frequency and asymmetry measures derived from spectrally analyzed EEGs were subjected to repeated measures ANOVA for group comparisons, as well as to discriminant function analysis to examine their utility as classification indices. In addition, subjective ratings were obtained for the emotional stimuli used. Results: Behaviorally, PD patients showed no impairments in emotion recognition as measured by subjective ratings. Compared with normal controls, PD patients evidenced smaller overall relative delta, theta, alpha and beta power, and at bilateral anterior regions smaller absolute theta, alpha, and beta power and higher mean total spectrum frequency across different emotional states. Inter-hemispheric theta, alpha, and beta power asymmetry index differences were noted, with controls exhibiting greater right than left hemisphere activation, whereas intra-hemispheric alpha power asymmetry was reduced in patients bilaterally at all regions. Discriminant analysis correctly classified 95.0% of the patients and controls during emotional stimuli. Conclusion: These distributed spectral powers in different frequency bands might provide meaningful information about emotional processing in PD patients.
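
    The band-power and asymmetry measures described above can be computed from spectrally analyzed EEG in a few lines. Below is a minimal sketch, not the authors' code: the 128 Hz sampling rate, the band limits, the F3/F4 channel pair and the synthetic signals are all assumptions for illustration.

```python
# Minimal sketch (assumptions: 128 Hz sampling rate, band limits, channel pair, synthetic data)
import numpy as np
from scipy.signal import welch

FS = 128  # assumed sampling rate in Hz
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(x, fs=FS):
    """Absolute and relative power per band for a 1-D EEG segment (Welch PSD)."""
    freqs, psd = welch(x, fs=fs, nperseg=2 * fs)
    df = freqs[1] - freqs[0]
    total = psd[(freqs >= 1) & (freqs <= 30)].sum() * df
    out = {}
    for name, (lo, hi) in BANDS.items():
        abs_p = psd[(freqs >= lo) & (freqs < hi)].sum() * df
        out[name] = {"absolute": abs_p, "relative": abs_p / total}
    return out

def asymmetry_index(left, right, band="alpha", fs=FS):
    """ln(right-hemisphere power) - ln(left-hemisphere power) for one band."""
    pl = band_powers(left, fs)[band]["absolute"]
    pr = band_powers(right, fs)[band]["absolute"]
    return np.log(pr) - np.log(pl)

# Toy usage with synthetic signals standing in for an F3/F4 pair
rng = np.random.default_rng(0)
f3, f4 = rng.standard_normal(10 * FS), rng.standard_normal(10 * FS)
print(band_powers(f3)["alpha"], asymmetry_index(f3, f4))
```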

    Optimal set of EEG features for emotional state classification and trajectory visualization in Parkinson's disease

    In addition to classic motor signs and symptoms, individuals with Parkinson's disease (PD) are characterized by emotional deficits. Ongoing brain activity can be recorded by electroencephalography (EEG) to discover the links between emotional states and brain activity. This study utilized machine-learning algorithms to categorize emotional states in PD patients compared with healthy controls (HC) using EEG. Twenty non-demented PD patients and 20 healthy age-, gender-, and education level-matched controls viewed happiness, sadness, fear, anger, surprise, and disgust emotional stimuli while fourteen-channel EEG was recorded. Multimodal stimuli (a combination of audio and visual) were used to evoke the emotions. To classify the EEG-based emotional states and visualize the changes of emotional states over time, this paper compares four kinds of EEG features for emotional state classification and proposes an approach to track the trajectory of emotion changes with manifold learning. From the experimental results on our EEG data set, we found that (a) the bispectrum feature is superior to the other three kinds of features, namely power spectrum, wavelet packet and nonlinear dynamical analysis; (b) the higher frequency bands (alpha, beta and gamma) play a more important role in emotion activities than the lower frequency bands (delta and theta) in both groups; and (c) the trajectory of emotion changes can be visualized by reducing subject-independent features with manifold learning. This provides a promising way of visualizing a patient's emotional state in real time and leads towards a practical system for noninvasive assessment of the emotional impairments associated with neurological disorders.
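
    As a rough illustration of the trajectory-visualization idea, the sketch below extracts per-epoch band-power features (a simpler stand-in for the bispectrum features the study favours) and projects them to two dimensions with a manifold-learning method (Isomap). The sampling rate, epoch length, channel count and synthetic data are assumptions.

```python
# Minimal sketch (assumptions: sampling rate, epoch length, channel count, synthetic data,
# band power instead of the bispectrum features used in the study)
import numpy as np
from scipy.signal import welch
from sklearn.manifold import Isomap

FS, EPOCH_S, N_CH = 128, 6, 14  # assumed sampling rate, epoch length (s), channel count
BANDS = {"alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}  # higher-frequency bands

def epoch_features(epoch, fs=FS):
    """Mean band power per channel for one (channels x samples) epoch, flattened."""
    freqs, psd = welch(epoch, fs=fs, nperseg=fs, axis=-1)
    feats = [psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=-1) for lo, hi in BANDS.values()]
    return np.concatenate(feats)

# Synthetic multi-channel recording standing in for a continuous EEG session
rng = np.random.default_rng(1)
eeg = rng.standard_normal((N_CH, 120 * FS))
epochs = eeg.reshape(N_CH, -1, EPOCH_S * FS).transpose(1, 0, 2)  # (n_epochs, channels, samples)

X = np.vstack([epoch_features(e) for e in epochs])
trajectory = Isomap(n_components=2, n_neighbors=5).fit_transform(X)
print(trajectory.shape)  # one 2-D point per epoch, ordered in time
```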

    Inter-hemispheric EEG coherence analysis in Parkinson's disease: Assessing brain activity during emotion processing

    Parkinson’s disease (PD) is not only characterized by its prominent motor symptoms but is also associated with disturbances in cognitive and emotional functioning. The objective of the present study was to investigate the influence of emotion processing on inter-hemispheric electroencephalography (EEG) coherence in PD. Multimodal emotional stimuli (happiness, sadness, fear, anger, surprise, and disgust) were presented to 20 PD patients and 30 age-, education level-, and gender-matched healthy controls (HC) while EEG was recorded. Inter-hemispheric coherence was computed from seven homologous EEG electrode pairs (AF3–AF4, F7–F8, F3–F4, FC5–FC6, T7–T8, P7–P8, and O1–O2) for the delta, theta, alpha, beta, and gamma frequency bands. In addition, subjective ratings were obtained for a representative set of the emotional stimuli. Inter-hemispherically, PD patients showed significantly lower coherence in the theta, alpha, beta, and gamma frequency bands than HC during emotion processing; no significant differences were found in delta band coherence. We also found that PD patients were more impaired in recognizing negative emotions (sadness, fear, anger, and disgust) than relatively positive emotions (happiness and surprise). Behaviorally, PD patients did not show impairment in emotion recognition as measured by subjective ratings. These findings suggest that PD patients may have an impairment of inter-hemispheric functional connectivity (i.e., a decline in cortical connectivity) during emotion processing. This study may raise awareness of EEG emotional response studies in clinical practice as a way to uncover potential neurophysiologic abnormalities.
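
    A minimal sketch of the inter-hemispheric coherence measure described above, assuming a 128 Hz sampling rate, conventional band limits and synthetic signals in place of real recordings; the electrode pairs are the ones listed in the abstract.

```python
# Minimal sketch (assumptions: sampling rate, band limits, synthetic signals)
import numpy as np
from scipy.signal import coherence

FS = 128  # assumed sampling rate in Hz
PAIRS = [("AF3", "AF4"), ("F7", "F8"), ("F3", "F4"), ("FC5", "FC6"),
         ("T7", "T8"), ("P7", "P8"), ("O1", "O2")]
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def band_coherence(x, y, fs=FS):
    """Mean magnitude-squared coherence of two channels within each frequency band."""
    freqs, coh = coherence(x, y, fs=fs, nperseg=2 * fs)
    return {b: coh[(freqs >= lo) & (freqs < hi)].mean() for b, (lo, hi) in BANDS.items()}

# Toy usage: synthetic signals keyed by electrode label
rng = np.random.default_rng(2)
signals = {ch: rng.standard_normal(30 * FS) for pair in PAIRS for ch in pair}
for left, right in PAIRS:
    print(left, right, band_coherence(signals[left], signals[right]))
```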

    A Review on EEG Signals Based Emotion Recognition

    Emotion recognition has become a prominent and much-debated topic in brain-computer interfaces (BCIs), and numerous studies have been conducted to recognize emotions. There are also several important definitions of and theories about human emotions. In this paper we try to cover the important topics related to the field of emotion recognition. We review several studies that are based on analyzing electroencephalogram (EEG) signals as a biological marker of emotion changes. Considering its low cost and good temporal and spatial resolution, EEG has become very common and is widely used in most BCI applications and studies. First, we state some theories and basic definitions related to emotions. Then some important steps of an emotion recognition system are described, such as the different kinds of biological measurements (EEG, electrocardiogram [ECG], respiration rate, etc.), offline versus online recognition methods, emotion stimulation types and common emotion models. Finally, the most recent and important studies are reviewed.

    Detection of emotions in Parkinson's disease using higher order spectral features from brain's electrical activity

    Non-motor symptoms of Parkinson's disease (PD) involving cognition and emotion have been receiving progressively more attention in recent times. Electroencephalogram (EEG) signals, being an activity of the central nervous system, can reflect the underlying true emotional state of a person. This paper presents a computational framework for classifying PD patients against healthy controls (HC) using emotional information from the brain's electrical activity.
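
    For illustration only (not the authors' pipeline), the sketch below estimates a bispectrum with the direct FFT-based method and derives two simple higher-order-spectra features from it; the segment length, windowing and synthetic input are assumptions.

```python
# Illustrative sketch only (assumptions: direct FFT-based estimator, Hann window,
# segment length, synthetic input); not the authors' feature set.
import numpy as np

def bispectrum(x, seg_len=256):
    """Average X(f1) * X(f2) * conj(X(f1 + f2)) over non-overlapping segments."""
    n_seg = len(x) // seg_len
    half = seg_len // 2
    B = np.zeros((half, half), dtype=complex)
    f2 = np.arange(half)
    for s in range(n_seg):
        seg = x[s * seg_len:(s + 1) * seg_len]
        X = np.fft.fft(seg * np.hanning(seg_len))
        for f1 in range(half):
            B[f1, :] += X[f1] * X[f2] * np.conj(X[f1 + f2])
    return np.abs(B) / n_seg

def hos_features(x):
    """Two simple higher-order-spectra features: mean magnitude and bispectral entropy."""
    B = bispectrum(x)
    p = B / B.sum()                       # normalize magnitudes to a distribution
    entropy = -(p * np.log(p + 1e-12)).sum()
    return {"mean_magnitude": B.mean(), "bispectral_entropy": entropy}

# Toy usage on a synthetic EEG-like signal
rng = np.random.default_rng(3)
print(hos_features(rng.standard_normal(4 * 256)))
```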

    Emotion classification in Parkinson's disease by higher-order spectra and power spectrum features using EEG signals: A comparative study

    Deficits in the ability to process emotions characterize several neuropsychiatric disorders and are traits of Parkinson's disease (PD), and there is a need for a method of quantifying emotion, which is currently assessed by clinical diagnosis. Electroencephalogram (EEG) signals, being an activity of the central nervous system (CNS), can reflect the underlying true emotional state of a person. This study applied machine-learning algorithms to categorize EEG emotional states in PD patients, classifying six basic emotions (happiness, sadness, fear, anger, surprise and disgust) in comparison with healthy controls (HC). Emotional EEG data were recorded from 20 PD patients and 20 healthy age-, education level- and sex-matched controls using multimodal (audio-visual) stimuli. The use of nonlinear features motivated by higher-order spectra (HOS) has been reported to be a promising approach to classifying emotional states. In this work, we carried out a comparative study of the performance of k-nearest neighbor (kNN) and support vector machine (SVM) classifiers using features derived from HOS and from the power spectrum. Analysis of variance (ANOVA) showed that the power spectrum and HOS based features were statistically significant across the six emotional states (p < 0.0001). Classification results show that using the selected HOS based features instead of power spectrum based features provided comparatively better accuracy for all six classes, with an overall accuracy of 70.10% ± 2.83% for PD patients and 77.29% ± 1.73% for HC in the beta (13-30 Hz) band using the SVM classifier. In addition, PD patients achieved lower accuracy in the processing of negative emotions (sadness, fear, anger and disgust) than in the processing of positive emotions (happiness, surprise) compared with HC. These results demonstrate the effectiveness of applying machine learning techniques to the classification of emotional states in PD patients in a user-independent manner using EEG signals. The accuracy of the system could be improved by investigating other HOS based features. This study might lead to a practical system for noninvasive assessment of the emotional impairments associated with neurological disorders.
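
    The kNN-versus-SVM comparison can be reproduced in outline with scikit-learn. The sketch below is a generic template under assumed data: the feature matrix and emotion labels are random placeholders rather than HOS or power-spectrum features.

```python
# Generic template (assumptions: random placeholder features and labels, default hyper-parameters)
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

EMOTIONS = ["happiness", "sadness", "fear", "anger", "surprise", "disgust"]

rng = np.random.default_rng(4)
X = rng.standard_normal((240, 40))            # placeholder: 240 epochs x 40 features
y = rng.integers(0, len(EMOTIONS), size=240)  # placeholder emotion labels

models = {
    "kNN": make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5)),
    "SVM (RBF)": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```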

    A Review on the Computational Methods for Emotional State Estimation from the Human EEG

    A growing number of affective computing studies have recently developed computer systems that can recognize the emotional state of a human user in order to establish affective human-computer interactions. Various measures have been used to estimate emotional states, including self-report, startle response, behavioral response, autonomic measurement, and neurophysiologic measurement. Among them, inferring emotional states from electroencephalography (EEG) has received considerable attention, as EEG can directly reflect emotional states at relatively low cost and with simplicity. Yet EEG-based emotional state estimation requires well-designed computational methods to extract information from complex and noisy multichannel EEG data. In this paper, we review the computational methods that have been developed to derive EEG indices of emotion, to extract emotion-related features, or to classify EEG signals into one of many emotional states. We also propose using sequential Bayesian inference to estimate the continuous emotional state in real time. We present current challenges for building an EEG-based emotion recognition system and suggest some future directions.
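
    One simple way to realize the proposed sequential Bayesian inference is a one-dimensional Kalman-filter update over a latent emotional state; the sketch below is illustrative only, with an assumed random-walk state model and an assumed noisy scalar observation derived from EEG.

```python
# Illustrative sketch (assumptions: random-walk state model, linear noisy observation,
# synthetic "valence" signal); one possible form of sequential Bayesian inference.
import numpy as np

def kalman_step(mean, var, obs, proc_var=0.01, obs_var=0.5):
    """One predict/update cycle for a scalar random-walk state with a noisy observation."""
    pred_mean, pred_var = mean, var + proc_var          # predict: uncertainty grows
    gain = pred_var / (pred_var + obs_var)              # update: Kalman gain
    new_mean = pred_mean + gain * (obs - pred_mean)
    new_var = (1 - gain) * pred_var
    return new_mean, new_var

# Toy usage: noisy observations of a slowly drifting latent valence index
rng = np.random.default_rng(5)
true_valence = np.cumsum(rng.normal(0.0, 0.05, size=100))
observations = true_valence + rng.normal(0.0, 0.7, size=100)

mean, var, estimates = 0.0, 1.0, []
for z in observations:
    mean, var = kalman_step(mean, var, z)
    estimates.append(mean)
print(f"RMSE: {np.sqrt(np.mean((np.array(estimates) - true_valence) ** 2)):.3f}")
```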

    Finding Frustration: a Dive into the EEG of Drivers

    Emotion recognition technologies for driving are increasingly used to render automotive travel more pleasurable and, more importantly, safer. Since emotions such as frustration and anger can lead to an increase in traffic accidents, this thesis explored the utility of electroencephalogram (EEG) features for recognizing a driver's frustration level. It therefore sought a balance between the ecologically valid emotion induction of a driving simulator and the noise-sensitive but highly informative measure of the EEG. Participants' brain activity was captured with the CGX quick-30 mobile EEG system. Nineteen participants completed four frustration-inducing and two baseline driving scenarios in a 360° driving simulator, and subsequently rated their frustration level continuously while watching a replay of each scenario. The resulting subjective measures were used to classify EEG time periods into episodes with or without frustration. Results showed that the frequently used Alpha Asymmetry Index (AAI) was, as hypothesized, significantly more negative for high frustration than for no frustration. However, a confounding effect of anger on this result could not be ruled out. The results did not provide evidence for previous, as yet unreplicated, findings of frustration correlates within narrow-band oscillations (delta, theta, alpha, and beta) at specific electrode positions (frontal, central, and posterior). The thesis concludes with suggestions for future research and the practical implications of the insights acquired.
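
    The Alpha Asymmetry Index used in this thesis is commonly computed as the difference of log alpha power between homologous right and left channels (e.g. F4 and F3). The sketch below, with an assumed sampling rate, assumed labels and synthetic episodes, computes the AAI per episode and compares frustration against no-frustration episodes.

```python
# Minimal sketch (assumptions: sampling rate, F3/F4 pair, synthetic episodes, alternating labels)
import numpy as np
from scipy.signal import welch
from scipy.stats import ttest_ind

FS, ALPHA = 500, (8, 13)  # assumed sampling rate and alpha band

def alpha_power(x, fs=FS):
    freqs, psd = welch(x, fs=fs, nperseg=2 * fs)
    mask = (freqs >= ALPHA[0]) & (freqs < ALPHA[1])
    return psd[mask].sum() * (freqs[1] - freqs[0])

def aai(left, right, fs=FS):
    """Alpha Asymmetry Index: ln(alpha power right) - ln(alpha power left)."""
    return np.log(alpha_power(right, fs)) - np.log(alpha_power(left, fs))

# Toy episodes: (left-channel signal, right-channel signal, frustration label)
rng = np.random.default_rng(6)
episodes = [(rng.standard_normal(10 * FS), rng.standard_normal(10 * FS), i % 2)
            for i in range(30)]
frustrated = [aai(l, r) for l, r, label in episodes if label == 1]
baseline = [aai(l, r) for l, r, label in episodes if label == 0]
print(ttest_ind(frustrated, baseline))
```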

    EmoEEG - recognising people's emotions using electroencephalography

    Integrated Master's thesis in Biomedical Engineering and Biophysics (Medical Signals and Images), Universidade de Lisboa, Faculdade de Ciências, 2020.
    Emotions play a central role in human life and are involved in a wide range of cognitive processes such as decision-making, perception, social interaction and intelligence. Brain-computer interfaces (BCIs) are systems that convert a user's patterns of brain activity into messages or commands for a given application. The most common uses of this technology allow people with motor disabilities to control mechanical arms or wheelchairs, or to write. However, BCI technologies can also generate output without any voluntary control, and the identification of emotional states is one example of such feedback. This technology, in turn, can have clinical applications, such as the identification and monitoring of psychological disorders, or multimedia applications that provide access to music or films according to their affective content. The growing interest in establishing emotional interactions between machines and people has created the need for reliable methods of automatic emotion recognition. Self-reports may be unreliable because of the subjective nature of emotions themselves, but also because participants may answer according to what they believe others would answer. Emotional speech is an effective way to infer a person's emotional state, since many speech features are independent of semantics or culture, but its accuracy is still insufficient compared with other methods such as the analysis of facial expressions or physiological signals. Although facial expression analysis has been used to identify emotions successfully, it has drawbacks, such as the fact that many expressions are "forced" and that readings are only possible when the subject's face is within a very specific angle to the camera. For these reasons, the acquisition of physiological signals has been the preferred method for emotion recognition. The EEG (electroencephalogram) lets us monitor felt emotions in the form of electrical impulses from the brain, thereby providing a BCI for affective recognition. The main goal of this work was to study how different elements can be combined to identify affective states, estimating valence and arousal values from EEG signals. The analysis consisted of building several regression models to assess how different elements affect the accuracy of valence and arousal estimation: the machine-learning methods, the subject's gender, the concept of brain asymmetry, the electrode channels used, the feature extraction algorithms and the frequency bands analysed. This analysis allowed us to build the best possible model, with the combination of elements that maximizes its accuracy. To achieve these goals, we used two databases (AMIGOS and DEAP) containing EEG signals acquired during emotion elicitation experiments, together with the participants' self-assessments. In these experiments, participants watched video excerpts with affective content intended to elicit emotions and then rated the level of valence and arousal they experienced.
    The EEG signals were divided into 4 s epochs and features were then extracted with different algorithms: the first, second and third Hjorth parameters; spectral entropy; wavelet energy and entropy; and the energy and entropy of the empirical mode functions obtained through the Hilbert-Huang transform. These signal-processing methods were chosen because they had already produced good results in related work. All of them were applied to the EEG signals within the alpha, beta and gamma frequency bands, which had likewise yielded good results in previous studies. After feature extraction, several models for estimating valence and arousal were built, using the participants' self-assessments as ground truth. The first set of models was used to determine the best machine-learning methods for the subsequent tests. After choosing the two best, we examined differences in emotional processing between the sexes by performing the estimation separately for men and women. The next set of models tested the concept of brain asymmetry, which states that emotional valence is related to differences in physiological activity between the two cerebral hemispheres; for this test, differential and rational asymmetry over homologous electrode pairs were considered. After that, valence and arousal estimation models were built for each electrode individually, i.e. models generated with all the feature extraction methods but using the data from a single electrode. Models were then built to compare each of the feature extraction algorithms; these included the data from all electrodes, since no electrode had previously proved significantly better than the others. Finally, the models with the best possible combination of elements were created, their parameters were optimized and their validation was assessed. We also performed an emotion classification step by assigning each estimated pair of valence and arousal values to the corresponding quadrant of the circumplex model of affect. This last step was needed to compare our work with existing solutions, since most of them only identify the emotional quadrant rather than estimating valence and arousal values. In summary, the best machine-learning methods were RF (random forest) and KNN (k-nearest neighbours), although the best combination of feature extraction methods differed between them: KNN was more accurate with all extraction methods except spectral entropy, whereas RF was more accurate using only the first Hjorth parameter and the wavelet energy. The Pearson correlation coefficients obtained for the best optimized models were between 0.8 and 0.9 (1 being the maximum). No improvement was observed when each gender was considered individually, so the final models were built using the data from all participants; the lower accuracy of the gender-specific models may result from the smaller amount of data available for training.
    The brain asymmetry concept was only useful for the models built with the DEAP database, especially for valence estimation using features extracted from the alpha band. Overall, our approaches were on par with or even superior to other works, reaching accuracies of 86.5% for the best classification model generated with the AMIGOS database and 86.6% with the DEAP database.
    Emotion recognition is a field within affective computing that is gaining increasing relevance and strives to predict an emotional state using physiological signals. Understanding how these biological factors are expressed according to one's emotions can enhance human-computer interaction (HCI). This knowledge can then be used for clinical applications such as the identification and monitoring of psychiatric disorders. It can also be used to provide better access to multimedia content, by assigning affective tags to videos or music. The goal of this work was to create several models for estimating values of valence and arousal, using features extracted from EEG signals. The different models created were meant to compare how various elements affected the accuracy of the model: the machine learning techniques, the gender of the individual, the brain asymmetry concept, the electrode channels, the feature extraction methods and the frequency bands of the brain waves analysed. The final models contained the best combination of these elements and achieved PCC values over 0.80. As a way to compare our work with previous approaches, we also implemented a classification procedure to find the corresponding quadrant in the valence and arousal space according to the circumplex model of affect. The best accuracies achieved were over 86%, which was on par with or even superior to some of the works already done.
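
    Two of the building blocks described above, the Hjorth parameters used as EEG features and the mapping of estimated (valence, arousal) pairs to circumplex quadrants, can be sketched as follows; the epoch length, sampling rate and centring of the rating scale at zero are assumptions.

```python
# Minimal sketch (assumptions: 128 Hz sampling rate, 4 s epoch, ratings centred at zero)
import numpy as np

def hjorth_parameters(x):
    """Return the three Hjorth parameters (activity, mobility, complexity) of a 1-D epoch."""
    dx, ddx = np.diff(x), np.diff(np.diff(x))
    activity = np.var(x)
    mobility = np.sqrt(np.var(dx) / activity)
    complexity = np.sqrt(np.var(ddx) / np.var(dx)) / mobility
    return activity, mobility, complexity

def circumplex_quadrant(valence, arousal):
    """Map an estimated (valence, arousal) pair, centred at 0, to a quadrant label."""
    if valence >= 0:
        return "high-arousal positive" if arousal >= 0 else "low-arousal positive"
    return "high-arousal negative" if arousal >= 0 else "low-arousal negative"

# Toy usage: features from a synthetic 4 s epoch, plus a quadrant assignment
rng = np.random.default_rng(7)
epoch = rng.standard_normal(4 * 128)
print(hjorth_parameters(epoch))
print(circumplex_quadrant(valence=0.3, arousal=-0.6))
```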