
    A Review on the Computational Methods for Emotional State Estimation from the Human EEG

    A growing number of affective computing studies have recently developed computer systems that can recognize the emotional state of a human user in order to establish affective human-computer interactions. Various measures have been used to estimate emotional states, including self-report, startle response, behavioral response, autonomic measurement, and neurophysiologic measurement. Among them, inferring emotional states from electroencephalography (EEG) has received considerable attention, as EEG can directly reflect emotional states at relatively low cost and complexity. Yet EEG-based emotional state estimation requires well-designed computational methods to extract information from complex and noisy multichannel EEG data. In this paper, we review the computational methods that have been developed to derive EEG indices of emotion, to extract emotion-related features, or to classify EEG signals into one of many emotional states. We also propose using sequential Bayesian inference to estimate the continuous emotional state in real time. We present current challenges for building an EEG-based emotion recognition system and suggest some future directions.

    Affect Recognition Using Electroencephalography Features

    Affect is the psychological display of emotion, often described along three principal dimensions: 1) valence, 2) arousal, and 3) dominance. This thesis explores the ability of computers to recognize human emotions using electroencephalography (EEG) features. The development of computer systems that classify human emotions from physiological signals has recently gained pace in the research and technological community, because using EEG to analyze cognitive state makes it possible to establish a direct communication channel between a computer and the human brain. Other applications of recognizing affective states from EEG include identifying stress and cognitive workload in individuals and assisting them with relaxation. This thesis is an extensive study of the design of paradigms that help computer systems recognize emotional states given a multichannel EEG segment. Here, a paradigm refers to the process of first extracting features from the EEG signals using signal processing and then constructing a predictive model via machine learning. We first present a brief review of the state-of-the-art paradigms that have contributed to the topic of emotional affect recognition. The proposed paradigms for recognizing the principal dimensions of affect are then detailed. Feature selection is also performed in order to select the relevant features. The models created to predict the affective states are evaluated quantitatively, by calculating their generalization accuracy, and qualitatively, by interpreting them.
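    The two-stage "paradigm" this abstract describes (signal-processing features, then a predictive model) can be sketched for its first stage as follows. This is a minimal illustration, not the thesis's actual pipeline: the band edges, window length, and synthetic data are assumptions for demonstration.

    ```python
    # Hypothetical sketch of the feature-extraction stage of an EEG "paradigm":
    # per-channel spectral power in canonical frequency bands, computed with
    # Welch's method. The bands and signal parameters are illustrative only.
    import numpy as np
    from scipy.signal import welch

    def band_power(eeg, fs, band):
        """Mean spectral power of each channel within a frequency band.
        eeg: (n_channels, n_samples) array."""
        freqs, psd = welch(eeg, fs=fs, nperseg=fs)
        lo, hi = band
        mask = (freqs >= lo) & (freqs <= hi)
        return psd[:, mask].mean(axis=1)

    def extract_features(eeg, fs):
        """Concatenate per-channel power in theta, alpha, and beta bands."""
        bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
        return np.concatenate([band_power(eeg, fs, b) for b in bands.values()])

    rng = np.random.default_rng(0)
    fs, n_ch = 128, 4
    segment = rng.standard_normal((n_ch, fs * 4))   # 4 s of synthetic "EEG"
    feats = extract_features(segment, fs)           # one value per channel per band
    ```

    The resulting feature vector (here 4 channels x 3 bands = 12 values) would then be passed to whatever predictive model the paradigm specifies.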

    Automatic recognition of personality profiles using EEG functional connectivity during emotional processing

    Personality is the characteristic set of an individual's behavioral and emotional patterns, shaped by biological and environmental factors. Recognizing personality profiles is crucial for making human–computer interaction (HCI) applications realistic, focused, and user friendly. The ability to recognize personality from neuroscientific data also underscores the neurobiological basis of personality. This paper aims to recognize personality automatically by combining scalp electroencephalogram (EEG) recordings and machine learning techniques. As resting-state EEG has not so far proven effective for predicting personality, we used EEG recordings elicited during emotion processing. The study was based on data from the AMIGOS dataset, reflecting the responses of 37 healthy participants. Brain networks and graph-theoretical parameters were extracted from cleaned EEG signals, while each trait score was dichotomized into low and high levels using the k-means algorithm. A feature selection algorithm was then used to reduce the feature set to the 10 best features for describing each trait separately. Support vector machines (SVMs) were finally employed to classify each instance. Our method achieved a classification accuracy of 83.8% for extraversion, 86.5% for agreeableness, 83.8% for conscientiousness, 83.8% for neuroticism, and 73% for openness.
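    The labeling-and-classification step described here (dichotomize trait scores with k-means, then train an SVM on selected features) can be sketched minimally. The data below is synthetic, and the subject count, feature dimensionality, and SVM settings are assumptions standing in for the paper's actual choices.

    ```python
    # Hedged sketch: split continuous trait scores into low/high groups with
    # k-means (k = 2), then fit an SVM on a reduced feature set. All data is
    # synthetic; this is not the study's exact configuration.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.svm import SVC

    rng = np.random.default_rng(42)
    scores = rng.uniform(1, 7, size=(37, 1))      # one trait score per subject
    features = rng.standard_normal((37, 10))      # 10 selected graph-theoretic features

    # Dichotomize the trait into low/high levels.
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)

    # Classify subjects from their connectivity features.
    clf = SVC(kernel="rbf").fit(features, labels)
    train_acc = clf.score(features, labels)
    ```

    In the actual study, accuracy would of course be estimated on held-out data (e.g., cross-validation), not on the training set as in this toy sketch.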

    Fronto-temporal theta phase-synchronization underlies music-evoked pleasantness

    Listening to pleasant music engages a complex distributed network including areas pivotal for auditory, reward, emotional, and memory processing. Frontal theta rhythms, in turn, appear to be relevant to the process of assigning value to music. However, it is not clear to what extent this oscillatory mechanism underlies the brain interactions that characterize music-evoked pleasantness and its related processes. The goal of the present experiment was to study brain synchronization in this oscillatory band as a function of music-evoked pleasantness. EEG was recorded from 25 healthy subjects while they listened to music and rated the degree of pleasantness they experienced. Using a multilevel Bayesian approach, we found that phase synchronization in the theta band between right temporal and frontal signals increased with the degree of pleasure experienced by participants. These results show that slow fronto-temporal loops play a key role in music-evoked pleasantness.
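    A common way to quantify the theta-band phase synchronization this abstract refers to is the phase-locking value (PLV). The sketch below is illustrative only: the filter design, sampling rate, and synthetic signals are assumptions, not the study's analysis pipeline.

    ```python
    # Hedged sketch: theta-band phase-locking value between two channels,
    # via band-pass filtering and the Hilbert-transform analytic phase.
    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    def plv(x, y, fs, band=(4, 8)):
        """Phase-locking value of x and y within `band` (theta by default).
        Returns a value in [0, 1]; 1 means perfectly locked phases."""
        b, a = butter(4, band, btype="bandpass", fs=fs)
        phx = np.angle(hilbert(filtfilt(b, a, x)))
        phy = np.angle(hilbert(filtfilt(b, a, y)))
        return float(np.abs(np.exp(1j * (phx - phy)).mean()))

    fs = 256
    t = np.arange(fs * 4) / fs
    rng = np.random.default_rng(1)
    theta = np.sin(2 * np.pi * 6 * t)                  # shared 6 Hz rhythm
    x = theta + 0.5 * rng.standard_normal(t.size)      # "temporal" channel
    y = theta + 0.5 * rng.standard_normal(t.size)      # "frontal" channel
    sync = plv(x, y, fs)                               # close to 1 here
    ```

    For signals sharing a strong theta rhythm, as in this toy example, the PLV approaches 1; for independent noise it falls toward 0.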

    Gamma oscillations in the temporal pole reflect the contribution of approach and avoidance motivational systems to the processing of fear and anger words

    Prior reports suggest that affective effects in visual word processing cannot be fully explained by a dimensional perspective on emotions based on valence and arousal. In the current study, we focused on the contribution of approach and avoidance motivational systems, which are related to different action components, to the processing of emotional words. To this aim, we compared frontal alpha asymmetries and brain oscillations elicited by anger words, associated with approach (fighting) motivational tendencies, and fear words, which may trigger avoidance (escaping), approach (fighting), or no (freezing) action tendencies. The participants' task was to decide whether to approach or distance themselves from the concepts represented by the words. The results of cluster-based and beamforming analyses revealed increased gamma-band power synchronization for fear words relative to anger words between 725 and 750 ms, with an estimated neural origin in the temporal pole. We interpret these findings as reflecting a conflict between the different action tendencies underlying the representation of fear words in semantic and emotional memory when trying to meet task requirements. These results are in line with the predictions of the fear-hinders-action hypothesis. Additionally, the current data highlight the contribution of motivational features to the representation and processing of emotional words. This study was supported by the Ministerio de Ciencia, Innovación y Universidades of Spain (Grants PGC2018-098558-B-I00, PID2019-107206GB-I00, and RED2018-102615-T), the Ministerio de Economía y Competitividad of Spain (Grant PSI2017-84922-R), the Comunidad de Madrid (Grants H2019/HUM-5705 and SI1/PJI/2019-00061), and the Universitat Rovira i Virgili (Grant 2019PFR-URV-B2-32). DH-P was funded by predoctoral grant FPU20/03345.

    Emotion Recognition from EEG Signal Focusing on Deep Learning and Shallow Learning Techniques

    Recently, electroencephalogram-based emotion recognition has become crucial to making Human-Computer Interaction (HCI) systems more intelligent. Owing to its notable applications, e.g., person-based decision making, mind-machine interfacing, cognitive interaction, and affect and feeling detection, emotion recognition has attracted considerable attention in the recent surge of AI-empowered research. Numerous studies have therefore been conducted using a range of approaches, which calls for a systematic review of the methodologies, feature sets, and techniques used for this task. Such a review can guide beginners toward composing an effective emotion recognition system. In this article, we conduct a rigorous review of state-of-the-art emotion recognition systems published in the recent literature and summarize common emotion recognition steps with relevant definitions, theories, and analyses to provide the key knowledge needed to develop a proper framework. The studies included here were divided into two categories: i) deep learning-based and ii) shallow machine learning-based emotion recognition systems. The reviewed systems were compared on methods, classifiers, the number of classified emotions, accuracy, and datasets used. An informative comparison, recent research trends, and recommendations for future research directions are also provided.

    Mutual Information in the Frequency Domain for Application in Biological Systems

    Biological systems are composed of multiple components that typically interact nonlinearly and produce multiple outputs (time series/signals) with specific frequency characteristics. Although exact knowledge of the underlying mechanisms remains unavailable, the outputs observed from these systems can reveal dependency relations through quantitative methods and increase our understanding of the original systems. Nonlinear relations at specific frequencies require advanced dependency measures that capture generalized interactions beyond typical correlation in the time domain or coherence in the frequency domain. Mutual information, from information theory, is such a quantity: it measures statistical dependency between random variables. Herein, we develop a model-free methodology for detecting nonlinear relations between time series with respect to frequency, which quantifies dependency under a general probabilistic framework. Classic nonlinear dynamical systems and their coupled forms (Lorenz, bidirectionally coupled Lorenz, and unidirectionally coupled Mackey–Glass systems) are employed to generate artificial data and to test the proposed methodology. Comparisons between the performance of this measure and that of a conventional linear measure are presented for the artificial data. The results indicate that the proposed methodology is better at capturing the dependency between the variables of the systems. The measure is also applied to a real-world electrophysiological dataset for emotion analysis to study brain stimulus-response functional connectivity. The results reveal distinct brain regions and specific frequencies involved in emotional processing.
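    The core idea of frequency-resolved dependency can be illustrated with a much simpler estimator than the paper's framework: narrow-band filter both signals, then compute a plug-in mutual-information estimate from their joint histogram. The estimator, band choices, and synthetic nonlinear coupling below are simplifications, not the authors' method.

    ```python
    # Illustrative sketch: histogram-based mutual information between two
    # signals after band-pass filtering, so dependency can be compared
    # across frequency bands. A simplification of the model-free framework.
    import numpy as np
    from scipy.signal import butter, filtfilt

    def mutual_information(x, y, bins=16):
        """Plug-in MI estimate (in nats) from a joint histogram."""
        pxy, _, _ = np.histogram2d(x, y, bins=bins)
        pxy = pxy / pxy.sum()
        px = pxy.sum(axis=1, keepdims=True)
        py = pxy.sum(axis=0, keepdims=True)
        nz = pxy > 0
        return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

    def band_mi(x, y, fs, band):
        """MI between the band-limited components of x and y."""
        b, a = butter(4, band, btype="bandpass", fs=fs)
        return mutual_information(filtfilt(b, a, x), filtfilt(b, a, y))

    fs = 200
    t = np.arange(fs * 10) / fs
    rng = np.random.default_rng(3)
    shared = np.sin(2 * np.pi * 10 * t)                 # coupling confined to ~10 Hz
    x = shared + 0.5 * rng.standard_normal(t.size)
    y = np.tanh(2 * shared) + 0.5 * rng.standard_normal(t.size)  # nonlinear coupling
    mi_alpha = band_mi(x, y, fs, (8, 12))   # band containing the coupling
    mi_beta = band_mi(x, y, fs, (20, 24))   # band containing only noise
    ```

    Because the nonlinear coupling lives near 10 Hz, the alpha-band MI comes out well above the beta-band MI, which is the kind of frequency-specific dependency the paper's methodology is designed to detect.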

    Connecting Brains and Bodies: Applying Physiological Computing to Support Social Interaction

    Physiological and affective computing propose methods to improve human-machine interaction by adapting machines to users' states. Recently, social signal processing (SSP) has proposed applying similar methods to human-human interactions in the hope of better understanding and modeling social interactions. Most of the social signals employed are facial expressions, body movements, and speech; studies using physiological signals remain scarce. In this paper, we motivate the use of physiological signals in the context of social interactions. Specifically, we review studies that have investigated the relationship between various physiological indices and social interactions. We then propose two main directions for applying physiological SSP: using the physiological signals of individual users as new social cues displayed to the group, and using inter-user physiology to measure properties of the interaction such as conflict and social presence. We conclude that physiological measures have the potential to enhance social interactions and to connect people.

    Brain Computer Interfaces and Emotional Involvement: Theory, Research, and Applications

    This reprint is dedicated to the study of brain activity related to emotional and attentional involvement as measured by brain–computer interface (BCI) systems designed for different purposes. A BCI system can translate brain signals (e.g., electrical or hemodynamic indicators of brain activity) into a command to execute an action in a BCI application (e.g., a wheelchair, a cursor on a screen, a spelling device, or a game). These tools have the advantage of real-time access to the individual's ongoing brain activity, which can provide insight into the user's emotional and attentional states by training a classification algorithm to recognize mental states. The success of BCI systems in contemporary neuroscientific research relies on the fact that they allow one to “think outside the lab”. The integration of technological solutions, artificial intelligence, and cognitive science has allowed, and will continue to allow, researchers to envision ever more applications for the future. Clinical and everyday uses are described with the aim of inviting readers to imagine potential further developments.