115 research outputs found

    Emotion recognition using electroencephalogram signal

    Emotions play an essential role in human life and are not consciously controlled. Some emotions can be easily expressed through facial expressions, speech, behavior, and gestures, but others cannot. This study investigates emotion recognition using the electroencephalogram (EEG) signal. Compared with other biological signals, EEG signals can capture human brain activity accurately when recorded with a high-resolution data acquisition device. Changes in the brain's electrical activity occur very quickly, so a high-resolution device is required to determine the emotion precisely. In this study, we demonstrate the strength and reliability of EEG signals as an emotion recognition mechanism for four different emotions: happy, sad, fear, and calm. Data from six subjects were collected using the BrainMarker EXG device, which consists of 19 channels. The pre-processing stage used a second-order low-pass Butterworth filter to remove unwanted signals. Two frequency bands, alpha and beta, were then extracted from the signals. Finally, these samples were classified using an MLP neural network. Classification accuracy of up to 91% is achieved, and the average accuracies for calm, fear, happy, and sad are 83.5%, 87.3%, 85.83%, and 87.6%, respectively. As a proof of concept, this study proposes a system for recognizing four emotional states (happy, sad, fear, and calm) using the EEG signal.
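
    As an illustrative sketch of the processing chain described above (low-pass filtering, alpha/beta band extraction, MLP classification), the Python snippet below outlines one possible implementation; the sampling rate, cutoff frequency, band edges, feature layout, and network size are assumptions, not values taken from the study.

        # Hedged sketch: Butterworth low-pass -> alpha/beta band power -> MLP.
        import numpy as np
        from scipy.signal import butter, filtfilt
        from sklearn.neural_network import MLPClassifier

        FS = 256          # assumed sampling rate (Hz), not stated in the abstract
        N_CHANNELS = 19   # 19-channel device, as stated in the abstract

        def lowpass(eeg, cutoff=45.0, fs=FS):
            """Second-order low-pass Butterworth filter applied per channel."""
            b, a = butter(2, cutoff / (fs / 2), btype="low")
            return filtfilt(b, a, eeg, axis=-1)

        def bandpower(eeg, lo, hi, fs=FS):
            """Mean spectral power of each channel inside [lo, hi] Hz."""
            freqs = np.fft.rfftfreq(eeg.shape[-1], 1 / fs)
            psd = np.abs(np.fft.rfft(eeg, axis=-1)) ** 2
            mask = (freqs >= lo) & (freqs < hi)
            return psd[..., mask].mean(axis=-1)

        def features(eeg):
            """Alpha (8-13 Hz) and beta (13-30 Hz) power per channel -> 38 features."""
            clean = lowpass(eeg)
            return np.concatenate([bandpower(clean, 8, 13), bandpower(clean, 13, 30)], axis=-1)

        # X_raw: (n_trials, 19, n_samples) EEG epochs; y: labels in {happy, sad, fear, calm}
        # X = np.stack([features(trial) for trial in X_raw])
        # clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=1000).fit(X, y)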

    Physiological signal-based emotion recognition from wearable devices

    Interest in computers recognizing human emotions has been increasing recently. Many studies have recognized emotions from physical signals, such as facial expressions, or from written text with good results. However, recognizing emotions from physiological signals such as heart rate, captured by wearable devices without physical signals, has been challenging; some studies have given good, or at least promising, results. The challenge for emotion recognition is to understand how the human body actually reacts to different emotional triggers and to find common factors among people. The aim of this study is to find out whether it is possible to accurately recognize human emotions and stress from physiological signals using supervised machine learning. Further, we consider which types of biosignals are most informative for making such predictions. The performance of Support Vector Machine and Random Forest classifiers is experimentally evaluated on the task of separating stress and no-stress signals from three different biosignals: ECG, PPG, and EDA. The challenges with these biosignals, from acquisition to pre-processing, are addressed, and their connection to emotional experience is discussed. In addition, the challenges and problems of the experimental setups used in previous studies are addressed, especially the usability problems of the dataset. The models implemented in this thesis were not able to accurately classify emotions from the dataset used; they did not perform remarkably better than randomly choosing labels. The PPG signal, however, performed slightly better than ECG or EDA for stress detection.
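
    The comparison described in this thesis could look roughly like the following sketch, which evaluates SVM and Random Forest classifiers on stress/no-stress windows from a single biosignal; the summary-statistic features and cross-validation setup are illustrative assumptions rather than the actual method.

        # Hedged sketch: compare SVM vs Random Forest per biosignal (ECG, PPG, or EDA).
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        def window_features(signal_window):
            """Simple per-window summary statistics (mean, std, min, max, range)."""
            w = np.asarray(signal_window, dtype=float)
            return np.array([w.mean(), w.std(), w.min(), w.max(), w.max() - w.min()])

        def evaluate(windows, labels):
            """5-fold accuracy of SVM vs Random Forest on stress/no-stress labels."""
            X = np.stack([window_features(w) for w in windows])
            y = np.asarray(labels)
            for name, clf in [("SVM", SVC(kernel="rbf")),
                              ("Random Forest", RandomForestClassifier(n_estimators=200))]:
                scores = cross_val_score(clf, X, y, cv=5)
                print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")

        # windows: list of 1-D arrays from one biosignal; labels: 0 = no stress, 1 = stress.
        # Run evaluate(...) separately per signal to compare which biosignal is most informative.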

    Classification of Physiological Signals for Emotion Recognition using IoT

    Emotion recognition has gained huge popularity nowadays. Physiological signals provide an appropriate way to detect human emotion with the help of IoT. In this paper, a novel system is proposed that is capable of determining emotional status from physiological parameters, including the design specification and software implementation of the system. This system may have vivid uses in medicine (especially for emotionally challenged people), smart homes, etc. The physiological parameters measured include heart rate (HR), galvanic skin response (GSR), and skin temperature. To construct the proposed system, the measured physiological parameters were fed to a neural network, which classifies the data into various emotional states, mainly anger, happiness, sadness, and joy. This work identifies the correlation between human emotions and the corresponding changes in physiological parameters.
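
    A minimal sketch of the classification step is shown below, assuming the measured parameters (heart rate, GSR, skin temperature) arrive as one feature vector per reading, e.g. from an IoT gateway; the network size, toy readings, and labels are illustrative assumptions, not the paper's actual data or model.

        # Hedged sketch: HR / GSR / skin temperature -> small neural network classifier.
        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline

        # Each row: [heart_rate_bpm, gsr_microsiemens, skin_temp_celsius] (toy placeholders)
        X_train = np.array([[95, 8.2, 34.1],
                            [72, 3.1, 33.0],
                            [68, 2.4, 32.5],
                            [80, 5.0, 33.4]])
        y_train = ["anger", "happy", "sad", "joy"]   # toy placeholder labels

        model = make_pipeline(StandardScaler(),
                              MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000))
        model.fit(X_train, y_train)

        new_reading = np.array([[88, 6.7, 33.8]])    # HR, GSR, skin temperature
        print(model.predict(new_reading)[0])         # predicted emotional state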

    Overview of Biosignal Analysis Methods for the Assessment of Stress

    Objectives: Stress is a normal reaction of the human organism induced in situations that demand a level of activation. This reaction has both positive and negative impacts on the life of each individual. Thus, the problem of stress management is vital for maintaining a person's psychological balance. This paper briefly presents the definition of stress and the various factors that can lead to elevated stress levels. It also provides a brief synopsis of the biosignals used for the detection and categorization of stress, and of their analysis. Methods: Several studies, articles, and reviews were included after a literature search. The main questions of the research were: the most important and widely used physiological signals for stress detection and assessment, the analysis methods for their manipulation, and the implementation of signal analysis for stress detection and assessment in various developed systems. Findings: The main conclusion is that current research approaches lead to more sophisticated methods of analysis and more accurate systems of stress detection and assessment. However, the lack of a concrete framework for stress detection and assessment remains a great challenge for the research community. Doi: 10.28991/esj-2021-01267

    Automated Affect and Emotion Recognition from Cardiovascular Signals - A Systematic Overview Of The Field

    Currently, artificial intelligence is increasingly used to recognize and differentiate emotions. Through the action of the nervous system, the heart and vascular system can respond differently depending on the type of arousal. With the growing popularity of wearable devices able to measure such signals, people may monitor their states and manage their wellness. Our goal was to explore and summarize the field of automated emotion and affect recognition from cardiovascular signals. According to our protocol, we searched electronic sources (MEDLINE, EMBASE, Web of Science, Scopus, dblp, Cochrane Library, IEEE Xplore, arXiv, and medRxiv) up to 31 August 2020. For all identified studies, two independent reviewers were involved at each stage: screening, full-text assessment, data extraction, and quality evaluation. All conflicts were resolved through discussion. The credibility of the included studies was evaluated using a proprietary tool based on QUADAS and PROBAST. After screening 4649 references, we identified 195 eligible studies. The artificial intelligence methods most used for emotion or affect recognition were Support Vector Machines (42.86%), neural networks (21.43%), and k-Nearest Neighbors (11.67%). Among the most explored datasets were DEAP (10.26%), MAHNOB-HCI (10.26%), AMIGOS (6.67%), and DREAMER (2.56%). The most frequent cardiovascular signals involved the electrocardiogram (63.16%), photoplethysmogram (15.79%), blood volume pressure (13.16%), and heart rate (6.58%). Sadness, fear, and anger were the most examined emotions; however, there is no standard set of investigated internal feelings. On average, authors explore 4.50 states (range from 4 to 24 feelings). Research using artificial intelligence to recognize emotions or affect from cardiovascular signals shows an upward trend, with significant variations in the quality of the datasets, the choice of states to detect, and the classifiers used for analysis. The research project was supported by the program "Excellence Initiative - Research University" for the University of Science and Technology. The authors declare that they have no conflict of interest.

    MINDFUL SPACE IN SENTENCES - A DATASET OF VIRTUAL EMOTIONS FOR NATURAL LANGUAGE CLASSIFICATION

    Spatial emotions play a critical role in visual-spatial environmental assessment and can be assessed using bio-sensors and language description. However, information on virtual spatial emotion assessment with objective emotion labels and natural language processing (NLP) is insufficient in the literature. Thus, designers' ability to assess spatial design quantitatively and cost-effectively before the design is finalized is limited. This research measures the emotions expressed using electroencephalograms (EEGs) and descriptions in virtual reality (VR) spaces with different parameters. First, 26 subjects experienced 10 designed virtual spaces with a VR headset (Quest 2 device) corresponding to different space parameters of shape, height, width, and length. Simultaneously, the EEG measured the emotions of the subjects using four electrodes and the five brain-wave bands. Second, two labels, calm and active, were produced from the EEGs to describe these virtual reality spaces. Last, this labeled emotion dataset was used to compare the differences among the virtual spaces, human feelings, and the participants' language descriptions of the VR spatial experience. Experimental results show that changes in the parameters of VR spaces can arouse significant fluctuations in the five brain waves. The EEG brain-wave signals, in turn, can label the virtual rooms with calm and active emotions. Specifically, in terms of VR spaces and emotions, the experiments find that greater relative spatial height results in less active emotions, while round spaces arouse calmness in the human brain waves. The precise connection among VR spaces, brain-wave emotions, and language still needs further research. This research attempts to offer a useful emotion measurement tool for virtual architectural design and description using EEGs, and it identifies potential future applications combining physiological metrics and AI methods, i.e., machine learning for synthetic design generation and evaluation.
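
    The abstract does not specify how the calm/active labels were derived from the EEG, so the sketch below shows only one plausible rule, a beta/alpha band-power ratio threshold over the four electrodes, offered purely as an assumption for illustration.

        # Hedged sketch: band powers from a four-electrode EEG segment -> calm/active label.
        import numpy as np
        from scipy.signal import welch

        BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
                 "beta": (13, 30), "gamma": (30, 45)}

        def band_powers(eeg, fs=256):
            """Average power in each band across the (assumed four) electrodes."""
            freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2, axis=-1)
            mean_psd = psd.mean(axis=0)                  # average over electrodes
            return {name: mean_psd[(freqs >= lo) & (freqs < hi)].mean()
                    for name, (lo, hi) in BANDS.items()}

        def label_segment(eeg, fs=256, threshold=1.0):
            """Label a VR-space EEG segment 'active' if beta power exceeds alpha power."""
            p = band_powers(eeg, fs)
            return "active" if p["beta"] / p["alpha"] > threshold else "calm"

        # eeg: array of shape (4, n_samples) recorded during one VR space exposure
        # label = label_segment(eeg)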