
    Emotion Recognition with Machine Learning Using EEG Signals

    In this research, an emotion recognition system is developed based on the valence/arousal model using electroencephalography (EEG) signals. EEG signals are decomposed into the gamma, beta, alpha and theta frequency bands using the discrete wavelet transform (DWT), and spectral features are extracted from each frequency band. Principal component analysis (PCA) is applied to the extracted features as a transform that preserves their dimensionality, making the features mutually uncorrelated. Support vector machine (SVM), K-nearest neighbor (KNN) and artificial neural network (ANN) classifiers are used to classify emotional states. The cross-validated SVM with a radial basis function (RBF) kernel, using features extracted from 10 EEG channels, achieves 91.3% accuracy for arousal and 91.1% accuracy for valence, both in the beta frequency band. Our approach shows better performance than existing algorithms applied to the "DEAP" dataset.
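
    The pipeline described above (DWT band decomposition, spectral features, PCA decorrelation, cross-validated RBF-SVM) could be sketched roughly as follows. The sampling rate, wavelet choice, channel count, and band-energy feature are assumptions for illustration, not the paper's exact settings.

```python
# Hedged sketch of a DWT + PCA + RBF-SVM emotion-classification pipeline.
# Sampling rate, wavelet, and feature choice are assumptions.
import numpy as np
import pywt
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

FS = 128  # DEAP EEG is commonly downsampled to 128 Hz (assumption)

def band_features(trial):
    """trial: (n_channels, n_samples) EEG. Returns log-energy per DWT band."""
    feats = []
    for ch in trial:
        # db4, 4 levels at 128 Hz: detail bands roughly cover
        # D4 ~ theta (4-8 Hz), D3 ~ alpha (8-16), D2 ~ beta (16-32), D1 ~ gamma (32-64)
        coeffs = pywt.wavedec(ch, "db4", level=4)
        details = coeffs[1:]
        feats.extend(np.log(np.sum(c ** 2) + 1e-12) for c in details)
    return np.array(feats)

def classify(trials, labels):
    """trials: (n_trials, n_channels, n_samples); labels: high/low arousal or valence."""
    X = np.stack([band_features(t) for t in trials])
    clf = make_pipeline(
        StandardScaler(),
        PCA(),                 # keep all components: decorrelate without reducing dimensionality
        SVC(kernel="rbf"),
    )
    return cross_val_score(clf, X, labels, cv=10).mean()
```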

    Deep fusion of multi-channel neurophysiological signal for emotion recognition and monitoring

    How to fuse multi-channel neurophysiological signals for emotion recognition is emerging as a hot research topic in the community of Computational Psychophysiology. Nevertheless, prior feature-engineering-based approaches require extracting various domain-knowledge-related features at a high time cost. Moreover, traditional fusion methods cannot fully utilise the correlation information between different channels and frequency components. In this paper, we design a hybrid deep learning model in which a Convolutional Neural Network (CNN) is utilised for extracting task-related features and mining inter-channel and inter-frequency correlations, while a Recurrent Neural Network (RNN) is concatenated to integrate contextual information from the frame-cube sequence. Experiments are carried out on a trial-level emotion recognition task on the DEAP benchmarking dataset. Experimental results demonstrate that the proposed framework outperforms classical methods on both emotional dimensions, Valence and Arousal.
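
    A minimal PyTorch sketch of the hybrid idea described above: a CNN encodes each frame of a multi-channel "frame cube" and a GRU (one kind of RNN) integrates the frame sequence for a trial-level prediction. Layer widths, frame size, and the GRU choice are assumptions, not the paper's exact architecture.

```python
# Hedged CNN+RNN fusion sketch for frame-cube sequences.
import torch
import torch.nn as nn

class CNNRNNFusion(nn.Module):
    def __init__(self, in_channels=1, rnn_hidden=64, n_classes=2):
        super().__init__()
        self.cnn = nn.Sequential(                       # per-frame feature extractor
            nn.Conv2d(in_channels, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),                    # -> (B*T, 32, 1, 1)
        )
        self.rnn = nn.GRU(32, rnn_hidden, batch_first=True)
        self.head = nn.Linear(rnn_hidden, n_classes)

    def forward(self, x):                               # x: (B, T, C, H, W) frame-cube sequence
        b, t = x.shape[:2]
        f = self.cnn(x.flatten(0, 1)).flatten(1)        # per-frame features: (B*T, 32)
        _, h = self.rnn(f.view(b, t, -1))               # integrate temporal context
        return self.head(h[-1])                         # trial-level valence/arousal logits

# usage: logits = CNNRNNFusion()(torch.randn(4, 10, 1, 9, 9))
```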

    EEG-Based Emotion Recognition Using Regularized Graph Neural Networks

    Electroencephalography (EEG) measures neuronal activity in different brain regions via electrodes. Many existing studies on EEG-based emotion recognition do not fully exploit the topology of EEG channels. In this paper, we propose a regularized graph neural network (RGNN) for EEG-based emotion recognition. RGNN considers the biological topology among different brain regions to capture both local and global relations among different EEG channels. Specifically, we model the inter-channel relations in EEG signals via an adjacency matrix in a graph neural network, where the connectivity and sparseness of the adjacency matrix are inspired by neuroscience theories of human brain organization. In addition, we propose two regularizers, namely node-wise domain adversarial training (NodeDAT) and emotion-aware distribution learning (EmotionDL), to better handle cross-subject EEG variations and noisy labels, respectively. Extensive experiments on two public datasets, SEED and SEED-IV, demonstrate the superior performance of our model over state-of-the-art models in most experimental settings. Moreover, ablation studies show that the proposed adjacency matrix and the two regularizers contribute consistent and significant gains to the performance of our RGNN model. Finally, investigations of the neuronal activities reveal important brain regions and inter-channel relations for EEG-based emotion recognition.
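
    The core mechanism, a graph convolution over EEG channels whose adjacency matrix encodes inter-channel relations, might look roughly like the sketch below. The adjacency initialisation, normalisation, and mean readout are assumptions, and the NodeDAT and EmotionDL regularizers are not shown; this is not the exact RGNN formulation.

```python
# Hedged sketch of a channel-graph convolution with a learnable adjacency matrix.
import torch
import torch.nn as nn

class ChannelGraphConv(nn.Module):
    def __init__(self, n_channels, in_feats, out_feats, n_classes):
        super().__init__()
        # learnable adjacency; in practice it could be initialised from
        # inter-electrode distances or neuroscience-informed connectivity
        self.adj = nn.Parameter(torch.eye(n_channels) + 0.01 * torch.rand(n_channels, n_channels))
        self.lin = nn.Linear(in_feats, out_feats)
        self.head = nn.Linear(out_feats, n_classes)

    def forward(self, x):                       # x: (B, n_channels, in_feats)
        a = torch.relu(self.adj)                # keep edge weights non-negative
        d = a.sum(dim=1, keepdim=True).clamp(min=1e-6)
        a_norm = a / d                          # row-normalised propagation matrix
        h = torch.relu(self.lin(a_norm @ x))    # aggregate neighbouring channels, then transform
        return self.head(h.mean(dim=1))         # average channel embeddings -> emotion logits
```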

    Classification of Alzheimer's Disease with Deep Learning on Eye-tracking Data

    Existing research has shown the potential of classifying Alzheimer's Disease (AD) from eye-tracking (ET) data with classifiers that rely on task-specific engineered features. In this paper, we investigate whether we can improve on existing results by using a deep-learning classifier trained end-to-end on raw ET data. This classifier (VTNet) uses a GRU and a CNN in parallel to leverage both visual (V) and temporal (T) representations of ET data and was previously used to detect user confusion while processing visual displays. A main challenge in applying VTNet to our target AD classification task is that the available ET data sequences are much longer than those used in the previous confusion detection task, pushing the limits of what is manageable by LSTM-based models. We discuss how we address this challenge and show that VTNet outperforms state-of-the-art approaches in AD classification, providing encouraging evidence of the generality of this model for making predictions from ET data. (Comment: ICMI 2023 long paper)
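
    A hedged sketch of the parallel visual+temporal idea described above: a GRU processes the raw gaze-sample sequence while a CNN processes a scanpath-style image, and the two embeddings are concatenated for classification. Input sizes, widths, and the downsampling remark are assumptions, not the exact VTNet design.

```python
# Hedged sketch of a parallel GRU (temporal) + CNN (visual) classifier for ET data.
import torch
import torch.nn as nn

class VTSketch(nn.Module):
    def __init__(self, seq_feats=4, hidden=64, n_classes=2):
        super().__init__()
        self.gru = nn.GRU(seq_feats, hidden, batch_first=True)   # temporal branch
        self.cnn = nn.Sequential(                                 # visual branch
            nn.Conv2d(1, 8, 5, stride=2), nn.ReLU(),
            nn.Conv2d(8, 16, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(hidden + 16, n_classes)

    def forward(self, seq, img):
        # seq: (B, T, seq_feats) raw gaze samples; very long sequences could be
        # downsampled or truncated beforehand (assumption, not the paper's exact fix)
        # img: (B, 1, H, W) scanpath-style image
        _, h = self.gru(seq)
        return self.head(torch.cat([h[-1], self.cnn(img)], dim=1))

# usage: logits = VTSketch()(torch.randn(2, 500, 4), torch.randn(2, 1, 64, 64))
```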

    Survey on encode biometric data for transmission in wireless communication networks

    The aim of this research survey is to review an enhanced, artificial-intelligence-supported model for encoding biometric data for transmission in wireless communication networks. Such transmission can be tricky, as performance decreases with increasing network size due to interference, especially if channels and the network topology are not selected carefully beforehand. Additionally, the network may dissociate easily if crucial links fail, because redundancy is neglected for signal transmission. Therefore, we present several algorithms, and their implementations, that address this problem by finding a network topology and channel assignment that minimizes interference, allowing a deployment to increase its throughput performance by utilizing more bandwidth in the local spectrum while reducing coverage and connectivity issues through multiple AI-based techniques. Our evaluation survey shows an increase in throughput performance of several times or more compared to a baseline scenario in which no optimization has taken place and only one channel is used for the whole network. Furthermore, our solution also provides robust signal transmission, tackling network partition, coverage loss, and single-link failures by using an airborne wireless network. The highest end-to-end connectivity stands at a 10 Mbps data rate with a maximum propagation distance of several kilometers. Among the data rates examined, wireless network coverage is best at 10 Mbps, which shows the fewest coverage issues over a moderate propagation distance when the enhanced model is used to encode biometric data for wireless transmission.
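
    As an illustration of the channel-assignment step mentioned above (not an algorithm taken from the survey), a simple greedy strategy assigns each link the channel that conflicts least with its already-assigned interfering neighbours. The conflict-graph representation and channel set below are assumptions.

```python
# Hedged sketch of greedy interference-aware channel assignment over a conflict graph.
def assign_channels(links, interferes, channels):
    """links: iterable of link ids; interferes: dict link -> set of interfering links;
    channels: list of available channels. Returns dict link -> assigned channel."""
    assignment = {}
    for link in links:
        # count how many already-assigned interfering neighbours use each channel
        conflicts = {c: sum(1 for n in interferes.get(link, ())
                            if assignment.get(n) == c) for c in channels}
        assignment[link] = min(channels, key=lambda c: conflicts[c])
    return assignment

# usage:
# assign_channels(["a", "b", "c"], {"a": {"b"}, "b": {"a", "c"}, "c": {"b"}}, [1, 6, 11])
```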

    Design of User-Customized Negative Emotion Classifier Based on Feature Selection Using Physiological Signal Sensors

    First, the Likert scale and the self-assessment manikin are used to provide emotion analogies, but they have limits in reflecting subjective factors. To solve this problem, we use physiological signals, which show objective responses reflecting cognitive status. The physiological signals used are the electrocardiogram, skin temperature, and electrodermal activity (EDA). Second, the degree of emotion felt, and the related physiological signals, vary according to the individual. The Kullback-Leibler divergence (KLD) calculates the difference between the probability distribution shape patterns of two classes, which makes it possible to analyze the relationship between physiological signals and emotion. As a result, features from EDA are important for distinguishing negative emotion in all subjects. In addition, the proposed feature selection algorithm showed an average accuracy of 92.5% and made it possible to improve the accuracy of negative emotion recognition.
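
    A hedged sketch of the KLD-based feature-selection idea described above: estimate each feature's distribution per class (negative vs. non-negative emotion) with histograms and rank features by a symmetrised KL divergence. The bin count and the symmetrised form are assumptions, not the paper's exact procedure.

```python
# Hedged sketch: rank physiological features by class-discriminative KL divergence.
import numpy as np
from scipy.stats import entropy

def kld_feature_ranking(X, y, bins=20):
    """X: (n_samples, n_features) physiological features; y: binary emotion labels (0/1)."""
    scores = []
    for j in range(X.shape[1]):
        lo, hi = X[:, j].min(), X[:, j].max()
        p, _ = np.histogram(X[y == 0, j], bins=bins, range=(lo, hi), density=True)
        q, _ = np.histogram(X[y == 1, j], bins=bins, range=(lo, hi), density=True)
        p, q = p + 1e-9, q + 1e-9                      # avoid empty bins
        scores.append(entropy(p, q) + entropy(q, p))   # symmetric KLD between the two classes
    return np.argsort(scores)[::-1]                    # most class-discriminative features first
```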

    Deep learning-based EEG emotion recognition: Current trends and future perspectives

    Automatic electroencephalogram (EEG) emotion recognition is a challenging component of human–computer interaction (HCI). Inspired by the powerful feature learning ability of recently emerged deep learning techniques, various advanced deep learning models have been employed increasingly to learn high-level feature representations for EEG emotion recognition. This paper aims to provide an up-to-date and comprehensive survey of EEG emotion recognition, especially of the various deep learning techniques in this area. We provide the preliminaries and basic knowledge from the literature. We briefly review EEG emotion recognition benchmark data sets. We review deep learning techniques in detail, including deep belief networks, convolutional neural networks, and recurrent neural networks. We describe the state-of-the-art applications of deep learning techniques for EEG emotion recognition in detail. Finally, we analyze the challenges and opportunities in this field and point out its future directions.

    Noise Reduction of EEG Signals Using Autoencoders Built Upon GRU based RNN Layers

    Understanding the cognitive and functional behaviour of the brain from its electrical activity is an important area of research. Electroencephalography (EEG) is a method that measures and records the electrical activity of the brain from the scalp. It has been used for pathology analysis, emotion recognition, clinical and cognitive research, diagnosing various neurological and psychiatric disorders, and other applications. Since EEG signals are sensitive to activities other than those of the brain, such as eye blinking, eye movement and head movement, it is not possible to record EEG signals without any noise. Thus, it is very important to use an efficient noise reduction technique to obtain more accurate recordings. Numerous traditional techniques, such as Principal Component Analysis (PCA), Independent Component Analysis (ICA), wavelet transformations and machine learning techniques, have been proposed for reducing the noise in EEG signals. The aim of this paper is to investigate the effectiveness of stacked autoencoders built upon Gated Recurrent Unit (GRU) based Recurrent Neural Network (RNN) layers (GRU-AE) against PCA. To achieve this, Harrell-Davis decile values of the reconstructed signals' signal-to-noise ratio distributions were compared, and it was found that the GRU-AE outperformed PCA for noise reduction of EEG signals.
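
    A hedged PyTorch sketch of a GRU-based denoising autoencoder of the kind compared against PCA above: a GRU encoder compresses the noisy EEG sequence and a GRU decoder reconstructs the clean signal. The single-layer depth and sizes are assumptions; the paper stacks such layers.

```python
# Hedged sketch of a GRU-based EEG denoising autoencoder trained with an MSE objective.
import torch
import torch.nn as nn

class GRUAutoencoder(nn.Module):
    def __init__(self, n_channels=32, latent=16):
        super().__init__()
        self.encoder = nn.GRU(n_channels, latent, batch_first=True)
        self.decoder = nn.GRU(latent, n_channels, batch_first=True)

    def forward(self, x):                 # x: (B, T, n_channels) noisy EEG
        z, _ = self.encoder(x)            # compressed temporal representation
        recon, _ = self.decoder(z)        # reconstructed (denoised) EEG
        return recon

# training sketch: minimise MSE between the reconstruction and the clean reference signal
# model = GRUAutoencoder(); loss = nn.MSELoss()(model(noisy), clean)
```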