
    Measuring the Plasticity of Social Approach: A Randomized Controlled Trial of the Effects of the PEERS Intervention on EEG Asymmetry in Adolescents with Autism Spectrum Disorders

    This study examined whether the Program for the Education and Enrichment of Relational Skills (PEERS: Social skills for teenagers with developmental and autism spectrum disorders: The PEERS treatment manual, Routledge, New York, 2010a) affected neural function, as indexed by EEG asymmetry, in a randomized controlled trial of adolescents with autism spectrum disorders (ASD) and a group of typically developing adolescents. Adolescents with ASD in PEERS shifted from right-hemisphere gamma-band EEG asymmetry before PEERS to left-hemisphere EEG asymmetry after PEERS, relative to a waitlist ASD group. Left-hemisphere EEG asymmetry was associated with more social contacts and knowledge, and fewer symptoms of autism. At post-test, adolescents with ASD in PEERS no longer differed from typically developing adolescents in left-dominant EEG asymmetry. These findings are discussed in terms of the Modifier Model of Autism (Mundy et al. in Res Pract Persons Severe Disabl 32(2):124, 2007), with emphasis on remediating isolation/withdrawal in ASD.
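Hemispheric EEG asymmetry of the kind measured above is commonly quantified as the difference of log band powers between homologous right- and left-hemisphere electrodes. A minimal sketch, assuming the band powers have already been computed (the function name and the sign convention, positive meaning relatively greater right-hemisphere power, are illustrative assumptions, not the paper's exact formula):

```python
import numpy as np

def asymmetry_score(left_power, right_power):
    """Hemispheric asymmetry as ln(right) - ln(left) band power.

    Positive values indicate relatively greater right-hemisphere
    power; zero indicates balanced hemispheres.
    """
    return np.log(right_power) - np.log(left_power)

# Equal power in both hemispheres yields a score of 0.
balanced = asymmetry_score(2.0, 2.0)
```

Whether a given sign maps to "approach" or "withdrawal" depends on the frequency band and the inverse relation between band power and cortical activity, so the interpretation must follow the study's own convention.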

    BiTCAN: An emotion recognition network based on saliency in brain cognition

    In recent years, with the continuous development of artificial intelligence and brain-computer interfaces, emotion recognition based on electroencephalogram (EEG) signals has become a prosperous research direction. Motivated by saliency in brain cognition, we construct a new spatio-temporal convolutional attention network for emotion recognition, named BiTCAN. First, the original EEG signals are baseline-corrected, and a two-dimensional mapping matrix sequence of the EEG signals is constructed by combining the electrode positions. Second, from this two-dimensional mapping matrix sequence, saliency features of brain cognition are extracted with a bi-hemisphere discrepancy module, and the spatio-temporal features of the EEG signals are captured with a 3-D convolution module. Finally, the saliency features and spatio-temporal features are fused in an attention module to further capture the internal spatial relationships between brain regions, and the result is fed into the classifier for emotion recognition. Extensive experiments on DEAP and SEED (two public datasets) show that the proposed algorithm achieves accuracies above 97% on both, which is superior to most existing emotion recognition algorithms.
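The two-dimensional mapping step described above places each channel's value into a grid cell determined by its scalp position, so that convolutions can exploit spatial neighborhoods. A minimal sketch, with a hypothetical electrode-to-grid layout (real models use the full montage, e.g. the 10-20 system; the names `ELECTRODE_GRID` and `to_mapping_matrix` are illustrative):

```python
import numpy as np

# Hypothetical electrode -> (row, col) positions on a small scalp grid.
ELECTRODE_GRID = {"Fp1": (0, 1), "Fp2": (0, 3),
                  "F3": (1, 1), "F4": (1, 3),
                  "C3": (2, 1), "Cz": (2, 2), "C4": (2, 3)}

def to_mapping_matrix(sample, grid=ELECTRODE_GRID, shape=(3, 5)):
    """Place per-channel values into a 2-D matrix by electrode
    position; grid cells with no electrode stay at zero."""
    m = np.zeros(shape)
    for ch, (r, c) in grid.items():
        m[r, c] = sample[ch]
    return m

# One time point across channels -> one 3x5 frame of the sequence.
frame = to_mapping_matrix({"Fp1": 1.0, "Fp2": 2.0, "F3": 0.5,
                           "F4": 0.5, "C3": 0.2, "Cz": 5.0, "C4": 0.2})
```

Applying this per time point yields the matrix sequence that the 3-D convolution module then processes along both spatial axes and time.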

    EEG-Based Emotion Recognition Using Regularized Graph Neural Networks

    Electroencephalography (EEG) measures the neuronal activities in different brain regions via electrodes. Many existing studies on EEG-based emotion recognition do not fully exploit the topology of EEG channels. In this paper, we propose a regularized graph neural network (RGNN) for EEG-based emotion recognition. RGNN considers the biological topology among different brain regions to capture both local and global relations among different EEG channels. Specifically, we model the inter-channel relations in EEG signals via an adjacency matrix in a graph neural network, where the connection and sparseness of the adjacency matrix are inspired by neuroscience theories of human brain organization. In addition, we propose two regularizers, namely node-wise domain adversarial training (NodeDAT) and emotion-aware distribution learning (EmotionDL), to better handle cross-subject EEG variations and noisy labels, respectively. Extensive experiments on two public datasets, SEED and SEED-IV, demonstrate the superior performance of our model over state-of-the-art models in most experimental settings. Moreover, ablation studies show that the proposed adjacency matrix and the two regularizers contribute consistent and significant gains to the performance of our RGNN model. Finally, investigations of the neuronal activities reveal important brain regions and inter-channel relations for EEG-based emotion recognition.
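The adjacency matrix described above encodes how strongly each pair of channels is connected in the graph. A minimal sketch of one common distance-based construction, where nearby electrodes get strong weights and distant ones weak weights (the function name, the `delta` parameter, and the inverse-square form are assumptions for illustration; RGNN's actual matrix additionally encodes global inter-hemisphere links):

```python
import numpy as np

def build_adjacency(positions, delta=0.1):
    """Distance-based adjacency for a channel graph.

    A[i, j] = min(1, delta / d_ij^2) for i != j, so closer electrodes
    are more strongly connected; self-loops are set to 1.
    """
    n = len(positions)
    A = np.ones((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                d2 = float(np.sum((positions[i] - positions[j]) ** 2))
                A[i, j] = min(1.0, delta / d2)
    return A

# Three toy electrode positions in 2-D scalp coordinates.
pos = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]])
A = build_adjacency(pos)
```

Because the weights depend only on pairwise distances, the resulting matrix is symmetric, and the `delta` threshold controls how sparse the effective connectivity is.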

    Deep fusion of multi-channel neurophysiological signal for emotion recognition and monitoring

    How to fuse multi-channel neurophysiological signals for emotion recognition is emerging as a hot research topic in the community of computational psychophysiology. Nevertheless, prior feature-engineering-based approaches require extracting various domain-knowledge-related features at a high time cost. Moreover, traditional fusion methods cannot fully utilise the correlation information between different channels and frequency components. In this paper, we design a hybrid deep learning model in which a convolutional neural network (CNN) is utilised for extracting task-related features and mining inter-channel and inter-frequency correlations, while a recurrent neural network (RNN) is appended to integrate contextual information from the frame-cube sequence. Experiments are carried out on a trial-level emotion recognition task using the DEAP benchmarking dataset. Experimental results demonstrate that the proposed framework outperforms the classical methods on both of the emotional dimensions of valence and arousal.
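The CNN-then-RNN pipeline described above can be reduced to two stages: per-frame convolutional feature extraction, then a recurrence that integrates the frame sequence into one trial-level embedding. A minimal numpy sketch with toy random data (the function names, filter counts, and hidden size are illustrative assumptions, not the paper's architecture):

```python
import numpy as np

rng = np.random.default_rng(0)

def conv_features(frame, kernels):
    """Tiny stand-in for the CNN stage: valid 1-D correlation of each
    channel with each kernel, followed by global max pooling."""
    feats = []
    for k in kernels:
        for ch in frame:
            feats.append(np.max(np.correlate(ch, k, mode="valid")))
    return np.array(feats)

def rnn_integrate(feature_seq, W_h, W_x):
    """Tiny stand-in for the RNN stage: a tanh recurrence over the
    frame feature sequence, returning the final hidden state."""
    h = np.zeros(W_h.shape[0])
    for x in feature_seq:
        h = np.tanh(W_h @ h + W_x @ x)
    return h

# Toy trial: 4 frames of 2-channel EEG, 16 samples per channel.
frames = [rng.standard_normal((2, 16)) for _ in range(4)]
kernels = [rng.standard_normal(5) for _ in range(3)]  # 3 conv filters
feats = [conv_features(f, kernels) for f in frames]   # 6 features/frame
W_h = 0.1 * rng.standard_normal((8, 8))
W_x = 0.1 * rng.standard_normal((8, 6))
h_final = rnn_integrate(feats, W_h, W_x)              # 8-dim trial embedding
```

In the actual model both stages are trained jointly, and the final hidden state would feed a classifier over valence and arousal.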

    Emotion lateralization in a graduated emotional chimeric face task: An online study

    Objective: To resolve inconsistencies in the literature regarding the dominance of the right cerebral hemisphere (RH) in emotional face perception, specifically investigating the roles of the intensity of emotional expressions, different emotions, and conscious perception. Method: The study used an online version of the well-established emotional chimeric face task (ECFT), in which participants judged which side of a chimeric face stimulus was more emotional. We tested the laterality bias in the ECFT across six basic emotions and experimentally varied the intensity of the emotional facial expression from neutral to fully emotional, in incremental steps of 20%. Results: The results showed an overall left-hemiface bias across all emotions, supporting the RH hypothesis of emotional lateralization. However, the left-hemiface bias decreased with decreasing intensity of the emotional facial expression. Conclusions: The results provide further support for the RH hypothesis and suggest that RH dominance in emotional face perception may be affected by task difficulty and visual perception strategy.
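A laterality bias of the kind measured in the ECFT is often summarized as a normalized difference between left- and right-hemiface choices. A minimal sketch, assuming simple choice counts per condition (the function name and the (L - R)/(L + R) form are a common convention, not necessarily the study's exact score):

```python
def laterality_index(left_choices, right_choices):
    """Laterality bias (L - R) / (L + R) over forced choices.

    Positive values indicate a left-hemiface bias, the pattern
    consistent with right-hemisphere dominance; zero means no bias.
    """
    return (left_choices - right_choices) / (left_choices + right_choices)

# 60 left-hemiface vs 40 right-hemiface judgments -> bias of +0.2.
bias = laterality_index(60, 40)
```

Computing this index separately at each intensity step (20%, 40%, ...) is one way to expose the reported decline of the bias at lower expression intensities.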