2 research outputs found

    Subject Independent Emotion Recognition using EEG Signals Employing Attention Driven Neural Networks

    In the recent past, deep learning-based approaches have significantly improved classification accuracy compared to classical signal processing and machine learning-based frameworks. However, most of these were subject-dependent studies that failed to generalize to subject-independent tasks due to the inter-subject variability present in EEG data. In this work, a novel deep learning framework capable of subject-independent emotion recognition is presented, consisting of two parts. First, an unsupervised Long Short-Term Memory (LSTM) autoencoder with channel attention is proposed to obtain a subject-invariant latent subspace, i.e., the intrinsic variables present in the EEG data of each individual. Second, a convolutional neural network (CNN) with an attention framework is presented to perform subject-independent emotion recognition on the encoded lower-dimensional latent representations obtained from the proposed LSTM autoencoder with channel attention. Through the attention mechanism, the proposed approach highlights the significant time segments of the EEG signal that contribute to the emotion under consideration, as validated by the results. The proposed approach has been validated on publicly available EEG datasets, namely the DEAP, SEED, and CHB-MIT datasets. The proposed end-to-end deep learning framework removes the need for different hand-engineered features and provides a single, comprehensive, task-agnostic EEG analysis tool capable of performing various kinds of EEG analysis on subject-independent data.
    Comment: Under Review in Elsevier Biomedical Signal Processing and Control
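    A minimal PyTorch sketch of the two-stage pipeline described in this abstract is shown below. The module names, layer sizes, and exact attention formulations are illustrative assumptions, not the authors' implementation; it only shows the overall shape of an LSTM autoencoder with channel attention followed by an attention-driven CNN classifier on the latent representations.

```python
# Illustrative sketch only; layer sizes and attention details are assumptions.
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Learns a softmax weight per EEG channel and rescales the input."""
    def __init__(self, n_channels):
        super().__init__()
        self.score = nn.Linear(n_channels, n_channels)

    def forward(self, x):                          # x: (batch, time, channels)
        w = torch.softmax(self.score(x.mean(dim=1)), dim=-1)
        return x * w.unsqueeze(1)                  # reweight each channel

class LSTMAutoencoder(nn.Module):
    """Unsupervised stage: encode an EEG window into a low-dimensional latent space."""
    def __init__(self, n_channels, latent_dim):
        super().__init__()
        self.attn = ChannelAttention(n_channels)
        self.encoder = nn.LSTM(n_channels, latent_dim, batch_first=True)
        self.decoder = nn.LSTM(latent_dim, n_channels, batch_first=True)

    def forward(self, x):                          # x: (batch, time, channels)
        z_seq, (h, _) = self.encoder(self.attn(x))
        recon, _ = self.decoder(z_seq)             # reconstruct the input from latents
        return recon, z_seq

class AttentiveCNNClassifier(nn.Module):
    """Supervised stage: classify emotions from the encoded latent sequence."""
    def __init__(self, latent_dim, n_classes):
        super().__init__()
        self.conv = nn.Conv1d(latent_dim, 32, kernel_size=3, padding=1)
        self.attn = nn.Linear(32, 1)               # scores each time step
        self.head = nn.Linear(32, n_classes)

    def forward(self, z_seq):                      # z_seq: (batch, time, latent_dim)
        h = torch.relu(self.conv(z_seq.transpose(1, 2))).transpose(1, 2)
        a = torch.softmax(self.attn(h), dim=1)     # temporal attention weights
        pooled = (a * h).sum(dim=1)                # attention-weighted summary
        return self.head(pooled), a.squeeze(-1)    # logits + weights for inspection

# Usage with synthetic data: 32 EEG channels, 128 time steps, 3 emotion classes.
x = torch.randn(8, 128, 32)
ae = LSTMAutoencoder(n_channels=32, latent_dim=16)
clf = AttentiveCNNClassifier(latent_dim=16, n_classes=3)
recon, z_seq = ae(x)                               # train stage 1 with MSE(recon, x)
logits, attn_weights = clf(z_seq)                  # train stage 2 with cross-entropy
```

    The returned attention weights can be inspected to see which time segments of a trial dominate the prediction, which mirrors the interpretability claim made in the abstract.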

    Do Neural Optimal Transport Solvers Work? A Continuous Wasserstein-2 Benchmark

    Despite the recent popularity of neural network-based solvers for optimal transport (OT), there is no standard quantitative way to evaluate their performance. In this paper, we address this issue for quadratic-cost transport -- specifically, computation of the Wasserstein-2 distance, a commonly used formulation of optimal transport in machine learning. To overcome the challenge of computing the ground-truth transport maps between continuous measures needed to assess these solvers, we use input-convex neural networks (ICNNs) to construct pairs of measures whose ground-truth OT maps can be obtained analytically. This strategy yields pairs of continuous benchmark measures in high-dimensional spaces, such as spaces of images. We thoroughly evaluate existing optimal transport solvers using these benchmark measures. Even though these solvers perform well in downstream tasks, many do not faithfully recover optimal transport maps. To investigate the cause of this discrepancy, we further test the solvers in an image generation setting. Our study reveals crucial limitations of existing solvers and shows that increased OT accuracy does not necessarily correlate with better downstream results.
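    The benchmark construction rests on Brenier's theorem: the gradient of a convex potential is the optimal Wasserstein-2 map from a source measure to its pushforward. Below is a minimal PyTorch sketch of that idea, with an input-convex network as the potential; the architecture sizes, the identity-map placeholder, and the error metric are assumptions for illustration, not the paper's benchmark code.

```python
# Illustrative sketch; sizes, placeholder solver, and metric are assumptions.
import torch
import torch.nn as nn

class ICNN(nn.Module):
    """Input-convex network: convex in x via nonnegative hidden-to-hidden weights
    and convex, nondecreasing activations."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.Wx0 = nn.Linear(dim, hidden)
        self.Wz = nn.Linear(hidden, hidden, bias=False)    # weights kept >= 0
        self.Wx1 = nn.Linear(dim, hidden)
        self.out = nn.Linear(hidden, 1, bias=False)        # weights kept >= 0
        self.act = nn.Softplus()

    def forward(self, x):
        z = self.act(self.Wx0(x))
        z = self.act(self.Wz(z) + self.Wx1(x))
        return self.out(z)                                 # (batch, 1), convex in x

    def clamp_weights(self):
        with torch.no_grad():
            self.Wz.weight.clamp_(min=0)
            self.out.weight.clamp_(min=0)

def transport_map(psi, x):
    """Ground-truth OT map: gradient of the convex potential psi."""
    x = x.clone().requires_grad_(True)
    (grad,) = torch.autograd.grad(psi(x).sum(), x)
    return grad

dim = 16
psi = ICNN(dim)
psi.clamp_weights()                                        # enforce convexity in x

source = torch.randn(1024, dim)                            # mu: standard Gaussian
target = transport_map(psi, source)                        # nu = (grad psi)#mu

# A candidate solver's map T_hat can now be scored against the known optimum;
# here T_hat is a placeholder identity map and the metric is a plain L2 error.
T_hat = lambda x: x
err = ((T_hat(source) - target) ** 2).sum(dim=1).mean()
print(f"L2 error of candidate map vs. ground truth: {err.item():.4f}")
```

    Because the source samples, the target samples, and the optimal map are all available in closed form once the potential is fixed, any continuous OT solver can be scored directly on map recovery rather than only on downstream task performance.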