14 research outputs found
Subjective time is predicted by local and early visual processing
Time is as pervasive as it is elusive to study, and how the brain keeps track of millisecond time is still unclear. Here we addressed the mechanisms underlying duration perception by looking for a neural signature of the subjective time distortion induced by motion adaptation. We recorded electroencephalographic signals in human participants while they discriminated the duration of visual stimuli after different types of translational motion adaptation. Our results show that perceived duration can be predicted by the amplitude of the N200 event-related potential evoked by the adapted stimulus. Moreover, we show that the distortion of subjective time can be predicted by activity in the beta frequency band, both at the offset of the adaptor and during the presentation of the subsequent adapted stimulus. Both effects were observed at posterior electrodes contralateral to the adapted stimulus. Overall, our findings suggest that local and low-level perceptual processes are involved in generating a subjective sense of time.
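The beta-band analysis mentioned above can be illustrated with a generic band-power computation. This is a minimal sketch, not the authors' pipeline: the sampling rate, band edges, and synthetic test signal are all assumptions for illustration.

```python
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, f_lo, f_hi):
    """Average power spectral density of `signal` in [f_lo, f_hi] Hz,
    estimated with Welch's method."""
    freqs, psd = welch(signal, fs=fs, nperseg=min(len(signal), 2 * fs))
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    return psd[mask].mean()

# Example: 2 s of synthetic EEG-like data at an assumed 250 Hz sampling rate,
# containing a 20 Hz (beta-range) oscillation buried in noise.
np.random.seed(0)
fs = 250
t = np.arange(0, 2, 1 / fs)
x = np.sin(2 * np.pi * 20 * t) + 0.1 * np.random.randn(len(t))
beta = band_power(x, fs, 13, 30)   # beta band (13-30 Hz)
alpha = band_power(x, fs, 8, 13)   # alpha band, for comparison
```

For a signal dominated by a 20 Hz component, the beta-band estimate comes out well above the alpha-band one.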
Emotional states discriminated from EEG using signal complexity measures
LIST OF FIGURES 6
LIST OF TABLES 9
LIST OF ABBREVIATIONS 9
Chapter 1. General Introduction 11
1.1. Emotional states and EEG 13
EEG rhythms and emotions 13
Event-related potentials (ERPs) and emotions 15
1.2 EEG Signal Complexity and emotions 15
What is signal complexity? 15
EEG signal complexity and emotions 17
Multiscale complexity of EEG signals 18
MEMD-enhanced EEG signal complexity 21
1.3 Objectives and thesis structure 22
Chapter 2. Discrimination of video induced emotional states from EEG by using MEMD enhanced multivariate sample entropy (published) 23
Chapter 3. Discrimination of emotional states from scalp- and intracranial EEG using multiscale Rényi entropy (published) 23
Chapter 4. Autocorrentropy as a tool for discriminating emotional states from EEG 23
Chapter 5. Conclusive remarks 23
Chapter 6. Summary 23
Chapter 2. Discrimination of emotional states from EEG by using Multiscale Multivariate Sample Entropy (MMSE) 24
2.1 Introduction 24
2.2 Materials and methods 25
2.2.1 Materials 25
2.2.2 Methods 28
2.4 Discussion 37
Chapter 3. Discrimination of emotional states from scalp- and intracranial EEG using multiscale RĂ©nyi entropy 40
Abstract 40
3.1 Introduction 40
3.2. Materials and Methods 42
3.2.1 Materials 42
3.2.2 Methods 46
3.3. Results 51
3.3.1 Discriminability of emotional categories 51
3.3.2 Discriminability of multiscale RQE per electrode 53
3.3.3 Temporal evolution of multiscale RQE discriminability 55
3.3.4 Multiscale RQE of intracranial EEG 56
3.3.5 Temporal evolution of multiscale RQE for joint scalp/intracranial EEG 57
3.4. Discussion 61
Conclusion 64
Chapter 4. Autocorrentropy 64
4.1 Introduction 64
4.2. Materials and Methods 66
4.2.1 Materials 66
Stimuli 66
Subjects and data collection procedure 67
EEG data preprocessing 67
4.2.2 Methods 67
4.3 Results 68
Self-reports 68
Autocorrentropy 70
Sensitivity to snippet length 71
4.4 Discussion 72
Conclusion 73
Chapter 5. Conclusive remarks 73
Benefits of using multiscale entropy 74
Impact of affect level on discriminative power 75
Confirmation with intracranial data analysis 77
Synchrony between scalp and intracranial results 77
Source reconstruction analysis 79
Computational complexity issues 79
Chapter 6. Summary 80
6.1 Summary 80
6.2 Samenvatting 81
APPENDICES 83
REFERENCES 92
ACKNOWLEDGEMENTS, PERSONAL CONTRIBUTION AND CONFLICT OF INTEREST STATEMENT 103
SCIENTIFIC ACKNOWLEDGEMENT 103
PERSONAL CONTRIBUTION 103
CONFLICT OF INTEREST STATEMENT 103
Curriculum Vitae 104
Discriminating multiple emotional states from EEG using a data-adaptive, multiscale information-theoretic approach
A multivariate sample entropy metric of signal complexity is applied to EEG data recorded while subjects were viewing 4 prior-labeled emotion-inducing video clips from a publicly available, validated database. Besides emotion category labels, the video clips also came with arousal scores, and our subjects were additionally asked to provide their own emotion labels. In total, 30 subjects aged 19–70 years participated in our study. Rather than relying on predefined frequency bands, we estimate multivariate sample entropy over multiple data-driven scales using the multivariate empirical mode decomposition (MEMD) technique and show that in this way we can discriminate between 5 self-reported emotions (p < 0.05). These results could not be obtained by analyzing the relations between arousal scores and video clips, between signal complexity and arousal scores, or between self-reported emotions and traditional power spectral densities and their hemispheric asymmetries in the theta, alpha, beta, and gamma frequency bands. This shows that multivariate, multiscale sample entropy is a promising technique for discriminating multiple emotional states from EEG recordings.
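The core quantity above, sample entropy, measures how unpredictable a series is: it is the negative log ratio of template matches of length m+1 to matches of length m. A minimal univariate sketch follows; the work above uses a multivariate variant over MEMD-derived scales, so this is an illustration of the base metric only, with the embedding dimension and tolerance as conventional default assumptions.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy of a 1-D series: -ln(A/B), where B counts pairs of
    templates of length m within tolerance (Chebyshev distance) and A counts
    pairs of length m + 1; self-matches are excluded. The tolerance is r
    times the standard deviation of the series."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def count_matches(length):
        templates = np.array([x[i:i + length]
                              for i in range(len(x) - length + 1)])
        n = 0
        for i in range(len(templates)):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            n += np.sum(d <= tol)
        return n

    B = count_matches(m)
    A = count_matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

# A regular signal should score lower entropy than white noise.
rng = np.random.default_rng(0)
regular = np.sin(np.linspace(0, 20 * np.pi, 500))
noisy = rng.standard_normal(500)
```

Low values indicate regularity (many length-m matches persist at length m+1), high values irregularity, which is why complexity changes with emotional state can show up in this metric.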
Predicting subject performance level from EEG signal complexity when engaged in a BCI paradigm
The ability to monitor or even predict the performance level of a subject engaged in a cognitive task can be useful in various real-life scenarios. In this article we focus on a popular EEG-based Brain-Computer Interface (BCI) paradigm and report on the complexity of the EEG signals in relation to the subject's performance level. We estimate signal complexity with a multivariate, multiscale version of Sample Entropy (MMSE) to account for multiple temporal scales as well as within- and cross-channel dependencies. Furthermore, we apply Multivariate Empirical Mode Decomposition (MEMD) to render the temporal scales data-driven instead of predefined. Our pilot study shows that the multivariate entropy of EEG signals changes during the course of the experiment and that it can be used for predicting the subject's performance level (accuracy).
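The "multiscale" step that MEMD replaces is, in the classic multiscale-entropy procedure, a coarse-graining of the signal: scale s averages consecutive non-overlapping windows of s samples before the entropy is computed. A minimal sketch of that predefined-scale variant (not the MEMD approach used above):

```python
import numpy as np

def coarse_grain(x, scale):
    """Classic multiscale coarse-graining: average consecutive,
    non-overlapping windows of `scale` samples; trailing samples
    that do not fill a window are dropped."""
    x = np.asarray(x, dtype=float)
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

coarse_grain(np.arange(12.0), 3)  # → [1., 4., 7., 10.]
```

Computing entropy on each coarse-grained series yields one value per scale; MEMD instead derives the scales from the data as intrinsic mode functions, which is what makes them data-driven.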
The Role of Features Types and Personalized Assessment in Detecting Affective State Using Dry Electrode EEG
Assessing the human affective state using electroencephalography (EEG) has shown good potential but has failed to demonstrate reliable performance in real-life applications, especially when the setup itself may impact affective processing and generalized models of affect are relied upon. Additionally, using subjective assessment of one's own affect as ground truth has often been disputed. To shed light on the former challenge, we used a convenient EEG system with 20 participants to capture their reactions to affective movie clips in a naturalistic setting. A state-of-the-art machine learning approach demonstrated that the highest performance is reached when combining linear features, namely symmetry features and single-channel features, with nonlinear ones derived by a multiscale entropy approach. Nevertheless, the best performance, reflected in the highest F1-score achieved in a binary classification task, was 0.71 for valence and 0.62 for arousal. This performance was 10–20% better than when using ratings provided by 13 independent raters. We argue that affective self-assessment might be underrated and that it is crucial to account for personal differences in both the perception of and the physiological response to affective cues.
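The F1-scores quoted above are the harmonic mean of precision and recall for a binary label. As a reminder of the metric (the labels here are illustrative, not the study's data):

```python
def f1_score(y_true, y_pred):
    """F1 = harmonic mean of precision and recall for binary labels (1 = positive)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

f1_score([1, 1, 0, 0, 1], [1, 0, 0, 1, 1])  # tp=2, fp=1, fn=1 → 2/3
```

Unlike raw accuracy, F1 penalizes both missed positives and false alarms, which matters when valence or arousal classes are imbalanced.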
Clustering electrodes based on alpha statistics (main group).
Names, labels and affect scores of the video clips used in our main and control experiments.
Names of videos between quotes and scene numbers between round brackets, standard labels between square brackets, positive or negative emotional affect scores (mean scores between round brackets), and our participants' self-labels (underlined, followed by the number of respondents between brackets). Video clips were taken from http://nemo.psp.ucl.ac.be/FilmStim/; standard labels and affect scores were taken from Table 1 in [35] or provided by Alexandre Schaefer (personal communication).
Evolution of Krippendorff's alpha statistic of MRQE for CIMF6 over entire video clips, plotted for mid-frontal electrodes.
Scalp distribution of p values of the difference in alpha coefficient between main and control groups.
MEMD-enhanced MMRQE curves for intracranial EEG recordings.
Shown are the results for the (a) amygdala and (b) occipital implants in a patient viewing emotional and neutral video clips (blue vs. red labeled curves). Error bars correspond to standard errors of individual-snippet MMRQEs.