
    Human emotion characterization by heart rate variability analysis guided by respiration

    Developing a tool that identifies emotions based on their effect on cardiac activity may have a potential impact on clinical practice, since it may help in the diagnosis of psycho-neural illnesses. In this study, a method based on the analysis of heart rate variability (HRV) guided by respiration is proposed. The method redefines the high frequency (HF) band so that it is not only centered at the respiratory frequency but also has a bandwidth that depends on the respiratory spectrum. The method was first tested using simulated HRV signals, yielding the minimum estimation errors compared with the classical HF band definition and with an HF band merely centered at the respiratory frequency, independently of the value of the sympathovagal ratio. The proposed method was then applied to discriminate emotions in a database of video-induced emotion elicitation. Five emotional states were considered: relax, joy, fear, sadness and anger. The maximum correlation between the HRV and respiration spectra discriminated joy vs. relax, joy vs. each negative-valence emotion, and fear vs. sadness with p-value ≤ 0.05 and AUC ≥ 0.70. Based on these results, human emotion characterization may be improved by adding respiratory information to HRV analysis.
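    The following is a minimal sketch of how such a respiration-guided HF band could be computed, assuming evenly resampled HRV and respiration signals and an illustrative 10%-of-peak rule for deriving the bandwidth from the respiratory spectrum; the function name, sampling rate and threshold are assumptions, not the paper's exact procedure.

```python
# Illustrative sketch (not the paper's exact procedure): compute an HF band
# centered at the respiratory frequency, with a bandwidth taken from the part
# of the respiratory spectrum above a fraction of its peak, and integrate the
# HRV PSD over that band. fs, the 0.1 Hz floor and the 10% threshold are
# assumptions.
import numpy as np
from scipy.signal import welch

def respiration_guided_hf_power(hrv, resp, fs=4.0, rel_threshold=0.1):
    f_hrv, p_hrv = welch(hrv, fs=fs, nperseg=min(len(hrv), 256))
    f_resp, p_resp = welch(resp, fs=fs, nperseg=min(len(resp), 256))

    mask = f_resp >= 0.1                       # ignore very-low-frequency content
    peak = np.argmax(p_resp[mask])
    f_r = f_resp[mask][peak]                   # respiratory frequency

    thr = rel_threshold * p_resp[mask][peak]
    above = f_resp[mask][p_resp[mask] >= thr]  # respiration-derived bandwidth
    band = (above.min(), above.max())

    in_band = (f_hrv >= band[0]) & (f_hrv <= band[1])
    hf_power = np.trapz(p_hrv[in_band], f_hrv[in_band])
    return f_r, band, hf_power
```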

    Characterization of the autonomic nervous system response under emotional stimuli through linear and non-linear analysis of physiological signals

    This dissertation presents linear and non-linear methodologies applied to physiological signals, with the purpose of characterizing the response of the autonomic nervous system under emotional stimuli. The study is motivated by the need to develop a tool that identifies emotions based on their effect on cardiac activity, since such a tool may have a potential impact on clinical practice for diagnosing psycho-neural illnesses. The hypotheses of this doctoral thesis are that emotions induce notable changes in the autonomic nervous system and that these changes can be captured through the analysis of physiological signals, in particular through the joint analysis of heart rate variability (HRV) and respiration. The database analyzed contains simultaneous recordings of the electrocardiogram and respiration of 25 subjects elicited with video-induced emotions, including joy, fear, sadness and anger. Two methodological studies are described in this dissertation. In the first study, a method based on the linear analysis of HRV guided by respiration is proposed. The method redefines the high frequency (HF) band, not only centering it at the respiratory frequency but also considering a bandwidth that depends on the respiratory spectrum. The method was first validated with simulated HRV signals, obtaining minimum estimation errors compared with the classical HF band definition and even with an HF band centered at the respiratory frequency but with a constant bandwidth, independently of the values of the sympathovagal ratio. The proposed method was then applied to a database of video-induced emotion elicitation to discriminate between emotions. Not only did the proposed redefined HF band outperform the other HF band definitions in emotion discrimination, but the maximum correlation between the HRV and respiration spectra also discriminated joy vs. relax, joy vs. each negative-valence emotion, and fear vs. sadness with p-value ≤ 0.05 and AUC ≥ 0.70. In the second study, non-linear techniques, the Auto-Mutual Information Function (AMIF) and the Cross-Mutual Information Function (CMIF), are also proposed in this doctoral thesis for human emotion recognition. The AMIF technique was applied to HRV signals to study complex interdependencies, and the CMIF technique was considered to quantify the complex coupling between HRV and respiratory signals. Both algorithms were adapted to short-term RR time series. The RR series were filtered in the low and high frequency bands, and RR series filtered in a respiration-based bandwidth were also investigated. The results revealed that the AMIF technique applied to the RR time series filtered in the redefined HF band was able to discriminate between relax and both joy and fear, between joy and each negative-valence emotion, and finally between fear and both sadness and anger, all with statistical significance (p-value ≤ 0.05, AUC ≥ 0.70).
    Furthermore, the parameters derived from the AMIF and the CMIF made it possible to characterize the low complexity that the signal exhibited during fear compared with any other studied emotional state. Finally, a linear classifier is used to investigate which linear and non-linear features discriminate between pairs of emotions and between emotional valences, in order to determine which parameters allow the groups to be differentiated and how many of them are needed to achieve the best possible classification. The results of this chapter suggest that the following can be classified through HRV analysis: relax and joy, positive valence versus all negative valences, joy and fear, joy and sadness, joy and anger, and fear and sadness. The joint analysis of HRV and respiration increases the discriminatory power of HRV, with the maximum correlation between the HRV and respiration spectra being one of the best indices for emotion discrimination. Mutual information analysis, even on short-duration signals, adds relevant information to the linear indices for emotion discrimination.
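    As a rough illustration of the feature-selection step described above (which parameters separate two emotions and how many are needed), the sketch below greedily adds features to a linear classifier and tracks cross-validated AUC; the choice of linear discriminant analysis, the forward-selection loop and all names are assumptions rather than the thesis's exact pipeline.

```python
# Illustrative sketch (assumed setup, not the thesis's exact pipeline):
# greedy forward selection of HRV/respiration features with a linear
# classifier, for a binary emotion-vs-emotion comparison.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

def forward_select(X, y, feature_names, max_features=5):
    """Greedily add the feature that most improves cross-validated AUC.

    X: (n_subjects, n_features) feature matrix, y: binary emotion labels.
    Returns an ordered list of (feature name, cumulative AUC).
    """
    selected, best_scores = [], []
    remaining = list(range(X.shape[1]))
    for _ in range(max_features):
        scores = []
        for j in remaining:
            cols = selected + [j]
            auc = cross_val_score(LinearDiscriminantAnalysis(),
                                  X[:, cols], y, cv=5,
                                  scoring="roc_auc").mean()
            scores.append((auc, j))
        auc, j = max(scores)
        selected.append(j)
        remaining.remove(j)
        best_scores.append((feature_names[j], auc))
    return best_scores
```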

    Mutual information between heart rate variability and respiration for emotion characterization

    Objective: Interest in emotion recognition has increased in recent years as a useful tool for diagnosing psycho-neural illnesses. In this study, the auto-mutual and the cross-mutual information functions, AMIF and CMIF respectively, are used for human emotion recognition. Approach: The AMIF technique was applied to heart rate variability (HRV) signals to study complex interdependencies, and the CMIF technique was considered to quantify the complex coupling between HRV and respiratory signals. Both algorithms were adapted to short-term RR time series. Traditional band-pass filtering was applied to the RR series in the low frequency (LF) and high frequency (HF) bands, and a respiration-based filter bandwidth was also investigated. Both the AMIF and the CMIF were calculated with regard to different time scales as specific complexity measures. The ability of the parameters derived from the AMIF and the CMIF to discriminate emotions was evaluated on a database of video-induced emotion elicitation. Five elicited states, i.e. relax (neutral), joy (positive valence), as well as fear, sadness and anger (negative valences), were considered. Main results: The results revealed that the AMIF applied to the RR time series filtered in the respiration-based band was able to discriminate between the following: relax and both joy and fear, joy and each negative-valence condition, and finally fear and both sadness and anger, all with a statistical significance level of p-value ≤ 0.05, sensitivity, specificity and accuracy higher than 70%, and area under the receiver operating characteristic curve AUC ≥ 0.70. Furthermore, the parameters derived from the AMIF and the CMIF allowed the low signal complexity presented during fear to be characterized with respect to any of the other studied elicited states. Significance: Based on these results, human emotion manifested in the HRV and respiratory signal responses could be characterized by means of information-content complexity.
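    A minimal sketch of lag-dependent auto- and cross-mutual information estimation on short RR and respiration series is given below; the histogram estimator, bin count and lag range are assumptions, not the authors' exact AMIF/CMIF implementation.

```python
# Illustrative sketch (assumptions: histogram estimator, 8 bins, lags in
# samples); not the authors' exact AMIF/CMIF implementation.
import numpy as np

def mutual_information(x, y, bins=8):
    """Histogram estimate of mutual information (in nats) between x and y."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz]))

def amif(rr, max_lag=20, bins=8):
    """Auto-mutual information of an RR series as a function of lag."""
    return [mutual_information(rr[:-k], rr[k:], bins) for k in range(1, max_lag + 1)]

def cmif(rr, resp, max_lag=20, bins=8):
    """Cross-mutual information between RR and respiration versus lag."""
    return [mutual_information(rr[:-k], resp[k:], bins) for k in range(1, max_lag + 1)]
```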

    Emotional State Recognition Based on Physiological Signals

    Emotional state recognition is a crucial task for achieving a new level of Human-Computer Interaction (HCI). Machine learning applications penetrate more and more spheres of everyday life. Recent studies show promising results in analyzing physiological signals (EEG, ECG, GSR) with machine learning to assess emotional state. Commonly, a specific emotion is invoked by playing affective videos or sounds; however, there is no canonical way to interpret emotional state. In this study, we classified affective physiological signals with labels obtained from two emotional state estimation approaches, using machine learning algorithms and heuristic formulas. Comparison of the methods showed that the highest accuracy was achieved using a Random Forest classifier on spectral features from the EEG records; a combination of features from the peripheral physiological signals also showed relatively high classification performance. However, the heuristic formulas and a novel approach to ECG signal classification using a recurrent neural network ultimately failed. Data were taken from the MAHNOB-HCI dataset, a multimodal database collected from 27 subjects, each shown 20 emotional movie fragments. We obtained an unexpected result: describing emotional states with the discrete Ekman paradigm provides better classification results than the contemporary dimensional model, which represents emotions by mapping them onto a Cartesian plane with valence and arousal axes. Our study shows the importance of label selection in the emotion recognition task. Moreover, the dataset has to be suitable for machine learning algorithms. The results obtained may help to select proper physiological signals and emotion labels for further dataset creation and post-processing.
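    The sketch below illustrates the kind of pipeline described, i.e. EEG band-power features fed to a Random Forest classifier with discrete emotion labels; the band definitions, sampling rate and cross-validation setup are assumptions rather than the study's exact configuration.

```python
# Illustrative sketch (assumed band definitions and feature layout): EEG
# band-power features per trial, classified with a Random Forest.
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def band_powers(eeg, fs=256.0):
    """eeg: (n_channels, n_samples) array -> flat vector of band powers."""
    f, psd = welch(eeg, fs=fs, nperseg=int(2 * fs), axis=-1)
    feats = []
    for lo, hi in BANDS.values():
        idx = (f >= lo) & (f < hi)
        feats.append(np.trapz(psd[:, idx], f[idx], axis=-1))
    return np.concatenate(feats)

# Assumed usage: one row of band powers per trial, discrete emotion labels.
# X = np.vstack([band_powers(trial) for trial in trials]); y = labels
# scores = cross_val_score(RandomForestClassifier(n_estimators=300), X, y, cv=5)
```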

    Neurophysiological Assessment of Affective Experience

    In the field of Affective Computing, the affective experience (AX) of the user during interaction with computers is of great interest. The automatic recognition of the affective state, or emotion, of the user is one of the big challenges. In this proposal I focus on affect recognition via physiological and neurophysiological signals. Long-standing evidence from psychophysiological research, and more recently from research in affective neuroscience, suggests that both body and brain physiology are able to indicate the current affective state of a subject. However, regarding the classification of AX, several questions remain unanswered. The principal possibility of AX classification has been shown repeatedly, but its generalisation over different task contexts, eliciting stimulus modalities, subjects or time is seldom addressed. In this proposal I discuss a possible agenda for the further exploration of physiological and neurophysiological correlates of AX over different elicitation modalities and task contexts.

    Fusion of musical contents, brain activity and short term physiological signals for music-emotion recognition

    In this study we propose a multi-modal machine learning approach, combining EEG and audio features for music emotion recognition using a categorical model of emotions. The dataset used consists of film music that was carefully created to induce strong emotions. Five emotion categories were adopted: Fear, Anger, Happy, Tender and Sad. EEG data were obtained from three male participants listening to the labeled music excerpts. Feature-level fusion was adopted to combine the EEG and audio features. The results show that the multimodal system outperformed the unimodal EEG system. Additionally, we evaluated the contribution of each audio feature to the classification performance of the multimodal system. Preliminary results indicate a significant contribution of individual audio features to the classification accuracy. We also found that several of the audio features that noticeably contributed to the classification accuracy were also reported in previous research studying the correlation between audio features and emotion ratings on the same dataset.
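    Feature-level fusion as described here amounts to concatenating the per-excerpt EEG and audio feature vectors before classification; the sketch below illustrates this, with the classifier choice and all names being assumptions rather than the authors' exact setup.

```python
# Illustrative sketch of feature-level fusion (assumed feature matrices and
# classifier choice): EEG and audio features are standardized, concatenated
# per excerpt, and classified into the emotion categories.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def fuse_and_classify(eeg_feats, audio_feats, labels):
    """eeg_feats: (n_trials, n_eeg), audio_feats: (n_trials, n_audio)."""
    X = np.hstack([eeg_feats, audio_feats])          # feature-level fusion
    clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
    return cross_val_score(clf, X, labels, cv=5).mean()
```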

    Physiological Signal Processing in Heart Rate Variability Measurement: A Focus on Spectral Analysis

    Frequency-domain analysis of HRV commonly relies on the Fast Fourier Transform (FFT), which requires resampling of the inherently unevenly sampled heartbeat time series (RR tachogram) to produce an evenly sampled series. However, this resampling is found to introduce a substantial error when estimating the spectrum of an artificial RR tachogram.
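    The sketch below illustrates the resampling step in question, assuming cubic-spline interpolation of the RR tachogram at 4 Hz before Welch/FFT-based spectral estimation, alongside the Lomb-Scargle periodogram, which avoids resampling by operating directly on the unevenly sampled series; all parameter choices are illustrative assumptions.

```python
# Illustrative sketch (assumed cubic-spline interpolation at 4 Hz): the
# resampling step required by FFT/Welch-based HRV analysis, contrasted with
# the Lomb-Scargle periodogram, which works on the uneven RR tachogram.
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import welch, lombscargle

def hrv_spectra(rr_s, fs=4.0):
    """rr_s: successive RR intervals in seconds (unevenly spaced in time)."""
    t = np.cumsum(rr_s)                        # beat occurrence times
    # Even resampling (source of the estimation error discussed above).
    t_even = np.arange(t[0], t[-1], 1.0 / fs)
    rr_even = CubicSpline(t, rr_s)(t_even)
    f_fft, psd_fft = welch(rr_even - rr_even.mean(), fs=fs,
                           nperseg=min(len(rr_even), 256))
    # Lomb-Scargle: no resampling needed (freqs given in rad/s).
    f_ls = np.linspace(0.01, 0.5, 200)
    psd_ls = lombscargle(t, rr_s - rr_s.mean(), 2 * np.pi * f_ls)
    return (f_fft, psd_fft), (f_ls, psd_ls)
```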