1,513 research outputs found

    Neurophysiological Assessment of Affective Experience

    In the field of Affective Computing, the affective experience (AX) of the user during interaction with computers is of great interest. The automatic recognition of the affective state, or emotion, of the user is one of the big challenges. In this proposal I focus on affect recognition via physiological and neurophysiological signals. Long-standing evidence from psychophysiological research, and more recently from research in affective neuroscience, suggests that both body and brain physiology are able to indicate the current affective state of a subject. However, regarding the classification of AX, several questions are still unanswered. The principal possibility of AX classification has been shown repeatedly, but its generalisation over different task contexts, eliciting stimulus modalities, subjects or time is seldom addressed. In this proposal I discuss a possible agenda for the further exploration of physiological and neurophysiological correlates of AX over different elicitation modalities and task contexts.
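    The central open question above is generalisation across subjects, task contexts and elicitation modalities. A common way to probe cross-subject generalisation is leave-one-subject-out cross-validation; the sketch below is a minimal illustration under that assumption, with a purely hypothetical feature matrix and classifier (the proposal does not prescribe any particular model).

```python
# Minimal sketch of leave-one-subject-out (LOSO) evaluation for affect
# classification from physiological features. X, y and groups are
# hypothetical placeholders, not data or methods from the proposal.
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_subjects, trials_per_subject, n_features = 10, 40, 32
X = rng.normal(size=(n_subjects * trials_per_subject, n_features))  # physiological features
y = rng.integers(0, 2, size=n_subjects * trials_per_subject)        # binary affect label
groups = np.repeat(np.arange(n_subjects), trials_per_subject)       # subject ID per trial

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, groups=groups, cv=LeaveOneGroupOut())
print(f"LOSO accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```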

    Differences in Brain Waves and Blood Pressure by Listening to Quran-e-Kareem and Music

    Background: Quranic recitation and music do not share any features in terms of content besides the use of melodies, but it is a common belief that both have a positive effect on reducing blood pressure and anxiety levels of patients. This research investigates and compares the effects of listening to Quranic recitation and soft music on human brain waves, especially Alpha and Beta waves, by electroencephalogram (EEG) using Power-Lab. Material and Methods: A clinical trial was carried out in the Physiology Department of Islamabad Medical and Dental College. There were 22 participants, divided into two groups (A and B) with 11 participants in each group. Group A included students aged 20-25 years and Group B comprised teaching faculty aged 40-60 years. All the study participants were Urdu-speaking, Pakistani Muslims with normal hearing. Sample selection was based on non-random convenience sampling. A paired t-test was used to compare the means of Alpha and Beta wave amplitudes, with a p value < 0.05 considered statistically significant. Results: Listening to Quranic recitation resulted in greater amplitude of Alpha waves in both the younger and older age groups (p=0.01). The cross-comparison of systolic blood pressure at rest and after music for Group A was significant (p=0.04), indicating that soft music increased systolic blood pressure in the younger group. The diastolic blood pressure comparison showed a decrease after Tilawat in the older age group (p<0.05). Conclusion: EEG showed that Quran generates comparatively higher amplitudes of Alpha than Beta waves, which reflects the calmness and relaxation of the participants while listening to Quranic recitation. Furthermore, there was a mild reduction in diastolic blood pressure in older subjects after listening to Quranic recitation.
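    The core statistical comparison described above is a paired t-test on per-participant wave amplitudes across the two listening conditions. A minimal sketch of that test, using hypothetical amplitude values rather than the study's Power-Lab recordings:

```python
# Minimal sketch of a paired t-test comparing Alpha-wave amplitude in the
# same participants under two listening conditions. Values are hypothetical.
import numpy as np
from scipy import stats

alpha_quran = np.array([42.1, 39.5, 44.0, 41.2, 38.9, 40.7, 43.3, 39.9, 42.8, 41.5, 40.1])  # uV
alpha_music = np.array([38.4, 37.2, 40.1, 39.0, 36.8, 38.5, 39.9, 37.6, 40.3, 38.8, 37.9])  # uV

t_stat, p_value = stats.ttest_rel(alpha_quran, alpha_music)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
if p_value < 0.05:
    print("Alpha-wave amplitude differs significantly between the two conditions")
```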

    Multimodal assessment of emotional responses by physiological monitoring: novel auditory and visual elicitation strategies in traditional and virtual reality environments

    This doctoral thesis explores novel strategies to quantify emotions and listening effort through monitoring of physiological signals. Emotions are a complex aspect of the human experience, playing a crucial role in our survival and adaptation to the environment. The study of emotions fosters important applications, such as Human-Computer and Human-Robot interaction or clinical assessment and treatment of mental health conditions such as depression, anxiety, stress, chronic anger, and mood disorders. Listening effort is also an important area of study, as it provides insight into listeners' challenges that are usually not identified by traditional audiometric measures. The research is divided into three lines of work, each with a unique emphasis on the methods of emotion elicitation and the stimuli that are most effective in producing emotional responses, with a specific focus on auditory stimuli. The research led to the creation of three experimental protocols, as well as the use of an available online protocol, for studying emotional responses through monitoring of both peripheral and central physiological signals, such as skin conductance, respiration, pupil dilation, electrocardiogram, blood volume pulse, and electroencephalography. A protocol was created for the study of listening effort using a speech-in-noise test designed to be short and not to induce fatigue. The results revealed that listening effort is a complex problem that cannot be studied with a univariate approach, thus necessitating the use of multiple physiological markers to capture different physiological dimensions. Specifically, the findings demonstrate a strong association between the level of auditory exertion and the amount of attention and involvement directed towards the stimuli, with readily comprehensible stimuli eliciting different responses from those that demand greater effort. Continuing with the auditory domain, peripheral physiological signals were studied in order to discriminate four emotions elicited in a subject who listened to music for 21 days, using a previously designed and publicly available protocol. Surprisingly, the processed physiological signals clearly separated the four emotions at the physiological level, demonstrating that music, which is not studied extensively in the literature, can be an effective stimulus for eliciting emotions. Following these results, a flat-screen protocol was created to compare physiological responses to purely visual, purely auditory, and combined audiovisual emotional stimuli. The results show that auditory stimuli are more effective in separating emotions at the physiological level, and the subjects were found to be much more attentive during the audio-only phase. To overcome the limitations of emotional protocols carried out in a laboratory environment, which may elicit fewer emotions because it is an unnatural setting for the subjects under study, a final emotion elicitation protocol was created using virtual reality. Scenes similar to reality were created to elicit four distinct emotions. At the physiological level, it was noted that this environment is more effective in eliciting emotions. To our knowledge, this is the first protocol specifically designed for virtual reality that elicits diverse emotions. Furthermore, even in terms of classification, the use of virtual reality has been shown to be superior to traditional flat-screen protocols, opening the door to virtual reality for the study of conditions related to emotional control.

    Fusion of musical contents, brain activity and short term physiological signals for music-emotion recognition

    In this study we propose a multi-modal machine learning approach combining EEG and audio features for music emotion recognition using a categorical model of emotions. The dataset used consists of film music that was carefully created to induce strong emotions. Five emotion categories were adopted: Fear, Anger, Happy, Tender and Sad. EEG data were obtained from three male participants listening to the labeled music excerpts. Feature-level fusion was adopted to combine EEG and audio features. The results show that the multimodal system outperformed the EEG mono-modal system. Additionally, we evaluated the contribution of each audio feature to the classification performance of the multimodal system. Preliminary results indicate a significant contribution of individual audio features to the classification accuracy. We also found that several audio features that noticeably contributed to the classification accuracy were also reported in previous research studying the correlation between audio features and emotion ratings using the same dataset.
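    Feature-level fusion here means concatenating the EEG and audio feature vectors for each excerpt before training a single classifier. A minimal sketch under that assumption, with hypothetical feature arrays and a generic classifier (the paper does not specify a particular toolkit):

```python
# Minimal sketch of feature-level (early) fusion of EEG and audio features
# for categorical music-emotion classification. Feature values and the
# choice of classifier are hypothetical placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_excerpts = 100
eeg_features = rng.random((n_excerpts, 64))    # e.g. band power per channel
audio_features = rng.random((n_excerpts, 20))  # e.g. MFCCs, spectral descriptors
labels = rng.choice(["Fear", "Anger", "Happy", "Tender", "Sad"], size=n_excerpts)

# Early fusion: concatenate the two feature sets along the feature axis.
fused = np.hstack([eeg_features, audio_features])

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("EEG only:", cross_val_score(clf, eeg_features, labels, cv=5).mean())
print("Fused   :", cross_val_score(clf, fused, labels, cv=5).mean())
```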

    Emotion and Stress Recognition Related Sensors and Machine Learning Technologies

    This book includes impactful chapters that present scientific concepts, frameworks, architectures and ideas on sensing technologies and machine learning techniques. These are relevant in tackling the following challenges: (i) the field readiness and use of intrusive sensor systems and devices for capturing biosignals, including EEG sensor systems, ECG sensor systems and electrodermal activity sensor systems; (ii) the quality assessment and management of sensor data; (iii) data preprocessing, noise filtering and calibration concepts for biosignals; (iv) the field readiness and use of nonintrusive sensor technologies, including visual sensors, acoustic sensors, vibration sensors and piezoelectric sensors; (v) emotion recognition using mobile phones and smartwatches; (vi) body area sensor networks for emotion and stress studies; (vii) the use of experimental datasets in emotion recognition, including dataset generation principles and concepts, quality assurance and emotion elicitation material and concepts; (viii) machine learning techniques for robust emotion recognition, including graphical models, neural network methods, deep learning methods, statistical learning and multivariate empirical mode decomposition; (ix) subject-independent emotion and stress recognition concepts and systems, including facial expression-based systems, speech-based systems, EEG-based systems, ECG-based systems, electrodermal activity-based systems, multimodal recognition systems and sensor fusion concepts; and (x) emotion and stress estimation and forecasting from a nonlinear dynamical system perspective.

    Voluntary Control of Breathing According to the Breathing Pattern During Listening to Music and Non-Contact Measurement of Heart Rate and Respiration

    We investigated whether listening to songs changes the breathing pattern, which in turn changes autonomic responses such as heart rate (HR) and heart rate variability (HRV); whether the change in breathing pattern is merely a byproduct of listening to songs; or whether both the change in breathing pattern and listening to songs cause changes in autonomic responses. Seven subjects (4 males and 3 females) participated in a pilot study where they listened to two types of songs and used a custom-developed biofeedback program to control their breathing pattern to match the one recorded while listening to the songs. Coherences between EEG, breathing pattern and RR intervals (RRI) were calculated to study the interaction with neural responses. Trends in HRV varied only during listening to songs, suggesting that the autonomic response was affected by listening to songs irrespective of the control of breathing. Effective coherence during songs with spontaneous breathing was higher than during silence and during controlled breathing. These results, although preliminary, suggest that both listening to songs and changes in breathing pattern alter the autonomic response, but the effect of listening to songs may surpass the effect of the breathing changes. We also explored the feasibility of non-contact measurement of HR and breathing rate (BR) using the recently developed Facemesh and other methods for tracking regions of interest in videos of the subjects' faces. Performance was better for BR than for HR, and better than that of currently used methods. However, refinement of the approach would be needed to reach the precision required for detecting subtle changes.
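    The coherence analysis mentioned above can be approximated by resampling the unevenly spaced RR-interval series onto a regular time grid and computing magnitude-squared coherence against the respiration signal. The sketch below uses synthetic signals and illustrative parameters; it is not the study's actual pipeline and omits the EEG channels.

```python
# Minimal sketch of coherence between a respiration signal and an RR-interval
# (RRI) series for studying cardiorespiratory coupling. Signals are synthetic.
import numpy as np
from scipy.interpolate import interp1d
from scipy.signal import coherence

fs = 4.0                          # resampling rate for the RRI tachogram, Hz
t = np.arange(0, 300, 1 / fs)     # 5-minute analysis window

# Synthetic respiration at ~0.25 Hz and an RRI series modulated at the same
# rate (mimicking respiratory sinus arrhythmia).
respiration = np.sin(2 * np.pi * 0.25 * t)
beat_idx = np.arange(375)
beat_times = np.cumsum(0.8 + 0.05 * np.sin(2 * np.pi * 0.25 * beat_idx * 0.8))
rri = np.diff(beat_times)         # seconds between successive beats

# Interpolate the unevenly sampled RRI onto the same grid as the respiration.
rri_even = interp1d(beat_times[1:], rri, bounds_error=False,
                    fill_value="extrapolate")(t)

f, coh = coherence(respiration, rri_even, fs=fs, nperseg=256)
hf_band = (f >= 0.15) & (f <= 0.4)   # respiratory (high-frequency) band
print(f"Mean coherence in 0.15-0.4 Hz band: {coh[hf_band].mean():.2f}")
```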

    Neurophysiological Effects of Harmonisation: The Effect of Harmonisation on Heart Rate Variability, Respiratory Rate and Electroencephalograph

    Harmonisation is a practice whereby the harmoniser, who is centered in silent prayer, opens and nourishes the subject's chakras using touch. This technique has been widely used since 1933, with substantial anecdotal evidence about its benefits, but no published, peer-reviewed data. This preliminary study aimed to discover whether standard physiological measuring techniques can detect any significant changes in the central and autonomic nervous systems and the cardiopulmonary system during harmonisation. A simple, comparative design was used, with one experimental group of 20 self-selecting, healthy women, naive to harmonisation. The results were compared with reference data, matched for age and gender, from non-intervention control studies conducted by the same experimenters in the same neurophysiological laboratory. A 30-minute recording session determined baseline, intervention and stabilization measurements of electroencephalographic, electrocardiographic and respiratory data. A significant lowering of brain activity was found during the opening phase of harmonisation, implying a state of increased mental focus coupled with a sense of calmness and relaxation, while significant changes to heart beat/respiration ratios were observed during the nourishing phase. This suggests that different physiological processes affecting the central and autonomic nervous systems and the cardiopulmonary system may occur during different phases of harmonisation.
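    The heart beat/respiration ratio reported here is, at its simplest, the number of heart beats per breath over the same measurement window. A minimal sketch of that arithmetic with hypothetical rates (not data from the study):

```python
# Minimal sketch of the heart-beat/respiration ratio compared across phases
# of a session. The rates below are hypothetical placeholders.
def beat_respiration_ratio(heart_rate_bpm: float, respiratory_rate_bpm: float) -> float:
    """Heart beats per breath over the same measurement window."""
    return heart_rate_bpm / respiratory_rate_bpm

baseline = beat_respiration_ratio(heart_rate_bpm=72.0, respiratory_rate_bpm=15.0)
nourishing = beat_respiration_ratio(heart_rate_bpm=66.0, respiratory_rate_bpm=11.0)
print(f"baseline ratio        : {baseline:.1f} beats/breath")
print(f"nourishing-phase ratio: {nourishing:.1f} beats/breath")
```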

    Approaches, applications, and challenges in physiological emotion recognition — a tutorial overview

    An automatic emotion recognition system can serve as a fundamental framework for various applications in daily life, from monitoring emotional well-being to improving the quality of life through better emotion regulation. Understanding the process of emotion manifestation becomes crucial for building emotion recognition systems. An emotional experience results in changes not only in interpersonal behavior but also in physiological responses. Physiological signals are one of the most reliable means for recognizing emotions since individuals cannot consciously manipulate them for a long duration. These signals can be captured by medical-grade wearable devices, as well as commercial smartwatches and smart bands. With the shift in research direction from the laboratory to unrestricted daily life, commercial devices have been employed ubiquitously. However, this shift has introduced several challenges, such as low data quality, dependency on subjective self-reports, unlimited movement-related changes, and artifacts in physiological signals. This tutorial provides an overview of practical aspects of emotion recognition, such as experiment design, properties of different physiological modalities, existing datasets, suitable machine learning algorithms for physiological data, and several applications. It aims to provide the necessary psychological and physiological backgrounds through various emotion theories and the physiological manifestation of emotions, thereby laying a foundation for emotion recognition. Finally, the tutorial discusses open research directions and possible solutions.
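    One practical issue the tutorial highlights for commercial wearables, movement-related artifacts and low data quality, is often handled with windowing, simple statistical features and crude outlier rejection before any classifier is trained. The sketch below illustrates that idea; the sampling rate, window length and threshold are illustrative assumptions, not recommendations from the tutorial.

```python
# Minimal sketch of sliding-window feature extraction from a 1-D wearable
# physiological signal (e.g. blood volume pulse), with a crude amplitude-based
# artifact check. All parameters are illustrative assumptions.
import numpy as np

def window_features(signal, fs=64, win_s=30, artifact_z=5.0):
    """Return per-window mean/std/range features, skipping windows with extreme samples."""
    win = fs * win_s
    feats = []
    for start in range(0, len(signal) - win + 1, win):
        seg = signal[start:start + win]
        z = np.abs((seg - seg.mean()) / (seg.std() + 1e-12))
        if z.max() > artifact_z:      # crude motion-artifact rejection
            continue
        feats.append([seg.mean(), seg.std(), seg.max() - seg.min()])
    return np.array(feats)

# Example with a synthetic 5-minute recording at 64 Hz.
rng = np.random.default_rng(0)
sig = np.sin(2 * np.pi * 1.2 * np.arange(0, 300, 1 / 64)) + 0.05 * rng.standard_normal(300 * 64)
print(window_features(sig).shape)     # (n_clean_windows, 3)
```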
