58 research outputs found

    Feature extraction based on bio-inspired model for robust emotion recognition

    Emotional state identification is an important issue in achieving more natural interactive speech systems. Ideally, these systems should also be able to work in real environments, which generally contain some kind of noise. Several bio-inspired representations have been applied to artificial systems for speech processing under noise conditions. In this work, an auditory signal representation is used to obtain a novel bio-inspired set of features for emotional speech signals. These characteristics, together with other spectral and prosodic features, are used for emotion recognition under noise conditions. Neural models were trained as classifiers, and results were compared to the well-known mel-frequency cepstral coefficients. Results show that, using the proposed representations, it is possible to significantly improve the robustness of an emotion recognition system. The results were also validated in a speaker-independent scheme and with two emotional speech corpora. (Albornoz, Enrique Marcelo; Milone, Diego Humberto; Rufiner, Hugo Leonardo. Instituto de Investigación en Señales, Sistemas e Inteligencia Computacional, CONICET / Universidad Nacional del Litoral, Argentina.)
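    The baseline this abstract compares against is the standard MFCC front end with a neural classifier. Below is a minimal sketch of such a baseline, assuming librosa and scikit-learn; the utterances and labels are random placeholders, not the paper's corpora.

```python
# Minimal sketch of an MFCC-plus-neural-network baseline like the one
# the paper compares against. Utterances and labels are random
# placeholders standing in for real emotional speech corpora.
import numpy as np
import librosa
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

def mfcc_vector(y, sr=16000, n_mfcc=13):
    """Mean and std of the MFCCs: one fixed-length vector per utterance."""
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

rng = np.random.default_rng(0)
utterances = [rng.standard_normal(16000) for _ in range(40)]  # 1 s of noise each
labels = rng.integers(0, 4, size=40)                          # 4 dummy emotion classes

X = np.array([mfcc_vector(u) for u in utterances])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.25, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```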

    Quantification of vascular function changes under different emotion states: A pilot study

    Recent studies have indicated that physiological parameters change with different emotion states. This study aimed to quantify the changes of vascular function at different emotion and sub-emotion states. Twenty young subjects were studied, with their finger photoplethysmographic (PPG) pulses recorded at three distinct emotion states: natural (1 minute), happiness and sadness (10 minutes each). Within the happiness and sadness emotion states, two sub-emotion states (calmness and outburst) were identified from the synchronously recorded videos. Reflection index (RI) and stiffness index (SI), two widely used indices of vascular function, were derived from the PPG pulses to quantify the differences between the three emotion states, as well as between the two sub-emotion states. The results showed that, when compared with the natural emotion, RI and SI decreased in both happiness and sadness emotions. The decreases in RI were significant for both happiness and sadness emotions (both P < 0.01), but the decrease in SI was only significant for the sadness emotion (P < 0.01). Moreover, comparing happiness and sadness emotions, there was a significant difference in RI (P < 0.01), but not in SI (P = 0.9). In addition, significantly larger RI values were observed with the outburst sub-emotion in comparison with the calmness one for both happiness and sadness emotions (both P < 0.01), whereas significantly larger SI values were observed with the outburst sub-emotion only in the sadness emotion (P < 0.05). Moreover, gender hardly influenced the RI and SI results for all three emotion measurements. This pilot study confirmed that vascular function changes with different emotion states can be quantified by simple PPG measurement.
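    RI and SI are simple morphological indices of the PPG pulse. A hedged sketch of how they are commonly computed, following the usual formulation (RI as the diastolic-to-systolic amplitude ratio, SI as subject height over the peak-to-peak transit time); the paper's exact preprocessing and peak-picking rules may differ:

```python
# Common formulation of the two PPG indices; the study's exact
# procedure may differ from this sketch.
import numpy as np
from scipy.signal import find_peaks

def ri_si(pulse, fs, height_m):
    """RI and SI for one baseline-corrected PPG beat.

    pulse: samples of a single pulse wave
    fs: sampling rate (Hz); height_m: subject height (m)
    """
    peaks, _ = find_peaks(pulse, prominence=0.05 * np.ptp(pulse))
    if len(peaks) < 2:
        return np.nan, np.nan                # no distinct diastolic peak found
    sys_i, dia_i = peaks[0], peaks[1]        # systolic, then diastolic peak
    ri = pulse[dia_i] / pulse[sys_i]         # reflection index (amplitude ratio)
    si = height_m / ((dia_i - sys_i) / fs)   # stiffness index, m/s
    return ri, si

# Toy pulse: two Gaussian bumps standing in for systolic/diastolic peaks
t = np.arange(0, 1, 1 / 500)
pulse = np.exp(-((t - 0.2) / 0.04) ** 2) + 0.5 * np.exp(-((t - 0.45) / 0.06) ** 2)
print(ri_si(pulse, fs=500, height_m=1.75))
```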

    Revealing Real-Time Emotional Responses: a Personalized Assessment based on Heartbeat Dynamics

    Emotion recognition through computational modeling and analysis of physiological signals has been widely investigated in the last decade. Most of the proposed emotion recognition systems require relatively long multivariate time series and do not provide accurate real-time characterizations from short time series. To overcome these limitations, we propose a novel personalized probabilistic framework able to characterize the emotional state of a subject through the analysis of heartbeat dynamics exclusively. The study includes thirty subjects presented with a set of standardized images gathered from the International Affective Picture System, alternating levels of arousal and valence. Due to the intrinsic nonlinearity and nonstationarity of the RR-interval series, a specific point-process model was devised for instantaneous identification, considering autoregressive nonlinearities up to the third order according to the Wiener-Volterra representation, thus tracking very fast stimulus-response changes. Features from the instantaneous spectrum and bispectrum, as well as the dominant Lyapunov exponent, were extracted and used as input features to a support vector machine for classification. Results, estimating emotions every 10 seconds, achieve an overall accuracy of 79.29% in recognizing four emotional states based on the circumplex model of affect, with 79.15% on the valence axis and 83.55% on the arousal axis.
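    The heavy lifting here is the point-process feature extraction, which is beyond a short snippet; the sketch below covers only the final stage described in the abstract: mapping valence/arousal ratings to the four circumplex quadrants and classifying precomputed features with an SVM. All arrays are random placeholders.

```python
# Final classification stage only: circumplex-quadrant labels plus an
# SVM over precomputed heartbeat features. The point-process spectrum/
# bispectrum features themselves are not reproduced here; X is a
# random placeholder with a plausible shape.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

def quadrant(valence, arousal):
    """0..3 label for the four quadrants of the circumplex model."""
    return (2 if arousal >= 0 else 0) + (1 if valence >= 0 else 0)

rng = np.random.default_rng(3)
X = rng.standard_normal((300, 20))   # one row per 10 s window
val = rng.uniform(-1, 1, 300)        # centred valence ratings
aro = rng.uniform(-1, 1, 300)        # centred arousal ratings
y = np.array([quadrant(v, a) for v, a in zip(val, aro)])

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
print("5-fold accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```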

    Affective computing in virtual reality: emotion recognition from brain and heartbeat dynamics using wearable sensors

    Affective Computing has emerged as an important field of study that aims to develop systems that can automatically recognize emotions. Up to the present, elicitation has been carried out with non-immersive stimuli. This study, on the other hand, aims to develop an emotion recognition system for affective states evoked through Immersive Virtual Environments. Four alternative virtual rooms were designed to elicit the four possible arousal-valence combinations, as described in each quadrant of the Circumplex Model of Affects. An experiment involving the recording of the electroencephalography (EEG) and electrocardiography (ECG) of sixty participants was carried out. A set of features was extracted from these signals using various state-of-the-art metrics that quantify brain and cardiovascular linear and nonlinear dynamics, which were input into a Support Vector Machine classifier to predict the subject's arousal and valence perception. The model's accuracy was 75.00% along the arousal dimension and 71.21% along the valence dimension. Our findings validate the use of Immersive Virtual Environments to elicit and automatically recognize different emotional states from neural and cardiac dynamics; this development could have novel applications in fields as diverse as Architecture, Health, Education and Videogames. This work was supported by the Ministerio de Economía y Competitividad, Spain (Project TIN2013-45736-R).
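    A minimal sketch of the classification setup the abstract describes: one binary SVM per affective dimension over the EEG/ECG feature set. The subject-grouped cross-validation is an assumption here (a common way to keep a participant out of both train and test folds; the paper's exact protocol may differ), and all arrays are random placeholders.

```python
# One binary SVM per affective dimension, with subject-grouped
# cross-validation so no participant appears in both train and test
# folds. Data are random placeholders (60 participants x 4 rooms).
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import GroupKFold, cross_val_score

rng = np.random.default_rng(1)
X = rng.standard_normal((240, 30))       # EEG/ECG features per trial
subjects = np.repeat(np.arange(60), 4)   # participant id for each trial
y_arousal = rng.integers(0, 2, 240)      # high/low arousal labels
y_valence = rng.integers(0, 2, 240)      # positive/negative valence labels

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
cv = GroupKFold(n_splits=5)
for name, y in [("arousal", y_arousal), ("valence", y_valence)]:
    acc = cross_val_score(clf, X, y, cv=cv, groups=subjects).mean()
    print(name, round(acc, 3))
```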

    Summary and Conclusions


    Emotion Recognition using Speech Features

    “Emotion Recognition Using Speech Features” covers emotion-specific features present in speech and discusses suitable models for capturing emotion-specific information to distinguish different emotions. The content of this book is important for designing and developing natural and sophisticated speech systems. Drs. Rao and Koolagudi discuss how emotion-specific information is embedded in speech and how to acquire emotion-specific knowledge using appropriate statistical models. Additionally, the authors provide information about using evidence derived from various features and models. The acquired emotion-specific knowledge is useful for synthesizing emotions. Discussion includes global and local prosodic features at syllable, word and phrase levels, helpful for capturing emotion-discriminative information; the use of complementary evidence obtained from excitation sources, vocal tract systems and prosodic features to enhance emotion recognition performance; and proposed multi-stage and hybrid models for improving emotion recognition performance.

    Identification of Hindi Dialects and Emotions using Spectral and Prosodic features of Speech

    In this paper, we explore speech features to identify Hindi dialects and emotions. A dialect is any distinguishable variety of a language spoken by a group of people, and emotions lend naturalness to speech. In this work, five prominent dialects of Hindi are considered for the identification task: Chhattisgarhi (spoken in central India), Bengali (Bengali-accented Hindi spoken in the eastern region), Marathi (Marathi-accented Hindi spoken in the western region), General (Hindi spoken in the northern region) and Telugu (Telugu-accented Hindi spoken in the southern region). Along with dialect identification, we also carry out emotion recognition. The speech database considered for the dialect identification task consists of spontaneous speech spoken by male and female speakers. The Indian Institute of Technology Kharagpur Simulated Emotion Hindi Speech Corpus (IITKGP-SEHSC) is used for the emotion recognition studies. The emotions considered in this study are anger, disgust, fear, happy, neutral and sad. Prosodic and spectral features extracted from speech are used for discriminating the dialects and emotions. Spectral features are represented by Mel-frequency cepstral coefficients (MFCCs), and prosodic features are represented by syllable durations and pitch and energy contours. Auto-associative neural network (AANN) models and Support Vector Machines (SVMs) are explored for capturing the dialect-specific and emotion-specific information from the above features. AANN models are expected to capture the nonlinear relations specific to dialects or emotions through the distributions of feature vectors, while SVMs perform dialect or emotion classification based on discriminative characteristics among the dialects or emotions. Classification systems are developed separately for dialect classification and emotion classification. The recognition performance of the dialect identification and emotion recognition systems is found to be 81% and 78%, respectively.
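    The AANN scheme in the abstract amounts to training one autoencoder per class and assigning a test vector to the class whose network reconstructs it with the lowest error. A hedged sketch of that idea (MLPRegressor stands in for the AANN; the 13-dimensional vectors and per-class offsets are random placeholders):

```python
# One autoencoder per emotion class; classify by minimum reconstruction
# error. MLPRegressor stands in for the AANN; data are placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
classes = ["anger", "disgust", "fear", "happy", "neutral", "sad"]
train = {c: rng.standard_normal((50, 13)) + i   # 13-dim feature vectors
         for i, c in enumerate(classes)}        # offset i separates classes

models = {}
for c, Xc in train.items():
    ae = MLPRegressor(hidden_layer_sizes=(8,), max_iter=1000)
    ae.fit(Xc, Xc)               # identity mapping through a bottleneck
    models[c] = ae

def classify(x):
    """Pick the class whose autoencoder reconstructs x with least error."""
    errs = {c: np.mean((m.predict(x[None, :]) - x) ** 2)
            for c, m in models.items()}
    return min(errs, key=errs.get)

print(classify(train["happy"][0]))
```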

    Robust emotion recognition using spectral and prosodic features

    In this brief, the authors discuss recently explored spectral (sub-segmental and pitch-synchronous) and prosodic (global and local features at word and syllable levels in different parts of the utterance) features for discerning emotions in a robust manner. The authors also delve into the complementary evidence obtained from excitation source, vocal tract system and prosodic features for the purpose of enhancing emotion recognition performance. Features based on speaking-rate characteristics are explored with the help of multi-stage and hybrid models for further improving emotion recognition performance. The proposed spectral and prosodic features are evaluated on a real-life emotional speech corpus.
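    One simple way to combine such complementary evidence is weighted score-level fusion of per-stream classifiers; the sketch below shows that pattern with made-up posteriors and weights (the brief's actual combination scheme may differ):

```python
# Weighted score-level fusion of per-feature-stream classifiers.
# Streams, posteriors and weights are illustrative placeholders.
import numpy as np

def fuse(scores, weights):
    """scores: dict stream -> (n_classes,) posterior vector."""
    total = sum(w * scores[s] for s, w in weights.items())
    return int(np.argmax(total))

scores = {                       # hypothetical posteriors for one utterance
    "excitation":  np.array([0.1, 0.6, 0.3]),
    "vocal_tract": np.array([0.2, 0.5, 0.3]),
    "prosody":     np.array([0.5, 0.2, 0.3]),
}
weights = {"excitation": 0.3, "vocal_tract": 0.4, "prosody": 0.3}
print("fused class index:", fuse(scores, weights))
```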

    Selection of Suitable Features for Modeling the Durations of Syllables

