Analysis and detection of cognitive load and frustration in drivers' speech

By Hynek Boril, Seyed Omid Sadjadi, Tristan Kleinschmidt and John H. L. Hansen

Abstract

Non-driving-related cognitive load and variations in emotional state may impair a driver's ability to control a vehicle and introduce driving errors. Reliable detection of cognitive load and emotion in drivers would benefit the design of active safety systems and other intelligent in-vehicle interfaces. In this study, speech produced by 68 subjects while driving in urban areas is analyzed. A particular focus is on speech production differences between two secondary cognitive tasks, interactions with a co-driver and calls to automated spoken dialog systems (SDS), and between two emotional states during the SDS interactions, neutral and negative. A number of speech parameters are found to vary across the cognitive/emotion classes. The suitability of selected cepstral- and production-based features for automatic cognitive task/emotion classification is investigated. A fusion of GMM/SVM classifiers yields an accuracy of 94.3% in cognitive task classification and 81.3% in emotion classification.
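The abstract's classifier-fusion idea can be sketched as follows. This is an illustrative sketch only, not the authors' implementation: the paper's speech feature extraction is omitted, synthetic two-class feature vectors stand in for cepstral/production features, and the fusion weight and all model hyperparameters are assumptions. Per-class GMMs supply a log-likelihood-ratio score, an SVM supplies a probabilistic score, and the two are combined at the score level.

```python
# Hypothetical sketch of GMM/SVM score-level fusion (not the paper's exact setup).
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Synthetic stand-in features: 200 samples x 13 dims per class (MFCC-like shape).
X0 = rng.normal(0.0, 1.0, (200, 13))   # class 0, e.g. "neutral"
X1 = rng.normal(1.5, 1.0, (200, 13))   # class 1, e.g. "negative"
X = np.vstack([X0, X1])
y = np.array([0] * 200 + [1] * 200)

# One GMM per class; GMM score = log-likelihood ratio of class 1 vs class 0.
gmm0 = GaussianMixture(n_components=4, random_state=0).fit(X0)
gmm1 = GaussianMixture(n_components=4, random_state=0).fit(X1)
gmm_score = gmm1.score_samples(X) - gmm0.score_samples(X)

# SVM with probabilistic outputs; score = P(class 1) - P(class 0).
svm = SVC(probability=True, random_state=0).fit(X, y)
proba = svm.predict_proba(X)
svm_score = proba[:, 1] - proba[:, 0]

# Simple weighted fusion of the normalized scores (weight w is an assumption).
w = 0.5
gmm_norm = gmm_score / (np.abs(gmm_score).max() + 1e-9)
fused = w * gmm_norm + (1 - w) * svm_score
pred = (fused > 0).astype(int)
acc = (pred == y).mean()
```

Score-level fusion like this lets the generative GMM and discriminative SVM compensate for each other's errors; the actual weighting and normalization scheme used in the paper may differ.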

Topics: 080109 Pattern Recognition and Data Mining, 080602 Computer-Human Interaction, 090609 Signal Processing, Cognitive Load, Emotions, Speech Production Variations, Automatic Classification, Pattern Recognition
Publisher: International Speech Communication Association
Year: 2010
OAI identifier: oai:eprints.qut.edu.au:33168
