Multimodal interaction with emotional feedback

By Francesco Cutugno, Antonio Origlia and Roberto Rinaldi

Abstract

In this paper we extend a multimodal framework based on speech and gestures to include emotional information by means of anger detection. In recent years, multimodal interaction has become of great interest thanks to the increasing availability of mobile devices supporting a number of different interaction modalities. Taking intelligent decisions is a complex task for automated systems: multimodality requires procedures to integrate different events so they can be interpreted as a single intention of the user, and it must take into account that different kinds of information may come from a single channel, as in the case of speech, which conveys a user's intentions through both syntax and prosody.
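The integration step described in the abstract — grouping events from different channels into a single user intention — can be illustrated with a minimal sketch. This is a hypothetical late-fusion example, not the paper's actual method: the `Event` type, the `FUSION_WINDOW` threshold, and the event labels are all assumptions made for illustration.

```python
from dataclasses import dataclass

@dataclass
class Event:
    channel: str      # e.g. "speech" or "gesture" (assumed channel names)
    label: str        # recognized command or pointing target (illustrative)
    timestamp: float  # seconds since interaction start

# Hypothetical fusion window: events from different channels occurring
# within this many seconds are treated as parts of one user intention.
FUSION_WINDOW = 1.5

def fuse(events):
    """Group time-ordered events into candidate user intentions."""
    intentions = []
    current = []
    for ev in sorted(events, key=lambda e: e.timestamp):
        # Start a new intention when the gap to the previous event is too large.
        if current and ev.timestamp - current[-1].timestamp > FUSION_WINDOW:
            intentions.append(current)
            current = []
        current.append(ev)
    if current:
        intentions.append(current)
    return intentions

events = [
    Event("speech", "delete this", 0.2),
    Event("gesture", "point:file_icon", 0.6),
    Event("speech", "open folder", 3.0),
]
print([[e.label for e in group] for group in fuse(events)])
# → [['delete this', 'point:file_icon'], ['open folder']]
```

Here the spoken command and the nearby pointing gesture fuse into one intention, while the later utterance starts a new one; a real system would additionally weight channels and attach emotional cues (e.g. an anger score from prosody) to each fused intention.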

Year: 2014
OAI identifier: oai:CiteSeerX.psu:10.1.1.416.1597
Provided by: CiteSeerX
Download PDF:
Sorry, we are unable to provide the full text but you may find it at the following location(s):
  • http://citeseerx.ist.psu.edu/v... (external link)
  • http://ceur-ws.org/Vol-860/pap... (external link)