
By Motohri Kon and Takamasa Koshizen

Abstract

This paper describes a new user–machine interaction scheme based on cross-modal computation. The scheme builds on our previous study, which used eye-gaze detection alone to extract users' visual preferences. That interaction scheme proved insufficient, however, because it could not detect emotional intensity. Our proposed interaction therefore shows how different sensor modalities can be combined to extract emotional intensity from given visual stimuli. In addition, because repeated interactions are required to acquire emotional visual stimuli, habituation detection must also be taken into account. We found that the proposed scheme achieves accurate interest estimation through cross-modal computation.
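The abstract describes fusing multiple sensor modalities into one interest estimate while discounting repeated exposures for habituation. A minimal hypothetical sketch of that idea follows; the paper's actual model, fusion weights, and sensor channels are not given here, so every name and parameter below is illustrative only.

```python
import math

def interest_estimate(gaze_score, arousal_score, exposures,
                      w_gaze=0.6, w_arousal=0.4, habituation_rate=0.3):
    """Hypothetical cross-modal interest estimate.

    Fuses two normalized modality scores (each in [0, 1]) with a
    weighted sum, then attenuates the result exponentially with the
    number of prior exposures to model habituation.
    """
    fused = w_gaze * gaze_score + w_arousal * arousal_score
    habituation = math.exp(-habituation_rate * exposures)
    return fused * habituation
```

Under this toy model, the same stimulus yields a lower estimate on each repeated presentation, which is the habituation effect the abstract says must be accounted for.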

Topics: user satisfaction, cross-modal computation, repeated
Year: 2009
OAI identifier: oai:CiteSeerX.psu:10.1.1.134.4941
Provided by: CiteSeerX
Sorry, we are unable to provide the full text but you may find it at the following location(s):
  • http://citeseerx.ist.psu.edu/v... (external link)
  • http://www.cs.mu.oz.au/~lcaved... (external link)

