This paper describes a new user–machine interaction scheme based on cross-modal computation. The scheme builds on our previous study, which used eye-gaze detection alone to extract users' visual preferences. That approach proved insufficient, however, because it could not detect emotional intensity. Our proposed scheme therefore shows how different sensor modalities can be combined to extract emotional intensity from given visual stimuli. In addition, because repeated interactions are required to acquire emotional visual stimuli, habituation detection must also be taken into account. We found that the proposed scheme achieves accurate interest estimation through cross-modal computation.