Users react differently to relevant and non-relevant tags associated with content, and these spontaneous reactions can be used for labeling large multimedia databases. We present a method to assess tag relevance to images using non-verbal bodily responses, namely electroencephalogram (EEG), facial expressions, and eye gaze. We conducted experiments in which 28 images were shown to 28 subjects, once with correct tags and once with incorrect tags. The goal of our system is to detect responses to non-relevant tags and consequently filter them out. We therefore trained classifiers to detect tag relevance from bodily responses and evaluated the performance of our system using a subject-independent approach. Precision at the top 5% and top 10% of detections was calculated, and the results of different modalities and classifiers were compared. The results show that eye gaze outperforms the other modalities in tag relevance detection, both overall and for top-ranked results.
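The precision-at-top-k evaluation mentioned above can be sketched as follows. This is a minimal illustration, not the authors' implementation; the scores and labels are hypothetical, with scores standing for classifier confidence that a tag is non-relevant and labels marking truly non-relevant tags:

```python
def precision_at_top_percent(scores, labels, percent):
    """Precision among the top `percent`% highest-scoring detections.

    scores: classifier confidence that a tag is non-relevant
    labels: 1 if the tag was truly non-relevant, else 0
    """
    k = max(1, int(len(scores) * percent / 100))
    # Rank detections by descending confidence and keep the top k.
    ranked = sorted(zip(scores, labels), key=lambda p: -p[0])[:k]
    return sum(label for _, label in ranked) / k

# Hypothetical example: 10 detections ranked by confidence.
scores = [0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2, 0.1, 0.05]
labels = [1,   1,   0,   1,   0,   0,   1,   0,   0,   0]
print(precision_at_top_percent(scores, labels, 10))  # k = 1 -> 1.0
print(precision_at_top_percent(scores, labels, 50))  # k = 5 -> 0.6
```

Ranking by confidence and scoring only the top fraction rewards a detector whose most confident outputs are correct, which matches the paper's comparison of modalities at the top 5% and 10% of detections.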