A visual interface for augmented human olfactory perception in the context of monitoring air quality - Issue 1.2.0

Abstract

This report presents the experiments carried out to investigate how an intelligent adaptive interface could support inhabitants in providing accurate smell descriptions. We investigated the effect of multimodal odor cues on human smell identification performance to inform the development of an adaptive interface for a mobile application. A data elicitation study (N=429) collected people's olfactory associations when exposed to nine sample odors. Based on these associations, we developed a multimodal interface that offered textual, image, or combined cues to augment subjects' odor perception, and 190 new subjects used the interface to identify odors. We found that participants' smell identification performance improved when the interface offered visual (image and/or text) cues, and that participants experienced the combination of image and textual cues as the most useful and enjoyable. These results show that human smell perception can be successfully enhanced with the help of an adaptive odor cue interface. Building on these results, we developed a first prototype of an intelligent interface that automatically generates cues to assist human smell identification. The prototype is based on causal models (Bayesian Networks), for which we extracted observation models for a few relevant chemicals; due to the lack of data, not all types of chemicals could be covered. Nevertheless, we have shown that it is possible to construct models that support detection and localization based on human reports.
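To make the causal-model idea concrete, the sketch below shows how a Bayesian observation model could relate an odor source to a human smell report and be inverted with Bayes' rule to support detection. All odor names, report labels, and probabilities are hypothetical illustrations, not values from the study:

```python
# Illustrative sketch of a causal observation model for human odor reports,
# in the spirit of the Bayesian Network prototype described above.
# All names and numbers here are made up for demonstration.

# Prior over the odor source present in the air: P(odor)
prior = {"ammonia": 0.2, "sulfur": 0.3, "none": 0.5}

# Observation model: P(report | odor), i.e. how likely an inhabitant
# is to describe each odor source with a given label.
likelihood = {
    "ammonia": {"sharp": 0.70, "rotten": 0.10, "nothing": 0.20},
    "sulfur":  {"sharp": 0.20, "rotten": 0.60, "nothing": 0.20},
    "none":    {"sharp": 0.05, "rotten": 0.05, "nothing": 0.90},
}

def posterior(report):
    """Invert the observation model with Bayes' rule: P(odor | report)."""
    unnorm = {odor: prior[odor] * likelihood[odor][report] for odor in prior}
    z = sum(unnorm.values())
    return {odor: p / z for odor, p in unnorm.items()}

post = posterior("rotten")
print(max(post, key=post.get))  # most probable odor source given a "rotten" report
```

Combining such per-report posteriors across many located reports is what would allow the prototype to move from detection toward localization.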
