
    Augmented reality flavor: cross-modal mapping across gustation, olfaction, and vision

    Gustatory display research is still in its infancy, despite taste being one of the essential senses that humans exercise every day while eating and drinking. Indeed, foraging and feeding are among the most important and frequent tasks that our brain deals with every day. Recent studies by psychologists and cognitive neuroscientists have revealed how any flavor experience relies on the complex multisensory integration of cues from all the human senses. The perception of flavor is multisensory and involves combinations of gustatory and olfactory stimuli. The cross-modal mapping between these modalities needs further exploration in virtual environments and simulation, especially for liquid food. In this paper, we present a customized wearable Augmented Reality (AR) system and olfaction display to study the effect of vision and olfaction on the gustatory sense. A user experiment and extensive analysis were conducted to study the influence of each stimulus on the overall flavor, including other factors such as age, previous experience with Virtual Reality (VR)/AR, and beverage consumption. The results showed that smell contributes strongly to the flavor, with a smaller contribution from vision. However, the combination of these stimuli can deliver a richer experience and a higher belief rate. Beverage consumption had a significant effect on the flavor belief rate. Experience is correlated with stimulus, age is correlated with belief rate, and both indirectly affected the belief rate. © 2021, The Author(s).
    This work was partly supported by NPRP Grant #NPRP 11S-1219-170106 from the Qatar National Research Fund (a member of Qatar Foundation). This work also would not have been possible without the effort of students Babkir Elnimah, Ali Hazi, and Ahmed Ibrahim as part of their senior project.
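    The kind of correlation analysis the abstract reports (e.g., age vs. belief rate) can be sketched as follows. This is a minimal illustration with invented participant data, not the study's actual results; `pearson` is a plain-Python helper, not the authors' analysis code.

    ```python
    # Hypothetical sketch of a factor-correlation check like the one described
    # in the abstract: does participant age co-vary with the flavor "belief rate"?
    # All data below are invented for illustration only.

    def pearson(xs, ys):
        """Pearson correlation coefficient between two equal-length samples."""
        n = len(xs)
        mx = sum(xs) / n
        my = sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    # Invented sample: participant ages and their belief rates (0..1).
    ages = [19, 23, 25, 31, 38, 45, 52]
    belief = [0.90, 0.85, 0.80, 0.70, 0.65, 0.55, 0.50]

    r = pearson(ages, belief)
    # A strongly negative r would indicate belief rate decreasing with age
    # in this made-up sample.
    ```

    In practice such a study would use a statistics package (e.g., SciPy) and report significance alongside the coefficient; the helper above only computes the raw correlation.
    
    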

    Incorporating Olfactory into a Multi-Modal Surgical Simulation

    This paper proposes a novel multimodal interactive surgical simulator that incorporates haptic and olfactory feedback alongside traditional visual feedback. A scent diffuser was developed to produce odors when errors occur, and a haptic device was used to provide the sense of touch to the user. Preliminary results show that adding smell as an aid to the simulation enhanced memory retention, which led to better performance. Copyright 2014 The Institute of Electronics, Information and Communication Engineers.
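    The error-triggered scent cue described above can be sketched as a simple event handler: the simulator reports trainee events, and each error fires the diffuser once. `ScentDiffuser`, `pulse`, and `record_event` are invented names for illustration, not the authors' API or hardware interface.

    ```python
    # Hypothetical sketch of scent-as-error-feedback: an odor cue is released
    # only when the trainee makes an error, as in the simulator described above.

    class ScentDiffuser:
        """Stand-in for the hardware diffuser; counts pulses instead of emitting odor."""

        def __init__(self):
            self.pulses = 0

        def pulse(self, duration_ms=500):
            # On real hardware this would open the diffuser valve briefly.
            self.pulses += 1

    class SurgicalSimulator:
        def __init__(self, diffuser):
            self.diffuser = diffuser

        def record_event(self, event):
            # Fire the scent cue only on errors, so the odor becomes a
            # memorable negative-feedback signal during training.
            if event == "error":
                self.diffuser.pulse()

    sim = SurgicalSimulator(ScentDiffuser())
    for e in ["incision", "error", "suture", "error"]:
        sim.record_event(e)
    # sim.diffuser.pulses is now 2: one pulse per error event.
    ```

    Coupling the cue to errors (rather than emitting scent continuously) is what lets the odor act as a distinct feedback signal the trainee can associate with mistakes.
    
    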