2 research outputs found

    Augmented reality flavor: cross-modal mapping across gustation, olfaction, and vision

    Gustatory display research is still in its infancy, despite taste being one of the essential senses that humans exercise every day while eating and drinking. Indeed, foraging and feeding are among the most important and frequent tasks our brain handles each day. Recent studies by psychologists and cognitive neuroscientists have revealed how flavor experiences rely on the multisensory integration of cues from all the human senses. The perception of flavor is multisensory and involves combinations of gustatory and olfactory stimuli. The cross-modal mapping between these modalities needs further exploration in virtual environments and simulation, especially for liquid food. In this paper, we present a customized wearable Augmented Reality (AR) system and olfaction display to study the effect of vision and olfaction on the gustatory sense. A user experiment and extensive analysis were conducted to study the influence of each stimulus on the overall flavor, including other factors such as age, previous Virtual Reality (VR)/AR experience, and beverage consumption. The results showed that smell contributes strongly to flavor, with a smaller contribution from vision; however, the combination of these stimuli can deliver a richer experience and a higher belief rate. Beverage consumption had a significant effect on the flavor belief rate. Experience is correlated with stimulus and age is correlated with belief rate, and both indirectly affected the belief rate. © 2021, The Author(s). This work was partly supported by NPRP Grant #NPRP 11S-1219-170106 from the Qatar National Research Fund (a member of Qatar Foundation). This work would not have been possible without the effort of students Babkir Elnimah, Ali Hazi, and Ahmed Ibrahim as part of their senior project.
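As a rough illustration of the kind of factor analysis the abstract describes (correlating participant factors such as age with the flavor belief rate), here is a minimal, self-contained sketch. The data, column meanings, and the positive age effect are all invented for illustration; the paper's actual dataset and statistical methods are not reproduced here.

```python
# Hypothetical sketch: correlate a participant factor (age) with a
# per-participant flavor "belief rate". All data below is simulated.
import random

random.seed(0)

def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient, no external libraries."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Simulated participants: age in years, belief rate in [0, 1].
# The linear age effect here is an assumption, not a result from the paper.
ages = [random.randint(18, 60) for _ in range(50)]
belief = [min(1.0, 0.3 + 0.01 * a + random.uniform(-0.1, 0.1)) for a in ages]

r = pearson_r(ages, belief)
print(f"age vs. belief rate: r = {r:.2f}")
```

In a real study this would be one cell of a larger analysis (stimulus condition, prior VR/AR experience, and beverage consumption as additional factors), typically with significance tests rather than a raw coefficient.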

    Incorporating haptic and olfactory into surgical simulation

    Surgical simulation has recently become a widely used method to train surgeons on specific surgeries, because it helps reduce surgical errors. Available surgical simulations lack realism, since they incorporate only one or two senses, namely vision and haptics. This paper proposes a novel multimodal interactive surgical simulator that incorporates haptic and olfactory feedback alongside traditional visual feedback. A scent diffuser was created and developed to interact with the simulation, producing odors when errors took place. A Phantom haptic device was used to provide the sense of touch to the user. Our system has been tested and evaluated, and the results show that incorporating more senses into the simulation enhances the user's performance. This is because engaging the olfactory sense improves the trainee's retention. © 2013 IEEE.
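The coupling the abstract describes (the simulator produces an odor whenever the trainee makes an error) can be sketched as a simple event-driven trigger. The class names, event strings, and scent parameters below are invented stand-ins; a real diffuser would be driven over a hardware interface rather than an in-memory log.

```python
# Hypothetical sketch of an error-triggered scent cue: the simulator
# raises events, and the diffuser stub records an odor release per error.

class ScentDiffuser:
    """Stub actuator; a real device would receive commands over e.g.
    a serial link instead of appending to a log list."""
    def __init__(self):
        self.log = []

    def release(self, scent, duration_ms):
        # Record the release instead of driving hardware.
        self.log.append((scent, duration_ms))

class SurgicalSimulator:
    def __init__(self, diffuser):
        self.diffuser = diffuser

    def on_event(self, event):
        # Couple olfactory feedback to surgical errors, as in the paper:
        # an odor is produced only when an error takes place.
        if event == "error":
            self.diffuser.release(scent="burnt", duration_ms=500)

diffuser = ScentDiffuser()
sim = SurgicalSimulator(diffuser)
for ev in ["cut", "error", "suture", "error"]:
    sim.on_event(ev)
print(len(diffuser.log))  # → 2
```

The design point is the decoupling: the simulator only emits events, so additional feedback channels (haptic rumble, visual warnings) can subscribe to the same error stream without changing the simulation logic.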