
    Master of Science

    This thesis discusses the development of an olfactory display for the University of Utah TreadPort Virtual Environment (UUTVE). The goal of the UUTVE is to create a virtual environment that is as lifelike as possible by communicating to the user as many of the sensations felt when moving around in the real world as possible, while staying within the confines of the virtual environment's workspace. The UUTVE has a visual display, an auditory display, a locomotion interface, and a wind display. With the wind display, it is possible to create an effective olfactory display that does not have some of the limitations associated with many current olfactory displays. The inclusion of olfactory information in virtual environments is becoming increasingly common, as including an olfactory display has been shown to increase user presence. The development of the olfactory display for the UUTVE includes the following components: the physical apparatus for injecting scent particles into the air stream, the development of a Computational Fluid Dynamics (CFD) model with which to control the concentration of scent sensed by the user, and user studies to verify the model and to show, as a proof of concept, that the wind tunnel can be used to create an olfactory display. The physical apparatus of the display consists of air-atomizing nozzles, solenoids for controlling when the scents are released, containers for holding the scents, and a pressurized air tank that provides the air required to drive the nozzles. CFD is used to model the wind flow through the TreadPort Active Wind Tunnel (TPAWT). The model of the wind flow is used to simulate how particles advect in the wind tunnel. These particle dispersion simulations are then used to create a piecewise model that predicts the scent's concentration behavior as the odor flows through the wind tunnel. The user studies show that the scent delivery system is able to display an odor to a person standing in the TPAWT. The studies also provide a way to measure the time it takes for a person to recognize an odor after it has been released into the air stream, and the time it takes for a user to recognize that the odor is no longer present.
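    As a rough illustration of the kind of piecewise concentration model the thesis describes (the abstract gives no equations or parameter values, so the advection delay, rise time, and decay rate below are purely hypothetical placeholders):

    ```python
    import math

    # Hypothetical piecewise model of scent concentration at the user's position
    # in a wind tunnel: zero before the plume arrives, a linear rise while scent
    # is being injected, then an exponential wash-out once injection stops.
    # All parameter values are illustrative, not taken from the thesis.
    def scent_concentration(t, release_duration, advection_delay=2.0,
                            rise_time=1.5, peak=1.0, decay_rate=0.8):
        """Estimated normalized concentration t seconds after the solenoid opens."""
        arrival = advection_delay                 # plume reaches the user
        cutoff = arrival + release_duration       # plume stops arriving
        if t < arrival:
            return 0.0
        if t < cutoff:
            # linear rise toward the plateau, clipped at the peak value
            return min(peak, peak * (t - arrival) / rise_time)
        # level reached when injection stopped, then exponential decay
        level = min(peak, peak * (cutoff - arrival) / rise_time)
        return level * math.exp(-decay_rate * (t - cutoff))

    # Example: estimated concentration 4 s after the start of a 3 s release
    print(scent_concentration(4.0, release_duration=3.0))
    ```

    A model of this shape could be read off to estimate when a user should first perceive an odor after release and when it should fade, which is what the user studies measure.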

    The sweet smell of success: Enhancing multimedia applications with olfaction

    This is the post-print version of the article (copyright © 2012 ACM). Olfaction, or smell, is one of the last challenges which multimedia applications have to conquer. As far as computerized smell is concerned, there are several difficulties to overcome, particularly those associated with the ambient nature of smell. In this article, we present results from an empirical study exploring users' perception of olfaction-enhanced multimedia displays. Findings show that olfaction significantly adds to the user's multimedia experience. Moreover, the use of olfaction leads to an increased sense of reality and relevance. Our results also show that users are tolerant of the interference and distortion effects caused by olfactory effects in multimedia.

    The influence of olfaction on the perception of high-fidelity computer graphics

    The computer graphics industry is constantly demanding more realistic images and animations. However, producing such high-quality scenes can take a long time, even days, if rendering on a single PC. One of the approaches that can be used to speed up rendering times is Visual Perception, which exploits the limitations of the Human Visual System, since the viewers of the results will be humans. Although there is an increasing body of research into how haptics and sound may affect a viewer's perception in a virtual environment, the influence of smell has been largely ignored. The aim of this thesis is to address this gap and make smell an integral part of multi-modal virtual environments. In this work, we have performed four major experiments, with a total of 840 participants. In the experiments we used still images and animations, related and unrelated smells, and finally a multi-modal environment with smell, sound, and temperature. Besides this, we also investigated how long it takes for an average person to adapt to a smell and what effect there may be when performing a task in the presence of a smell. The results of this thesis clearly show that a smell present in the environment firstly affects the perception of object quality within a rendered image and, secondly, enables parts of the scene or the whole animation to be selectively rendered in high quality while the rest can be rendered in a lower quality without the viewer noticing the drop in quality. Such selective rendering in the presence of smell results in significant computational performance gains without any loss in the quality of the image or animation perceived by the viewer.
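    A minimal sketch of the smell-driven selective rendering idea described above (the sampling rates, the per-tile saliency test, and the function names are assumptions for illustration, not the renderer used in the thesis):

    ```python
    # Illustrative sketch only: when a related smell is present, viewer attention
    # is assumed to be drawn to smell-related regions, so other tiles can be
    # rendered at reduced quality without a perceptible drop. Sampling rates and
    # the saliency flag are hypothetical, not values from the thesis.
    HIGH_SPP = 256   # samples per pixel for full-quality tiles
    LOW_SPP = 16     # samples per pixel for tiles unlikely to be scrutinized

    def samples_for_tile(tile_is_salient: bool, smell_present: bool) -> int:
        """Choose a per-tile sampling rate for selective rendering."""
        if not smell_present:
            return HIGH_SPP                   # no smell: render everything fully
        return HIGH_SPP if tile_is_salient else LOW_SPP

    # Example: a background tile rendered while a related smell is released
    print(samples_for_tile(tile_is_salient=False, smell_present=True))  # -> 16
    ```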

    Digitizing the chemical senses: possibilities & pitfalls

    Many people are understandably excited by the suggestion that the chemical senses can be digitized, be it to deliver ambient fragrances (e.g., in virtual reality or health-related applications) or else to transmit flavour experiences via the internet. However, to date, progress in this area has been surprisingly slow. Furthermore, the majority of the attempts at successful commercialization have failed, often in the face of consumer ambivalence over the perceived benefits/utility. In this review, with the focus squarely on the domain of Human-Computer Interaction (HCI), we summarize the state-of-the-art in the area. We highlight the key possibilities and pitfalls as far as stimulating the so-called ‘lower’ senses of taste, smell, and the trigeminal system is concerned. Ultimately, we suggest that mixed reality solutions are currently the most plausible as far as delivering (or rather modulating) flavour experiences digitally is concerned. The key problems with digital fragrance delivery are related to attention and attribution. People often fail to detect fragrances when they are concentrating on something else; and even when they detect that their chemical senses have been stimulated, there is always a danger that they attribute their experience (e.g., pleasure) to one of the other senses – this is what we call ‘the fundamental attribution error’. We conclude with an outlook on digitizing the chemical senses and summarize a set of open-ended questions that the HCI community has to address in future explorations of smell and taste as interaction modalities.

    Information Olfactation: Theory, Design, and Evaluation

    Olfactory feedback for analytical tasks is a virtually unexplored area in spite of the advantages it offers for information recall, feature identification, and location detection. Here we introduce the concept of ‘Information Olfactation’ as the fragrant sibling of information visualization, and discuss how scent can be used to convey data. Building on a review of the human olfactory system and mirroring common visualization practice, we propose olfactory marks, the substrate in which they exist, and the olfactory channels that are available to designers. To exemplify this idea, we present ‘viScent(1.0)’: a six-scent stereo olfactory display capable of conveying olfactory glyphs of varying temperature and direction, as well as a corresponding software system that integrates the display with a traditional visualization display. We also conduct a comprehensive perceptual experiment on Information Olfactation: the use of olfactory marks and channels to convey data. More specifically, following the example of graphical perception studies, we design an experiment that studies the perceptual accuracy of four ‘olfactory channels’ (scent type, scent intensity, airflow, and temperature) for conveying three different types of data (nominal, ordinal, and quantitative). We also present details of an advanced 24-scent olfactory display, ‘viScent(2.0)’, and the software framework that we designed in order to run this experiment. Our results yield a ranking of olfactory channels for each data type that follows similar principles as the rankings for visual channels, such as those derived by Mackinlay, Cleveland & McGill, and Bertin.
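    To make the mark/channel analogy concrete, a hypothetical encoding step might map a single data value onto the four channels studied in the paper; the class, value ranges, and channel choice below are illustrative assumptions, not viScent's actual control interface:

    ```python
    from dataclasses import dataclass

    # Hypothetical olfactory "mark": one data point encoded on the four channels
    # studied in the paper. Field ranges are illustrative, not viScent's real API.
    @dataclass
    class OlfactoryMark:
        scent: str          # nominal channel, e.g. "lemon", "peppermint"
        intensity: float    # 0.0-1.0, normalized scent intensity
        airflow: float      # 0.0-1.0, normalized fan speed
        temperature: float  # air-stream temperature in degrees Celsius

    def encode_quantitative(value: float, vmin: float, vmax: float,
                            scent: str = "lemon") -> OlfactoryMark:
        """Encode a quantitative value on the intensity channel (one plausible
        choice; the paper derives its own empirical ranking of channels),
        keeping the remaining channels fixed."""
        norm = (value - vmin) / (vmax - vmin)
        return OlfactoryMark(scent=scent, intensity=norm,
                             airflow=0.5, temperature=25.0)

    # Example: encode the value 42 on a 0-100 scale
    print(encode_quantitative(42.0, vmin=0.0, vmax=100.0))
    ```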

    Symbolic olfactory display

    Thesis (S.M.), Massachusetts Institute of Technology, School of Architecture and Planning, Program in Media Arts and Sciences, 2001. Includes bibliographical references (p. 123-143). This electronic version was submitted by the student author; the certified thesis is available in the Institute Archives and Special Collections. This thesis explores the problems and possibilities of computer-controlled scent output. I begin with a thorough literature review of how we smell and how scents are categorized. I look at applications of aroma through the ages, with particular emphasis on the role of scent in information display in a variety of media. I then present and discuss several projects I have built to explore the use of computer-controlled olfactory display, and some pilot studies of issues related to such display. I quantify human physical limitations on olfactory input, and conclude that olfactory display must rely on differences between smells, and not differences in intensity of the same smell. I propose a theoretical framework for scent in human-computer interactions, and develop the concepts of olfactory icons and 'smicons'. I further conclude that scent is better suited for displaying slowly changing, continuous information than discrete events. I conclude with my predictions for the prospects of symbolic, computer-controlled, olfactory display. By Joseph Nathaniel Kaye (S.M.).
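    As a toy illustration of the 'smicon' idea and the conclusion that olfactory display should rely on differences between smells rather than intensity, a state-to-scent mapping might look like the sketch below (the state names and scent assignments are invented, not from the thesis):

    ```python
    # Invented example of a smicon mapping: slowly changing system states are
    # signalled by *different* scents rather than by varying the intensity of a
    # single scent, in line with the thesis's conclusion. Names are illustrative.
    STATE_TO_SCENT = {
        "all_clear": "citrus",
        "elevated_load": "mint",
        "critical": "clove",
    }

    def smicon_for_state(state: str) -> str:
        """Return the scent to emit for the current (slowly changing) state;
        unknown states emit nothing rather than guessing."""
        return STATE_TO_SCENT.get(state, "none")

    # Example: the ambient scent switches as the monitored state changes
    print(smicon_for_state("elevated_load"))  # -> mint
    ```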

    LeviSense: A platform for the multisensory integration in levitating food and insights into its effect on flavour perception

    Eating is one of the most multisensory experiences in everyday life. All five of our senses (i.e., taste, smell, vision, hearing, and touch) are involved, even if we are not aware of it. However, while multisensory integration has been well studied in psychology, there is no single platform for systematically testing the effects of different stimuli. This lack of a platform results in unresolved design challenges for taste-based immersive experiences. Here, we present LeviSense: the first system designed for multisensory integration in gustatory experiences based on levitated food. Our system enables the systematic exploration of different sensory effects on eating experiences. It also opens up new opportunities for other professionals (e.g., molecular gastronomy chefs) looking for innovative taste-delivery platforms. We describe the design process behind LeviSense and conduct two experiments to test a subset of the crossmodal combinations (i.e., taste and vision, taste and smell). Our results show how different lighting and smell conditions affect the perceived taste intensity, pleasantness, and satisfaction. We discuss how LeviSense creates new technical, creative, and expressive possibilities in a series of emerging design spaces within Human-Food Interaction.