37 research outputs found

    Sour promotes risk-taking: an investigation into the effect of taste on risk-taking behaviour in humans

    Taking risks is part of everyday life. Some people actively pursue risky activities (e.g., jumping out of a plane), while others avoid any risk (e.g., people with anxiety disorders). Paradoxically, risk-taking is a primitive behaviour that may lead to a happier life by offering a sense of excitement through self-actualization. Here, we demonstrate for the first time that sour, amongst the five basic tastes (sweet, bitter, sour, salty, and umami), promotes risk-taking. Based on a series of three experiments, we show that sour has the potential to modulate risk-taking behaviour across two countries (UK and Vietnam), and across individual differences in risk-taking personality and styles of thinking (analytic versus intuitive). Modulating risk-taking can improve everyday life for a wide range of people.

    TasteBud: bring taste back into the game

    When we are babies, we put anything and everything in our mouths, from Lego to crayons. As we grow older, we increasingly rely on our other senses to explore our surroundings and the objects in the world. When interacting with technology, we mainly rely on our senses of vision, touch, and hearing, and the sense of taste becomes reduced to the context of eating and food experiences. In this paper, we build on initial efforts to enhance gaming experiences through gustatory stimuli. We introduce TasteBud, a gustatory gaming interface that we integrated with the classic Minesweeper game. We first describe the details of the hardware and software design for the taste stimulation and then present initial findings from a user study. We discuss how taste has the potential to transform gaming experiences through systematically exploiting the experiences that individual gustatory stimuli (e.g., sweet, bitter, sour) can elicit.
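The abstract describes coupling game events to taste stimulation. A minimal sketch of such an event-to-taste mapping is shown below; the event names, the taste assignments, and the `pump` delivery callback are all illustrative assumptions, not TasteBud's actual hardware/software design.

```python
# Hypothetical mapping from Minesweeper game events to gustatory channels.
# The specific pairings are assumptions for illustration only.
TASTE_MAP = {
    "reveal_safe": "sweet",   # positive progress
    "hit_mine":    "bitter",  # failure
    "flag_mine":   "sour",    # risky commitment
}

def dispense(event, pump=print):
    """Trigger the taste channel mapped to a game event.

    `pump` stands in for the stimulus-delivery hardware; here it just
    prints the command it would send.
    """
    taste = TASTE_MAP.get(event)
    if taste is not None:
        pump(f"dispense {taste}")
    return taste

dispense("hit_mine")  # prints "dispense bitter"
```

Keeping the mapping in a plain dictionary makes it easy to swap taste assignments between study conditions without touching the delivery code.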

    Error related negativity in observing interactive tasks

    Error-related negativity (ERN) is triggered when a user either makes a mistake or the application behaves differently from their expectation. It can also appear while observing another user making a mistake. This paper investigates ERN in collaborative settings where observing another user (the executor) perform a task is typical, and then explores its applicability to HCI. We first show that ERN can be detected in signals captured by commodity EEG headsets, such as the Emotiv, when observing another person perform a typical multiple-choice reaction-time task. We then investigate anticipation effects by detecting ERN in the time interval when an executor is reaching towards an answer. We show that we can detect this signal with both a clinical EEG device and an Emotiv headset. Our results show that online single-trial detection is possible using both headsets during tasks that are typical of collaborative interactive applications. However, there is a trade-off between detection speed and the quality/price of the headsets. Based on the results, we discuss and present several HCI scenarios for the use of ERN in observation tasks and collaborative settings.
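The online single-trial detection described above rests on response-locked averaging of EEG epochs and an amplitude criterion in the post-response window. The sketch below illustrates that general idea; the sampling rate, window lengths, and threshold are assumptions for illustration and are not the paper's actual pipeline.

```python
import numpy as np

def detect_ern(eeg, events, fs=128, pre=0.2, post=0.4, threshold_uv=-3.0):
    """Average response-locked epochs and test for a negative deflection
    shortly after the (erroneous) response.

    All parameter values are illustrative, not taken from the paper.
    `eeg` is a single-channel signal; `events` are response sample indices.
    """
    pre_s, post_s = int(pre * fs), int(post * fs)
    epochs = []
    for ev in events:
        if ev - pre_s < 0 or ev + post_s > len(eeg):
            continue  # skip events too close to the recording edges
        epoch = eeg[ev - pre_s : ev + post_s]
        epoch = epoch - epoch[:pre_s].mean()  # baseline correction
        epochs.append(epoch)
    erp = np.mean(epochs, axis=0)
    # ERN window: roughly 0-100 ms after the response
    win = erp[pre_s : pre_s + int(0.1 * fs)]
    return bool(win.min() <= threshold_uv)

# Synthetic demo: inject a negative deflection at each "error" response.
fs = 128
rng = np.random.default_rng(0)
eeg = rng.normal(0.0, 1.0, fs * 10)
events = [fs * 2, fs * 4, fs * 6]
for ev in events:
    eeg[ev : ev + int(0.1 * fs)] -= 8.0  # injected "ERN"
print(detect_ern(eeg, events, fs))  # → True
```

A real pipeline would of course add band-pass filtering, artifact rejection, and a trained classifier rather than a fixed threshold; the sketch only shows the epoching-and-criterion skeleton.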

    LeviSense: a platform for the multisensory integration in levitating food and insights into its effect on flavour perception

    Eating is one of the most multisensory experiences in everyday life. All of our five senses (i.e., taste, smell, vision, hearing, and touch) are involved, even if we are not aware of it. However, while multisensory integration has been well studied in psychology, there is no single platform for systematically testing the effects of different stimuli. This lack of a platform leaves unresolved challenges for the design of taste-based immersive experiences. Here, we present LeviSense: the first system designed for multisensory integration in gustatory experiences based on levitated food. Our system enables the systematic exploration of different sensory effects on eating experiences. It also opens up new opportunities for other professionals (e.g., molecular gastronomy chefs) looking for innovative taste-delivery platforms. We describe the design process behind LeviSense and conduct two experiments to test a subset of the crossmodal combinations (i.e., taste and vision, taste and smell). Our results show how different lighting and smell conditions affect perceived taste intensity, pleasantness, and satisfaction. We discuss how LeviSense creates new technical, creative, and expressive possibilities in a series of emerging design spaces within Human-Food Interaction.

    Multisensory experiences in HCI

    The use of vision and audition for interaction has dominated the field of human-computer interaction (HCI) for decades, despite the fact that nature has provided us with many more senses for perceiving and interacting with the world around us. Recently, HCI researchers have started trying to capitalize on touch, taste, and smell when designing interactive tasks, especially in gaming, multimedia, and art environments. Here we provide a snapshot of our research into touch, taste, and smell, which we’re carrying out at the Sussex Computer Human Interaction (SCHI—pronounced “sky”) Lab at the University of Sussex in Brighton, UK.

    Agency in mid-air interfaces

    Touchless interfaces allow users to view, control, and manipulate digital content without physically touching an interface. They are being explored in a wide range of application scenarios, from medical surgery to car dashboard controllers. One aspect of touchless interaction that has not been explored to date is the Sense of Agency (SoA). The SoA refers to the subjective experience of voluntary control over actions in the external world. In this paper, we investigated the SoA in touchless systems using the intentional binding paradigm. We first compared touchless systems with physical interactions, and then augmented the interaction with different types of haptic feedback to explore how different outcome modalities influence users’ SoA. From our experiments, we demonstrated that an intentional binding effect is observed in both physical and touchless interactions, with no statistical difference between them. Additionally, we found that haptic and auditory feedback help to increase SoA compared with visual feedback in touchless interfaces. We discuss these findings and identify design opportunities that take agency into consideration.
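Intentional binding, as used above, quantifies SoA through the perceived compression of the interval between a voluntary action and its outcome. The sketch below shows the interval-estimation variant of that scoring; the function name, the scoring convention (larger score = stronger binding), and all numbers are illustrative assumptions, not data or methods from the paper.

```python
def binding_effect(actual_ms, judged_ms):
    """Interval-estimation variant of intentional binding.

    A judged action-outcome interval shorter than the actual one
    indicates binding; the score is the compression in milliseconds.
    The convention (positive = binding) is an assumption for this sketch.
    """
    return actual_ms - judged_ms

def mean(xs):
    return sum(xs) / len(xs)

# Hypothetical per-participant interval judgements (actual interval 400 ms);
# these values are invented for illustration, not results from the study.
touchless = [binding_effect(400, j) for j in (310, 330, 295)]
physical  = [binding_effect(400, j) for j in (305, 325, 300)]

print(round(mean(touchless)), round(mean(physical)))  # → 88 90
```

Comparable compression scores across the two conditions would correspond to the paper's finding of no statistical difference in binding between physical and touchless interaction.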