Examining the sense of agency in human-computer interaction
Humans are agents: we feel that we control the course of events in our everyday life. This experience is referred to as the Sense of Agency (SoA). It is crucial not only in our daily life, but also in our interaction with technology. When we manipulate a user interface (e.g., a computer or smartphone), we expect the system to respond to our input commands with feedback, as we desire to feel in charge of the interaction. If this interplay elicits a SoA, the user will perceive an instinctive feeling of “I am controlling this”. Although research in Human-Computer Interaction (HCI) pursues the design of intuitive and responsive systems, most current studies have focused mainly on interaction techniques (e.g., software-hardware) and User Experience (UX) (e.g., comfort, usability), and very little has been investigated in terms of the SoA, i.e., the conscious experience of being in control of the interaction. In this thesis, we present an experimental exploration of the role of the SoA in interaction paradigms typical of HCI. After two chapters of introduction and related work, we describe a series of studies that explore the implications of agency in interaction with systems through human senses such as vision, audition, touch, and smell. Chapter 3 explores the SoA in mid-air haptic interaction through touchless actions. Chapter 4 then examines agency modulation through smell and its application to olfactory interfaces. Chapter 5 describes two novel timing techniques, based on auditory and haptic cues, that provide alternatives to the traditional Libet clock. Finally, we conclude with a discussion chapter that highlights the importance of the SoA during interactions with technology, as well as the implications of our results for the design of user interfaces.
I Smell Trouble: Using Multiple Scents To Convey Driving-Relevant Information
Cars provide drivers with task-related information (e.g., "Fill gas") mainly through visual and auditory stimuli. However, those stimuli may distract or overwhelm the driver, causing unnecessary stress. Here, we propose olfactory stimulation as a novel feedback modality to support the perception of visual notifications, reducing the visual demand on the driver. Building on previous research, we explore the use of the scents of lavender, peppermint, and lemon to convey three driving-relevant messages ("Slow down", "Short inter-vehicle distance", and "Lane departure"). Our paper is the first to demonstrate the application of olfactory conditioning in the context of driving and to explore how multiple olfactory notifications change driving behaviour. Our findings demonstrate that olfactory notifications are perceived as less distracting, more comfortable, and more helpful than visual notifications. Drivers also make fewer driving mistakes when exposed to olfactory notifications. We discuss how these findings inform the design of future in-car user interfaces.
OSpace: towards a systematic exploration of olfactory interaction spaces
When designing olfactory interfaces, HCI researchers and practitioners have to carefully consider a number of issues related to scent delivery, detection, and lingering. These are just a few of the problems to deal with. We present OSpace, an approach for designing, building, and exploring an olfactory interaction space. Our paper is the first to explore in detail not only the scent-delivery parameters but also air-extraction issues. We conducted a user study to demonstrate how scent detection/lingering times can be acquired under different air-extraction conditions, and how the impact of scent type, dilution, and intensity can be investigated. Results show that with our setup, scents can be perceived by the user within ten seconds, and it takes less than nine seconds for them to disappear, whether the extraction is on or off. We discuss the practical application of these results for HCI.
Multimodality in VR: A survey
Virtual reality (VR) is rapidly growing, with the potential to change the way we create and consume content. In VR, users integrate the multimodal sensory information they receive to create a unified perception of the virtual world. In this survey, we review the body of work addressing multimodality in VR, and its role and benefits in user experience, together with different applications that leverage multimodality in many disciplines. These works encompass several fields of research and demonstrate that multimodality plays a fundamental role in VR: enhancing the experience, improving overall performance, and yielding unprecedented abilities in skill and knowledge transfer.
The influence of scent on virtual reality experiences: The role of aroma-content congruence
We live in a multisensory world. Our experiences are constructed by the stimulation of all our senses. Nevertheless, digital interactions are based mainly on audiovisual elements, while other sensory stimuli have been less explored. Virtual reality (VR) is a sensory-enabling technology that facilitates the integration of sensory inputs to enhance multisensory digital experiences. This study analyzes how the addition of ambient scent to a VR experience affects digital pre-experiences in a service context (tourism). Results from a laboratory experiment confirmed that embodied VR devices, together with pleasant and congruent ambient scents, enhance sensory stimulation, which directly (and indirectly, through ease of imagination) influences affective and behavioral reactions. These enriched multisensory experiences strengthen the link between the affective and conative images of destinations. We make recommendations for researchers and service providers who aim to deliver ambient scents, especially those congruent with the displayed content, to enhance the sensorialization of digital VR experiences.
Serious Games for Stroke Telerehabilitation of Upper Limb - A Review for Future Research
Maintaining appropriate home rehabilitation programs after stroke, with proper adherence and remote monitoring, is a challenging task. Virtual reality (VR)-based serious games could be a strategy used in telerehabilitation (TR) to engage patients in an enjoyable and therapeutic approach. The aim of this review was to analyze the background and quality of clinical research on this matter to guide future research. The review was based on research material obtained from PubMed and Cochrane up to April 2020 using the PRISMA approach. The use of VR serious games has shown evidence of efficacy for upper-limb TR after stroke, but the strength of the evidence is still low due to the limited number of randomized controlled trials (RCTs), the small number of participants involved, and heterogeneous samples. Although this is a promising strategy to complement conventional rehabilitation, further investigation is needed to strengthen the evidence of effectiveness and support the dissemination of the developed solutions.
Beyond the Screen: Reshaping the Workplace with Virtual and Augmented Reality
Although extended reality technologies have enjoyed an explosion in popularity in recent years, few applications are effectively used outside entertainment or academic contexts. This work consists of a literature review regarding the effective integration of such technologies in the workplace. It aims to provide an updated view of how they are being used in that context. First, we examine existing research concerning virtual, augmented, and mixed-reality applications. We also analyze which have made their way into the workflows of companies and institutions. Furthermore, we circumscribe the aspects of extended reality technologies that determined this applicability.
Natural Walking in Virtual Reality: A Review
Recent technological developments have finally brought virtual reality (VR) out of the laboratory and into the hands of developers and consumers. However, a number of challenges remain. Virtual travel is one of the most common and universal tasks performed inside virtual environments, yet enabling users to navigate virtual environments is not a trivial challenge—especially if the user is walking. In this article, we initially provide an overview of the numerous virtual travel techniques that have been proposed prior to the commercialization of VR. Then we turn to the mode of travel that is the most difficult to facilitate, that is, walking. The challenge of providing users with natural walking experiences in VR can be divided into two separate, albeit related, challenges: (1) enabling unconstrained walking in virtual worlds that are larger than the tracked physical space and (2) providing users with appropriate multisensory stimuli in response to their interaction with the virtual environment. In regard to the first challenge, we present walking techniques falling into three general categories: repositioning systems, locomotion based on proxy gestures, and redirected walking. With respect to multimodal stimuli, we focus on how to provide three types of information: external sensory information (visual, auditory, and cutaneous), internal sensory information (vestibular and kinesthetic/proprioceptive), and efferent information. Finally, we discuss how the different categories of walking techniques compare and discuss the challenges still facing the research community.