8 research outputs found

    Quantitative measurement of tool embodiment for virtual reality input alternatives

    Funding: Ontario Ministry of Research and Innovation and Science, and the Natural Sciences and Engineering Research Council of Canada.

    Virtual reality (VR) strives to replicate the sensation of the physical environment by mimicking people's perceptions and the experience of being elsewhere. These experiences are often mediated by the objects and tools we interact with in the virtual world (e.g., a controller). Evidence from psychology suggests that when a tool is used proficiently, it becomes embodied, i.e., an extension of one's body. There is little work, however, on how to measure this phenomenon in VR, or on how different types of tools and controllers affect the experience of interaction. In this work, we leverage the cognitive psychology and philosophy literature to construct the Locus-of-Attention Index (LAI), a measure of tool embodiment. We designed and conducted a study that measures readiness-to-hand and unreadiness-to-hand for three VR interaction techniques: hands, a physical tool, and a VR controller. The study shows that the LAI can measure differences in embodiment between working and broken tools, and that using the hand directly results in more embodiment than using controllers.

    I feel it in my fingers! Sense of agency with mid-air haptics

    Recent technological advances incorporate mid-air haptic feedback, enriching the sensory experience of touchless virtual interactions. We investigated how this feedback affects the user's sense of agency: the feeling of controlling external events through one's actions. Sense of agency has attracted growing interest from human-computer interaction researchers, mainly because the user's experience of control over a system is of primary importance. Here we measured sense of agency during a virtual button-pressing task, where the button press caused a tone to occur after intervals of different durations. We explored the effect of manipulating a) mid-air haptic feedback and b) the latency of the virtual hand's movement with respect to the actual hand movement. Sense of agency was quantified with implicit and explicit measures. Results showed that haptic feedback increased implicit sense of agency for the longest action-outcome interval. Results also showed that latency decreased explicit sense of agency, but that this reduction was attenuated in the presence of haptic feedback. We discuss the implications of these findings, focusing on the idea that haptic feedback can be used to protect, or even increase, users' experiences of agency in virtual interactions.
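    Implicit sense of agency in interval-based tasks like this is commonly quantified via interval estimation (intentional binding): participants estimate the action-outcome delay, and underestimation relative to the actual delay is taken as stronger implicit agency. A minimal sketch of that computation, assuming illustrative trial data rather than values from the study:

```python
def binding_error(actual_ms, estimated_ms):
    """Interval-estimation error; more negative values (underestimation)
    are taken as a stronger implicit sense of agency."""
    return estimated_ms - actual_ms

# Illustrative trials: (actual interval, participant estimate) in ms
trials = [(250, 180), (500, 430), (1000, 820)]
errors = [binding_error(a, e) for a, e in trials]
mean_binding = sum(errors) / len(errors)  # negative: agency-consistent
```

Comparing mean binding errors across haptic-feedback and latency conditions would then reveal effects like those reported above; the function and trial values here are hypothetical.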

    User-Defined Gestures with Physical Props in Virtual Reality

    When building virtual reality (VR) environments, designers use physical props to improve immersion and realism. However, people may want to perform actions that are not supported by physical objects, for example, duplicating an object in a Computer-Aided Design (CAD) program or darkening the sky in an open-world game. In this thesis, I present an elicitation study in which I asked 21 participants to choose from 95 props to perform manipulative gestures for 20 referents (actions) typically found in CAD software or open-world games. I describe the resulting gestures as context-free grammars, capturing the actions taken by our participants, their prop choices, and how the props were used in each gesture. I present agreement scores for both gesture choices and prop choices; to accomplish the latter, I developed a generalized agreement score that compares sets of selections rather than single selections, enabling new types of elicitation studies. I found that props were selected according to their resemblance to virtual objects and the actions they afforded; that gesture and prop agreement depended on the referent, with some referents leading to similar gesture choices while others led to similar prop choices; and that a small set of carefully chosen props can support a wide variety of gestures.
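    One way to generalize a classic elicitation agreement score from single selections to sets is to replace the pairwise "identical proposal" check with a set-similarity measure such as Jaccard similarity. The exact formulation in the thesis may differ; this sketch assumes a Jaccard-based variant:

```python
from itertools import combinations

def agreement(selections):
    """Generalized agreement over set-valued selections: mean pairwise
    Jaccard similarity across all participant pairs. The classic
    agreement rate is the special case where every set is a singleton,
    so Jaccard reduces to a 0/1 identity check."""
    pairs = list(combinations(selections, 2))
    if not pairs:
        return 1.0
    return sum(len(a & b) / len(a | b) for a, b in pairs) / len(pairs)

# Illustrative prop choices by three participants for one referent
props = [{"cube", "wand"}, {"cube"}, {"sphere"}]
```

For the illustrative data, the three pairs score 0.5, 0, and 0, so agreement is low; identical sets across all participants would yield 1.0.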

    Measuring Haptic Experience: Evidence for the HX model through scale development

    Haptic technology lets users receive tactile information through the sense of touch. Increasingly, designers and researchers employ haptic feedback with the aim of improving user experience (UX). While they see the importance of including haptic feedback in everyday applications, there is a lack of standardized tools for assessing the quality of these experiences. They currently rely on qualitative methods or demos to obtain user feedback; neither approach scales to large studies or remote work. We aim to bridge this gap and complement existing approaches by developing an instrument that comprehensively measures haptic user experience. We follow a systematic scale development framework to build, evaluate, and establish a first draft of a haptic user experience scale, the Haptic eXperience Index (HXI), which has the potential to measure the effectiveness of haptic experiences. The scale is built upon the recent Haptic Experience (HX) model and contributes a novel instrument that measures the five foundational constructs for designing haptic experiences: Harmony, Expressivity, Autotelics, Immersion, and Realism. We iteratively developed a set of 20 questions through a series of studies: expert reviews (N=6), face validity (N=8), cognitive interviews (N=9), and exploratory factor analysis (N=261). Our results provide evidence for the HX model's five factors, with an enriched description of each factor and implications for how to measure HX, including a first proposed draft of the HXI. In this process, we gained an in-depth understanding of the factors considered in developing the HXI, identified applications that represent a diverse set of experiences, noted limitations, and defined future work. The HXI is a stepping stone towards a generalized evaluation tool for measuring haptic experience.
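    A 20-item instrument over five constructs is typically scored by averaging Likert responses per subscale. The sketch below shows that pattern; the item-to-construct mapping and the 1-7 response range are illustrative assumptions, not the published HXI assignment:

```python
# Hypothetical item-to-construct mapping for a 20-item, five-construct
# Likert instrument; indices refer to positions in a response list.
CONSTRUCTS = {
    "Harmony":      [0, 1, 2, 3],
    "Expressivity": [4, 5, 6, 7],
    "Autotelics":   [8, 9, 10, 11],
    "Immersion":    [12, 13, 14, 15],
    "Realism":      [16, 17, 18, 19],
}

def score(responses):
    """Mean 1-7 Likert response per construct for one participant."""
    assert len(responses) == 20
    return {name: sum(responses[i] for i in items) / len(items)
            for name, items in CONSTRUCTS.items()}

ratings = [5] * 8 + [3] * 12  # one illustrative participant
```

Such per-construct means could then feed group comparisons across haptic designs, though the actual HXI scoring procedure may differ.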

    Understanding Hand Interactions and Mid-Air Haptic Responses within Virtual Reality and Beyond.

    Hand tracking has long been seen as a futuristic interaction, firmly situated in the realm of science fiction. Recent technological advancements have brought that dream into reality, allowing real-time interaction by naturally moving and positioning one's hand. While these developments have enabled numerous research projects, businesses and devices have only recently begun to implement and integrate the technology across their sectors. Numerous devices are shifting towards fully self-contained ecosystems, where the removal of controllers could significantly reduce barriers to entry. Prior studies have focused on the effects of hand tracking or possible areas for its implementation, but rarely on direct comparisons between technologies, nor do they attempt to reproduce lost capabilities. Against this background, the work presented in this thesis aims to understand the benefits and drawbacks of hand tracking when treated as the primary interaction method within virtual reality (VR) environments. Coupled with this, the implementation and use of novel mid-air ultrasound-based haptics attempt to reintroduce feedback that would otherwise have been delivered through conventional controller interactions. Two user studies were undertaken, testing core interactions within VR that represent common instances found throughout simulations. The first study focuses on interactions with 3D VR user interfaces, centered on buttons, while the second directly compares input and haptic modalities across two fine motor skill tasks. These studies are coupled with the development and implementation of a real-time user study recording toolkit, allowing significantly richer user analysis and visual evaluation of interactions.
Results from these studies and developments make valuable contributions to the research and business knowledge of hand tracking interactions, and provide a uniquely valuable open-source toolkit for other researchers to use. This thesis covers work undertaken at Ultraleap over varying projects between 2018 and 2021.