
    Paralinguistic vocal control of interactive media: how untapped elements of voice might enhance the role of non-speech voice input in the user's experience of multimedia.

    Much interactive media development, especially commercial development, implies the dominance of the visual modality, with sound as a limited supporting channel. The development of multimedia technologies such as augmented reality and virtual reality has further revealed a distinct partiality to visual media. Sound, however, and particularly voice, has many aspects that have yet to be adequately investigated. Exploring these aspects may show that sound can, in some respects, be superior to graphics in creating immersive and expressive interactive experiences. With this in mind, this thesis investigates the use of non-speech voice characteristics as a complementary input mechanism for controlling multimedia applications. It presents a number of projects that employ the paralinguistic elements of voice as input to interactive media, including both screen-based and physical systems. These projects serve as a means of exploring the factors that seem likely to affect users’ preferences and interaction patterns during non-speech voice control. This exploration forms the basis for an examination of potential roles for paralinguistic voice input. The research includes the conceptual and practical development of the projects and a set of evaluative studies. The work submitted for the Ph.D. comprises practical projects (50 percent) and a written dissertation (50 percent). The thesis aims to advance understanding of how voice can be used, both on its own and in combination with other input mechanisms, in controlling multimedia applications. It offers a step forward in attempts to integrate the paralinguistic components of voice as a complementary input mode to speech input applications, creating a synergistic combination that might let the strengths of each mode overcome the weaknesses of the other.
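The abstract above does not specify a signal chain, but the general idea of paralinguistic control can be illustrated with a short sketch: extract loudness and a crude pitch estimate from one audio frame and map loudness to a continuous on-screen parameter. Everything here (function names, the autocorrelation pitch method, the 60–400 Hz voice range, the mapping constants) is an illustrative assumption, not the thesis's actual implementation.

```python
import numpy as np

def paralinguistic_features(frame, sample_rate=16000):
    """Extract simple paralinguistic features from one mono audio frame.

    Returns (loudness, pitch_hz): RMS loudness, and a crude
    autocorrelation-based pitch estimate in Hz.
    """
    frame = np.asarray(frame, dtype=np.float64)
    # Loudness: root-mean-square energy of the frame.
    rms = np.sqrt(np.mean(frame ** 2))
    # Pitch: strongest autocorrelation lag within a plausible voice range.
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lo = sample_rate // 400   # ignore lags corresponding to > 400 Hz
    hi = sample_rate // 60    # ...and to < 60 Hz
    lag = lo + np.argmax(ac[lo:hi])
    return rms, sample_rate / lag

def loudness_to_height(rms, max_rms=0.5, screen_h=480):
    """Map loudness onto a screen-space control value, e.g. cursor height."""
    return int(min(rms / max_rms, 1.0) * screen_h)
```

In a real system the frame would come from a live microphone buffer and the mapping would be smoothed over time; the sketch only shows how non-speech vocal qualities can become continuous control signals.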

    Privacy-Preserving Gaze Data Streaming in Immersive Interactive Virtual Reality: Robustness and User Experience

    Eye tracking is routinely being incorporated into virtual reality (VR) systems. Prior research has shown that eye tracking data, if exposed, can be used for re-identification attacks [14]. The state of our knowledge about currently existing privacy mechanisms is limited to privacy-utility trade-off curves based on data-centric metrics of utility, such as prediction error, and black-box threat models. We propose that for interactive VR applications, it is essential to consider user-centric notions of utility and a variety of threat models. We develop a methodology to evaluate real-time privacy mechanisms for interactive VR applications that incorporates subjective user experience and task performance metrics. We evaluate selected privacy mechanisms using this methodology and find that re-identification accuracy can be decreased to as low as 14% while maintaining a high usability score and reasonable task performance. Finally, we elucidate three threat scenarios (black-box, black-box with exemplars, and white-box) and assess how well the different privacy mechanisms hold up to these adversarial scenarios. This work advances the state of the art in VR privacy by providing a methodology for end-to-end assessment of the risk of re-identification attacks and potential mitigating solutions. (To appear in IEEE Transactions on Visualization and Computer Graphics.)
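The abstract does not detail which real-time privacy mechanisms were evaluated, but a common approach in this literature is perturbing the streamed gaze signal with noise before it leaves the headset. The sketch below is illustrative only: the function name, the additive Gaussian noise model, and the 2-degree default are assumptions, not the paper's method.

```python
import numpy as np

def noisy_gaze(direction, sigma_deg=2.0, rng=None):
    """Perturb a 3D gaze direction with approximately Gaussian angular noise.

    A simple streaming privacy mechanism: add zero-mean Gaussian noise to the
    unit gaze vector, with per-component standard deviation set so sigma_deg
    roughly controls the resulting angular error, then renormalize.
    """
    rng = np.random.default_rng() if rng is None else rng
    d = np.asarray(direction, dtype=np.float64)
    d = d / np.linalg.norm(d)
    sigma = np.deg2rad(sigma_deg)
    noisy = d + rng.normal(0.0, sigma, size=3)
    return noisy / np.linalg.norm(noisy)
```

Raising sigma_deg trades task utility (e.g. gaze-based selection accuracy) for privacy, which is exactly the privacy-utility trade-off the paper argues should also be measured with user-centric metrics.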


    Defining Haptic Experience: Foundations for Understanding, Communicating, and Evaluating HX

    Haptic technology is maturing, with expectations and evidence that it will contribute to user experience (UX). However, we have very little understanding of how haptic technology can influence people’s experience. Researchers and designers need a way to understand, communicate, and evaluate haptic technology’s effect on UX. From a literature review and two studies – one with haptics novices, the other with expert hapticians – we developed a theoretical model of the factors that constitute a good haptic experience (HX). We define HX and propose its constituent factors: design parameters of Timeliness, Density, Intensity, and Timbre; the cross-cutting concern of Personalization; usability requirements of Utility, Causality, Consistency, and Saliency; and experiential factors of Harmony, Expressivity, Autotelics, Immersion, and Realism as guiding constructs important for haptic experience. This model will help guide design and research of haptic systems, inform language around haptics, and provide the basis for evaluative instruments, such as checklists, heuristics, or questionnaires. We acknowledge the support of the Natural Sciences and Engineering Research Council of Canada (NSERC) [funding reference number 2019-06589].

    Chapter 13 Haptic Creatures

    Collaborations between entertainment industries and artificial intelligence researchers in Japan have, since the mid-1990s, produced a growing interest in modeling affect and emotion for use in mass-produced social robots. Robot producers and marketers reason that such robot companions can provide comfort, healing (iyashi), and intimacy in light of attenuating social bonds and increased socioeconomic stress characteristic of Japanese society since the collapse of the country’s bubble economy in the early 1990s. While many of these robots with so-called “artificial emotional intelligence” are equipped with rudimentary capacities to “read” predefined human emotion through such mechanisms as facial expression recognition, a new category of companion robots is more experimental. These robots do not interpret human emotion through affect-sensing software but rather invite human-robot interaction through affectively pleasing forms of haptic feedback. These new robots are called haptic creatures: robot companions designed to deliver a sense of comforting presence through a combination of animated movements and healing touch. Integrating historical analysis with ethnographic interviews with new users of these robots, and focusing in particular on the cat-like cushion robot Qoobo, this chapter argues that while companion robots are designed in part to understand specific human emotions, haptic creatures are created as experimental devices that can generate new and unexpected pleasures of affective care unique to human-robot relationships. It suggests that this distinction is critical for understanding and evaluating how corporations seek to use human-robot affect as a means to deliver care to consumers while also researching and building new markets for profit maximization.

    Exploring the Design Space of Extra-Linguistic Expression for Robots

    In this paper, we explore the new design space of extra-linguistic cues inspired by graphical tropes used in graphic novels and animation to enhance the expressiveness of social robots. To achieve this, we identified a set of cues that can be used to generate expressions, including smoke/steam/fog, water droplets, and bubbles. We prototyped devices that can generate these fluid expressions for a robot and conducted design sessions in which eight designers explored the use and utility of the cues in conveying the robot's internal states across various design scenarios. Our analysis of the 22 designs, the associated design justifications, and the interviews with the designers revealed patterns in how each cue was used, how the cues were combined with nonverbal behaviors, and where the participants drew their inspiration from. These findings informed the design of an integrated module called EmoPack, which can be used to augment the expressive capabilities of any robot platform.

    Trufax about discussion group netspeak: an historical analysis of semantic change in the English slang of newsgroups and web forums

    This dissertation offers an examination of the ways in which English writers in Internet discussion groups—that is, newsgroups and web forums—draw upon techniques of semantic change to create slang words. The dissertation hypothesizes that the techniques involved will be similar to those found underpinning semantic change in other varieties of the English language, both the standard form and other, offline varieties of modern slang. However, the precise constraints of the online platform—its basis in text rather than speech and gesture—will also lead to some differences in how the semantic change is carried out. The hypothesis is put to the test using diachronic analyses of 67 slang words, represented in data drawn from a variety of Internet discussion groups dating from 1980 to the present day. In analysing these words, the dissertation also examines whether the techniques documented are explainable within established theoretical frameworks of semantic change, particularly whether they conform to the tendencies in the directionality of semantic change identified by Elizabeth Close Traugott as part of her Invited Inferencing Theory of Semantic Change.

    Investigating User Experiences Through Animation-based Sketching
