10 research outputs found

    Immersive inclusivity at CHI: Design and creation of inclusive user interactions through immersive media

    Immersive media is becoming increasingly common in day-to-day scenarios, from extended reality systems to multimodal interfaces. Such ubiquity opens an opportunity for building more inclusive environments for users with disabilities (permanent, temporary, or situational), either by introducing immersive and multimodal elements into existing applications, or by designing and creating immersive applications with inclusivity in mind. Thus, the aim of this workshop is to create a discussion platform on the intersections between the fields of immersive media, accessibility, and human-computer interaction; outline the key current and future problems of immersive inclusive design; and define a set of methodologies for the design and evaluation of immersive systems from an inclusivity perspective.

    Embodied-driven design: a framework to configure body representation & mapping (full text)

    No full text

    Embodied-driven design: a framework to configure body representation & mapping (examination report)

    No full text

    Embodied-driven design: a framework to configure body representation & mapping (abstract)

    No full text

    Piton: Investigating the Controllability of a Wearable Telexistence Robot

    The COVID-19 pandemic impacted collaborative activities, travel, and physical contact, increasing the demand for real-time interaction with remote environments. However, existing remote communication solutions provide limited interactions and do not convey a high sense of presence within a remote environment. Therefore, we propose a snake-shaped wearable telexistence robot, called Piton, that can be remotely used for a variety of collaborative applications. To the best of our knowledge, Piton is the first snake-shaped wearable telexistence robot. We explain the implementation of Piton and its control architecture, and discuss how Piton can be deployed in a variety of contexts. We implemented three control methods for Piton: HM, using a head-mounted display (HMD); HH, using an HMD and a hand-held tracker; and FM, using an HMD and a foot-mounted tracker. We conducted a user study to investigate the applicability of the proposed control methods for telexistence, focusing on body ownership (Alpha IVBO), mental and physical load (NASA-TLX), motion sickness (VRSQ), and a questionnaire to measure user impressions. The results show that both HM and HH provided comparatively high levels of body ownership, had high perceived accuracy, and were highly favored, whereas the FM control method yielded the lowest body ownership effect and was least favored. We discuss the results and highlight the advantages and shortcomings of the control methods with respect to various potential application contexts. Based on our design and evaluation of Piton, we extracted a number of insights and future research directions to deepen our investigation and realization of wearable telexistence robots.

    FaceHaptics: Robot Arm Based Versatile Facial Haptics for Immersive Environments

    No full text
    This paper introduces FaceHaptics, a novel haptic display based on a robot arm attached to a head-mounted virtual reality display. It provides localized, multi-directional, and movable haptic cues, in the form of wind, warmth, moving and single-point touch events, and water spray, to dedicated parts of the face not covered by the head-mounted display. The easily extensible system, however, can in principle mount any type of compact haptic actuator or object. User study 1 showed that users appreciate the directional resolution of the cues and can judge wind direction well, especially when they move their head and wind direction is adjusted dynamically to compensate for head rotations. Study 2 showed that adding FaceHaptics cues to a VR walkthrough can significantly improve user experience, presence, and emotional responses.