14 research outputs found

    Personal digital signage for shared spaces

    No full text
    Wayfinding is an everyday activity whose interaction design has traditionally been based on landmarks, visual maps, signs, and social collaboration. In the mobile computing era, we have witnessed a more techno-centric development of wayfinding and navigation, where people turn to their mobile navigation applications rather than to cues in the surrounding environment. However, in many wayfinding situations, using mobile devices is not very applicable due to safety reasons, indoor limitations, or practical needs. To overcome the identified challenges, this paper introduces personal digital signage, which combines the benefits of traditional directional signs with underlying mobile technology for wayfinding purposes. The paper begins by formulating the design problem and introducing the premises of the solution. We evaluate and refine the solution with usability studies at a mass event (N=24) and in a time-pressured situation in a campus building (N=58). Test results show that the proposed solution was highly acceptable and rated good in usability among participants. Effectiveness, measured as reaching the target destination, was excellent, and the time taken increased only moderately compared with optimal performance. We conclude that the solution performs well in indoor spaces, where navigational accuracy depends on the number and positioning of installed screens, as is the case with traditional signs. The study calls for rethinking the interaction design of navigation and wayfinding without the use of mobile devices.
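    A minimal illustrative sketch of the sign-selection idea the abstract describes: each installed screen could hold a routing table mapping destinations to the directional arrow it should show when a user heading there is nearby. All identifiers and table contents below are hypothetical, not taken from the paper.

```typescript
// Hypothetical routing table: per screen, which arrow to show for a destination.
type Direction = "left" | "right" | "straight" | "up" | "down";

const routing: Record<string, Record<string, Direction>> = {
  "lobby-screen": { "lecture-hall-A": "left", "cafeteria": "straight" },
  "corridor-screen-2": { "lecture-hall-A": "right", "cafeteria": "left" },
};

// Called when the underlying mobile technology reports that a user with a
// known destination is near a given screen.
function arrowFor(screenId: string, destinationId: string): Direction | null {
  return routing[screenId]?.[destinationId] ?? null;
}

console.log(arrowFor("lobby-screen", "lecture-hall-A")); // "left"
```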

    Evaluating the Combination of Visual Communication Cues for HMD-based Mixed Reality Remote Collaboration

    Full text link
    © 2019 Copyright held by the owner/author(s). Many researchers have studied various visual communication cues (e.g. pointer, sketching, and hand gesture) in Mixed Reality remote collaboration systems for real-world tasks. However, the effect of combining them has not been well explored. We studied the effect of these cues in four combinations: hand only, hand + pointer, hand + sketch, and hand + pointer + sketch, with three problem tasks: Lego, Tangram, and Origami. The study results showed that the participants completed the task significantly faster and felt a significantly higher level of usability when the sketch cue was added to the hand gesture cue, but not when the pointer cue was added. Participants also preferred the combinations that included the hand and sketch cues over the other combinations. However, using additional cues (pointer or sketch) increased the perceived mental effort and did not improve the feeling of co-presence. We discuss the implications of these results and future research directions.

    Exploration of Time-Oriented Data in Immersive Virtual Reality Using a 3D Radar Chart Approach

    No full text
    In this paper, we present an approach to interacting with time-oriented data in Virtual Reality within the context of Immersive Analytics. We implemented a Virtual Reality application that enables its user to explore data in an immersive environment (head-mounted display, 3D gestural input), utilizing potential advantages of immersive technologies, for instance, depth cues for better spatial understanding, natural interaction, and user engagement. The visualization design is inspired by the overall concept of a radar chart, using the third dimension to represent time-series data. We conducted a user study with 15 participants, encouraging them to examine a representative dataset within an explorative analysis scenario with no time constraints. Based on the results of usability and user engagement scores, task completion analysis, observations, and interviews, we were able to empirically validate the approach in general and gain insights into the users' interaction and data analysis strategies.
    DISA-DH; Open Data Exploration in Virtual Reality (ODxVR)
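    A minimal geometric sketch of the "radar chart extruded along time" idea: each variable gets a fixed angle around the chart centre, the value sets the radius, and the time index sets the height of the ring. The function names, the normalization to [0, 1], and the spacing constant are assumptions for illustration, not taken from the paper.

```typescript
interface Point3D { x: number; y: number; z: number; }

function radarPoint(
  variableIndex: number,   // which axis of the radar chart (0..numVariables-1)
  numVariables: number,
  value: number,           // assumed already normalized to [0, 1]
  timeIndex: number,       // position in the time series
  timeSpacing = 0.1        // vertical distance between consecutive time steps
): Point3D {
  const angle = (2 * Math.PI * variableIndex) / numVariables;
  const radius = value;              // radius encodes the data value
  return {
    x: radius * Math.cos(angle),
    z: radius * Math.sin(angle),
    y: timeIndex * timeSpacing,      // third dimension encodes time
  };
}

// One time step of a 4-variable observation becomes a ring of 4 points.
const ring = [0.2, 0.8, 0.5, 0.9].map((v, i) => radarPoint(i, 4, v, 3));
console.log(ring);
```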

    "Oh, that's where you are!" : Towards a Hybrid Asymmetric Collaborative Immersive Analytics System

    No full text
    We present a hybrid Immersive Analytics system to support asymmetric collaboration between a pair of users during synchronous data exploration. The system consists of an immersive Virtual Reality application, a non-immersive web application, and a real-time communication interface connecting both applications, providing features that facilitate the collaborators' mutual understanding and their ability to make (spatial) references. We conducted a real-world case study with pairs of language students, encouraging them to use the developed system to investigate a large multivariate Twitter dataset from a sociolinguistic perspective within an explorative analysis scenario. Based on the results of usability scores, log file analyses, observations, and interviews, we were able to validate the approach in general and gain insights into the users' collaboration with respect to awareness, deixis, and group dynamics.
    DISA-DH; Open Data Exploration in Virtual Reality (ODxVR)
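    A sketch of a real-time relay in the spirit of the "communication interface" the abstract mentions: the immersive VR client and the non-immersive web client both connect to this server, and any state update sent by one (for example a selection or a spatial reference) is forwarded to the other. This assumes the Node "ws" package; the message shape and all names are hypothetical, not from the paper.

```typescript
import { WebSocketServer, WebSocket } from "ws";

interface StateUpdate {
  sender: "vr" | "web";          // which client produced the update
  kind: "selection" | "pointer"; // what is being shared
  payload: unknown;              // e.g. selected tweet IDs or a 3D position
}

const wss = new WebSocketServer({ port: 8080 });

wss.on("connection", (socket) => {
  socket.on("message", (data) => {
    const update: StateUpdate = JSON.parse(data.toString());
    // Broadcast to every other connected client so both views stay in sync.
    for (const client of wss.clients) {
      if (client !== socket && client.readyState === WebSocket.OPEN) {
        client.send(JSON.stringify(update));
      }
    }
  });
});
```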