798 research outputs found

    Haptic System for Eyes Free and Hands Free Pedestrian Navigation

    Until now, Augmented Reality has mainly been associated with visual augmentation, often reduced to superimposing a virtual object onto the real world. In this paper we present a vibro-tactile system called HaptiNav, which illustrates the concept of Haptic Augmented Reality. We use haptic feedback to give users information about their direction, enabling them to reach their destination. To do so, we use a turn-by-turn metaphor, which consists of dividing the route into a series of reference points. To assess the performance of the HaptiNav system, we carried out an experimental study comparing it to both the Google Maps Audio and Pocket Navigator systems. The results show no significant difference between HaptiNav and Google Maps Audio in terms of performance, physical load, and time. However, statistical analysis of mental load, frustration, and effort highlights the advantages of HaptiNav over the two other systems. In light of these results, we present possible improvements to HaptiNav and describe its second prototype at the end of the paper.
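    The abstract does not give HaptiNav's implementation details; as a minimal sketch, a turn-by-turn scheme over reference points might compute the bearing to the next waypoint and map the user's heading error to a coarse tactile cue (all function names and the tolerance value here are illustrative assumptions, not the authors' design):

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

def turn_cue(user_heading_deg, target_bearing_deg, tolerance_deg=15.0):
    """Map the signed heading error to one of three vibro-tactile cues."""
    # Normalise the error into (-180, 180] so left/right turns are symmetric.
    error = (target_bearing_deg - user_heading_deg + 180.0) % 360.0 - 180.0
    if abs(error) <= tolerance_deg:
        return "on-course"   # e.g. a single short confirmation pulse
    return "turn-right" if error > 0 else "turn-left"
```

    Walking the route then reduces to advancing to the next reference point whenever the user comes within some radius of the current one, and re-emitting the cue as the heading error changes.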

    Expressive haptics for enhanced usability of mobile interfaces in situations of impairments

    Designing for situational awareness could lead to better solutions for disabled people; likewise, exploring the needs of disabled people could lead to innovations that address situational impairments. This in turn can create non-stigmatising assistive technology for disabled people from which eventually everyone could benefit. In this paper, we investigate the potential for advanced haptics to complement the graphical user interface of mobile devices, thereby enhancing the user experience of all people in some situations (e.g. sunlight interfering with interaction) and of visually impaired people. We explore technical solutions to this problem space and justify our focus on the creation of kinaesthetic force feedback. We propose initial design concepts and studies, with a view to co-creating delightful and expressive haptic interactions with potential users, motivated by scenarios of situational and permanent impairments. Presented at the CHI'19 Workshop: Addressing the Challenges of Situationally-Induced Impairments and Disabilities in Mobile Interaction, 2019 (arXiv:1904.05382).

    Testing Two Tools for Multimodal Navigation

    The latest smartphones with GPS, electronic compasses, directional audio, touch screens, and so forth, hold a potential for location-based services that are easier to use and that let users focus on their activities and the environment around them. Rather than interpreting maps, users can search for information by pointing in a direction, and database queries can be created from GPS location and compass data. Users can also get guidance to locations through point and sweep gestures, spatial sound, and simple graphics. This paper describes two studies testing two applications with multimodal user interfaces for navigation and information retrieval. The applications allow users to search for information and get navigation support using combinations of point and sweep gestures, non-speech audio, graphics, and text. Tests show that users appreciated both applications for their ease of use and for allowing them to interact directly with the surrounding environment.
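    As a rough illustration of how a point-and-query interface might combine GPS location with compass heading (this is not the authors' implementation; it uses an equirectangular approximation that is only reasonable for nearby points, and the cone half-angle is an assumed parameter):

```python
import math

def within_pointing_cone(user_lat, user_lon, heading_deg,
                         poi_lat, poi_lon, half_angle_deg=20.0):
    """Return True if the point of interest lies inside the cone the
    user is pointing along with the device's compass heading."""
    # Local flat-earth offsets from the user to the POI (radians, scaled).
    dx = math.radians(poi_lon - user_lon) * math.cos(math.radians(user_lat))
    dy = math.radians(poi_lat - user_lat)
    bearing = (math.degrees(math.atan2(dx, dy)) + 360.0) % 360.0
    # Smallest angular difference between bearing and heading.
    diff = abs((bearing - heading_deg + 180.0) % 360.0 - 180.0)
    return diff <= half_angle_deg
```

    A directional database query is then just a filter of candidate POIs through this predicate, optionally ordered by distance.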

    Eyes-Off Physically Grounded Mobile Interaction

    This thesis explores the possibilities, challenges and future scope for eyes-off, physically grounded mobile interaction. We argue that for interactions with digital content in physical spaces, our focus should not be constantly and solely on the device we are using, but fused with an experience of the places themselves, and the people who inhabit them. Through the design, development and evaluation of a series of novel prototypes we show the benefits of a more eyes-off mobile interaction style. Consequently, we are able to outline several important design recommendations for future devices in this area. The four key contributing chapters of this thesis each investigate separate elements within this design space. We begin by evaluating the need for screen-primary feedback during content discovery, showing how a more exploratory experience can be supported via a less-visual interaction style. We then demonstrate how tactile feedback can improve the experience and the accuracy of the approach. In our novel tactile hierarchy design we add a further layer of haptic interaction, and show how people can be supported in finding and filtering content types, eyes-off. We then turn to explore interactions that shape the ways people interact with a physical space. Our novel group and solo navigation prototypes use haptic feedback for a new approach to pedestrian navigation. We demonstrate how variations in this feedback can support exploration, giving users autonomy in their navigation behaviour, but with an underlying reassurance that they will reach the goal. Our final contributing chapter turns to consider how these advanced interactions might be provided for people who do not have the expensive mobile devices that are usually required. We extend an existing telephone-based information service to support remote back-of-device inputs on low-end mobiles. We conclude by establishing the current boundaries of these techniques, and suggesting where their usage could lead in the future.

    Integrating Haptic Feedback into Mobile Location Based Services

    Haptics is a feedback technology that takes advantage of the human sense of touch by applying forces, vibrations, and/or motions to a haptic-enabled device such as a mobile phone. Historically, human-computer interaction has been visual: text and images on the screen. Haptic feedback can be an important additional method, especially in Mobile Location Based Services such as knowledge discovery, pedestrian navigation and notification systems. A knowledge discovery system called the Haptic GeoWand is a low-interaction system that allows users to query geo-tagged data around them by using a point-and-scan technique with their mobile device. Haptic Pedestrian is a navigation system for walkers. Four prototypes have been developed, classified according to the user's guidance requirements, the user type (based on spatial skills), and overall system complexity. Haptic Transit is a notification system that provides spatial information to the users of public transport. In all these systems, haptic feedback is used to convey information about location, orientation, density and distance, using the vibration alarm with varying frequencies and patterns to help users understand the physical environment. Trials elicited positive responses from the users, who see benefit in being provided with a "heads up" approach to mobile navigation. Results from a memory recall test show that the users of haptic feedback for navigation had better memory recall of the region traversed than the users of landmark images. Haptics integrated into a multi-modal navigation system provides more usable, less distracting and more effective interaction than conventional systems. Enhancements to the current work could include integration of contextual information, detailed large-scale user trials and the exploration of using haptics within confined indoor spaces.
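    The abstract describes conveying distance through vibration frequency and pattern; one simple way to realise such a mapping is a stepped pulse/pause schedule where pulses come faster as the target gets closer (the thresholds and durations below are illustrative assumptions, not parameters reported for these systems):

```python
def vibration_pattern(distance_m):
    """Map distance to a (pulse_ms, pause_ms) vibration pattern.

    Shorter pauses mean a faster pulse train, signalling proximity
    without requiring the user to look at the screen.
    """
    if distance_m < 10:
        return (100, 100)    # rapid buzzing right at the target
    if distance_m < 50:
        return (100, 400)
    if distance_m < 200:
        return (100, 900)
    return (100, 1900)       # slow "heartbeat" when far away
```

    The same idea extends to orientation or density by varying pulse count or amplitude instead of the pause length.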


    Quick-Glance and In-Depth exploration of a tabletop map for visually impaired people

    Interactive tactile maps provide visually impaired people with accessible geographic information. However, when these maps are presented on large tabletops, tactile exploration without sight is long and tedious due to the size of the surface. In this paper we present a novel approach to speed up the process of exploring tabletop maps in the absence of vision. Our approach mimics the visual processing of a map and consists of two steps. First, the Quick-Glance step allows users to create a global mental representation of the map using mid-air gestures. Second, the In-Depth step allows users to reach Points of Interest with appropriate hand guidance on the map. We present the design and development of a prototype combining a smartwatch and a tactile surface for Quick-Glance and In-Depth interactive exploration of a map.

    An Evaluation of Radar Metaphors for Providing Directional Stimuli Using Non-Verbal Sound

    We compared four audio-based radar metaphors for providing directional stimuli to users of AR headsets. The metaphors are clock face, compass, white noise, and scale. Each metaphor, or method, signals the movement of a virtual arm in a radar sweep. In a user study, statistically significant differences were observed for accuracy and response time. Beat-based methods (clock face, compass) elicited responses biased to the left of the stimulus location, and non-beat-based methods (white noise, scale) produced responses biased to the right of the stimulus location. The beat methods were more accurate than the non-beat methods; however, the non-beat methods elicited quicker responses. We also discuss how response accuracy varies along the radar sweep between methods. These observations contribute design insights for non-verbal, non-visual directional prompting.
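    A radar-sweep metaphor encodes direction in *when* a cue sounds within the sweep; a minimal sketch of that underlying mapping, assuming a fixed sweep period and a sweep that starts at 12 o'clock (the 4-second period and function names are assumptions, not parameters reported in the study):

```python
def stimulus_time(direction_deg, sweep_period_s=4.0):
    """Time within the sweep at which the cue should sound so that it
    marks the given direction (0 deg = 12 o'clock, clockwise)."""
    return (direction_deg % 360.0) / 360.0 * sweep_period_s

def decoded_direction(t_s, sweep_period_s=4.0):
    """Inverse mapping the listener performs: elapsed sweep time
    back to a direction in degrees."""
    return (t_s % sweep_period_s) / sweep_period_s * 360.0
```

    The four metaphors in the study differ in how the sweep's progress is made audible (ticks, compass beats, noise, or a musical scale), not in this basic time-to-angle encoding.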