    Using wrist vibrations to guide hand movement and whole body navigation

    In the absence of vision, mobility and orientation are challenging. Audio and tactile feedback can be used to guide visually impaired people. In this paper, we present two complementary studies on the use of vibrational cues for hand guidance during the exploration of itineraries on a map, and for whole-body guidance in a virtual environment. Concretely, we designed wearable Arduino bracelets integrating a vibratory motor that produces multiple patterns of pulses. In the first study, this bracelet was used to guide the hand along unknown routes on an interactive tactile map. A Wizard-of-Oz study with six blindfolded participants showed that tactons (vibrational patterns) may be more efficient than audio cues for indicating directions. In the second study, the bracelet was used by blindfolded participants to navigate in a virtual environment. The results show that vibrational cues can significantly decrease travel distance. In sum, these preliminary but complementary studies suggest the potential of vibrational feedback in assistive technology for the mobility and orientation of blind people.
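
    To make the tacton idea concrete, here is a minimal sketch (not the authors' code; the pulse patterns, names, and driver interface are invented for illustration) of how direction cues could be encoded as named sequences of vibrate/pause pulses and played through a vibration motor:

```python
# Illustrative sketch: tactons as named sequences of (vibrate_ms, pause_ms)
# pulses, played through caller-supplied motor callbacks. All pattern values
# are assumptions, not taken from the paper.
import time

TACTONS = {
    "left":    [(100, 100)] * 2,          # two short pulses
    "right":   [(400, 100)],              # one long pulse
    "forward": [(100, 100), (400, 100)],  # short then long
}

def play_tacton(name, motor_on, motor_off):
    """Drive the vibration motor through the supplied on/off callbacks."""
    for vibrate_ms, pause_ms in TACTONS[name]:
        motor_on()
        time.sleep(vibrate_ms / 1000)
        motor_off()
        time.sleep(pause_ms / 1000)

# Console stand-in for the bracelet's motor, so the sketch runs anywhere:
play_tacton("left", lambda: print("bzz"), lambda: None)
```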

    Haptic System for Eyes Free and Hands Free Pedestrian Navigation

    Until now, Augmented Reality has mainly been associated with visual augmentation, often reduced to superimposing a virtual object onto the real world. In this paper we present a vibrotactile system called HaptiNav, which illustrates the concept of Haptic Augmented Reality. We use haptic feedback to send users information about their direction, enabling them to reach their destination. To do so, we use a turn-by-turn metaphor that divides the route into a series of reference points. To assess the performance of HaptiNav, we carried out an experimental study comparing it to the Google Maps Audio and Pocket Navigator systems. The results show no significant difference between HaptiNav and Google Maps Audio in terms of performance, physical load and time. However, statistical analysis of mental load, frustration and effort highlights the advantages of HaptiNav over the two other systems. In light of these results, we present possible improvements for HaptiNav and describe its second prototype at the end of the paper.
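
    As one reading of the turn-by-turn metaphor, the sketch below (an assumption-laden reconstruction, not HaptiNav's actual code) computes the bearing from the user to the next reference point and maps the heading error to a left/right/ahead cue; the tolerance value is invented:

```python
# Turn-by-turn sketch: bearing to the next reference point, then a cue
# chosen from the angle between the user's heading and that bearing.
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360

def direction_cue(heading_deg, target_bearing_deg, tolerance=20):
    """Map the heading error to a left/right/ahead vibration cue."""
    error = (target_bearing_deg - heading_deg + 180) % 360 - 180
    if abs(error) <= tolerance:
        return "ahead"
    return "right" if error > 0 else "left"

# User faces due north (0 degrees); the next reference point lies to the east:
print(direction_cue(0, bearing_deg(48.8566, 2.3522, 48.8566, 2.3622)))  # right
```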

    A language of tactile motion instructions


    Assisting Navigation and Object Selection with Vibrotactile Cues

    Our lives have been drastically altered by information technology in recent decades, leading to evolutionary mismatches between human traits and the modern environment. One particular mismatch occurs when visually demanding information technology overloads the perceptual, cognitive or motor capabilities of the human nervous system. This information overload could be partly alleviated by complementing visual interaction with haptics. The primary aim of this thesis was to investigate how to assist movement control with vibrotactile cues. Vibrotactile cues refer to technology-mediated vibrotactile signals that notify users of perceptual events, prompt users to make decisions, and give users feedback on their actions. To explore vibrotactile cues, we carried out five experiments in two contexts of movement control: navigation and object selection. The goal was to find ways to reduce information load in these tasks, thus helping users to accomplish them more effectively. We employed measurements such as reaction times, error rates, and task completion times, and used subjective rating scales, short interviews, and free-form participant comments to assess the vibrotactile-assisted interactive systems. The findings of this thesis can be summarized as follows. First, if the context of movement control allows the use of both feedback and feedforward cues, feedback cues are a reasonable first option. Second, when using vibrotactile feedforward cues, low-level abstractions and support from other modalities keep the information load as low as possible. Third, the temple area is a feasible actuation location for vibrotactile cues in movement control, including navigation cues and object selection cues with head turns; however, the usability of the area depends on contextual factors such as spatial congruency, the actuation device, and the pace of the interaction task.
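
    The feedback/feedforward distinction can be made concrete with a toy sketch; the function names and cue shapes below are hypothetical, chosen only to contrast a cue that proposes an action in advance with one that confirms its outcome:

```python
# Hypothetical sketch: a feedforward cue fires before the user commits to an
# action; a feedback cue fires after it, confirming the outcome. Both are
# reduced to console output here.
def feedforward_cue(upcoming_target):
    # Fired while the user is still moving: "a selectable item is ahead".
    print(f"short pre-vibration: {upcoming_target} is in reach")

def feedback_cue(success):
    # Fired after the selection attempt: confirm or signal an error.
    print("long confirmation buzz" if success else "double error buzz")

feedforward_cue("menu item 3")
feedback_cue(success=True)
```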

    Inclusive Landmark based Pedestrian Wayfinding via Multi-modal Directions

    Navigational skills are fundamental to travelling from place to place, personal independence and community integration [2]. Current research in pedestrian wayfinding suggests that people vary significantly in their choice of navigation modalities [6, 7, 25]. In addition, pedestrians with learning disabilities find it difficult to recall routes travelled daily and to stay oriented en route to unknown locations. This paper proposes a wayfinding interface with two components: 1) temporary poly-coated cardboard signage imprinted with a specific destination, the minutes by foot, a directional arrow and a QR code; 2) an online interactive website providing additional contextualized navigation instructions for pedestrians through various modalities. The University of Toronto Scarborough campus (UTSC) is used as the physical environment to implement and test the proposed wayfinding interface. The QR code tags link the cardboard signage to the online interface and stream route instructions as panoramic video, photographs, an aerial map, audio or text. The goal of the proposed wayfinding system is to help UTSC pedestrians, especially those with learning disabilities, orient themselves and navigate to their destination through multimodal, landmark-based, turn-by-turn directions.
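
    A hypothetical sketch of the linking mechanism described above: each sign's QR code encodes a URL carrying a sign identifier, which the website resolves to route instructions in the requested modality. The URL scheme, identifiers and data are invented, not taken from the UTSC deployment:

```python
# Invented example data and URL scheme, for illustration only.
BASE_URL = "https://example.org/wayfind"   # placeholder, not the real service

ROUTES = {
    "sign-042": {
        "destination": "Library",
        "minutes_by_foot": 5,
        "modalities": {
            "text":  "Turn left at the fountain, then walk 200 m north.",
            "audio": "library_route.mp3",
            "map":   "library_aerial.png",
        },
    },
}

def qr_payload(sign_id):
    """URL embedded in the sign's QR code."""
    return f"{BASE_URL}?sign={sign_id}"

def route_instructions(sign_id, modality="text"):
    """What the online interface would return for a scanned sign."""
    sign = ROUTES[sign_id]
    return sign["destination"], sign["minutes_by_foot"], sign["modalities"][modality]

print(qr_payload("sign-042"))
print(route_instructions("sign-042", "text"))
```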

    A survey on hardware and software solutions for multimodal wearable assistive devices targeting the visually impaired

    The market penetration of user-centric assistive devices has rapidly increased in the past decades. Growth in computational power, accessibility, and cognitive device capabilities has been accompanied by significant reductions in weight, size, and price, as a result of which mobile and wearable equipment is becoming part of our everyday life. In this context, a key focus has been rehabilitation engineering and the development of assistive technologies for people with various disabilities, including hearing loss and visual impairments. Applications range from simple health monitoring, such as sport activity trackers, through medical applications including sensory (e.g. hearing) aids and real-time monitoring of vital functions, to task-oriented tools such as navigational devices for the blind. This paper provides an overview of recent trends in software- and hardware-based signal processing relevant to the development of wearable assistive solutions.

    Eyes-Off Physically Grounded Mobile Interaction

    This thesis explores the possibilities, challenges and future scope for eyes-off, physically grounded mobile interaction. We argue that for interactions with digital content in physical spaces, our focus should not be constantly and solely on the device we are using, but fused with an experience of the places themselves, and the people who inhabit them. Through the design, development and evaluation of a series of novel prototypes we show the benefits of a more eyes-off mobile interaction style. Consequently, we are able to outline several important design recommendations for future devices in this area. The four key contributing chapters of this thesis each investigate separate elements within this design space. We begin by evaluating the need for screen-primary feedback during content discovery, showing how a more exploratory experience can be supported via a less-visual interaction style. We then demonstrate how tactile feedback can improve the experience and the accuracy of the approach. In our novel tactile hierarchy design we add a further layer of haptic interaction, and show how people can be supported in finding and filtering content types, eyes-off. We then turn to explore interactions that shape the ways people interact with a physical space. Our novel group and solo navigation prototypes use haptic feedback for a new approach to pedestrian navigation. We demonstrate how variations in this feedback can support exploration, giving users autonomy in their navigation behaviour, but with an underlying reassurance that they will reach the goal. Our final contributing chapter turns to consider how these advanced interactions might be provided for people who do not have the expensive mobile devices that are usually required. We extend an existing telephone-based information service to support remote back-of-device inputs on low-end mobiles. We conclude by establishing the current boundaries of these techniques, and suggesting where their usage could lead in the future.

    Supporting Eyes-Free Human–Computer Interaction with Vibrotactile Haptification

    The sense of touch is a crucial sense when using our hands in complex tasks. Some tasks we learn to do even without sight, using just the sense of touch in our fingers and hands. Modern touchscreen devices, however, have lost some of that tactile feeling by removing physical controls from the interaction. Touch is also a sense that is underutilized in interactions with technology and could provide new ways to support users. While using information technology in certain situations, users cannot focus on the interaction completely, either visually or mentally. Humans can utilize their sense of touch more comprehensively in interactions and learn to understand tactile information while interacting with information technology. This thesis introduces a set of experiments that evaluate human capabilities to notice and understand tactile information provided by current actuator technology, and presents examples of haptic user interfaces (HUIs) for eyes-free use scenarios. The experiments evaluate the benefits of such interfaces for users, and the thesis concludes with guidelines and methods for creating this kind of user interface. The experiments can be divided into three groups. In the first group, the first two experiments evaluated the detection of vibrotactile stimuli and the interpretation of the abstract meaning of vibrotactile feedback. The experiments in the second group evaluated how to design rhythmic vibrotactile tactons to serve as basic vibrotactile primitives for HUIs. The last group of two experiments evaluated how these HUIs benefit users in distracted, eyes-free interaction scenarios. The primary aim of this series of experiments was to evaluate whether current actuation technology could be used more comprehensively than in present-day solutions with simple haptic alerts and notifications, and thus whether the comprehensive use of vibrotactile feedback would provide additional benefits for users compared with current haptic and non-haptic interaction methods. The main finding of this research is that with more comprehensive HUIs in eyes-free, distracted-use scenarios, such as while driving a car, the user's main task, driving, is performed better. Furthermore, users liked the comprehensively haptified user interfaces.

    Integrating Haptic Feedback into Mobile Location Based Services

    Haptics is a feedback technology that takes advantage of the human sense of touch by applying forces, vibrations, and/or motions to a haptic-enabled device such as a mobile phone. Historically, human-computer interaction has been visual: text and images on the screen. Haptic feedback can be an important additional method, especially in Mobile Location Based Services such as knowledge discovery, pedestrian navigation and notification systems. A knowledge discovery system called Haptic GeoWand is a low-interaction system that allows users to query geo-tagged data around them using a point-and-scan technique with their mobile device. Haptic Pedestrian is a navigation system for walkers; four prototypes have been developed, classified according to the user’s guidance requirements, the user type (based on spatial skills), and overall system complexity. Haptic Transit is a notification system that provides spatial information to users of public transport. In all these systems, haptic feedback conveys information about location, orientation, density and distance through the vibration alarm, varying its frequency and pattern to help users understand the physical environment. Trials elicited positive responses from users, who see benefit in a “heads up” approach to mobile navigation. Results from a memory recall test show that users of haptic feedback for navigation had better recall of the region traversed than users of landmark images. Haptics integrated into a multi-modal navigation system provides more usable, less distracting and more effective interaction than conventional systems. Enhancements to the current work could include the integration of contextual information, detailed large-scale user trials, and the exploration of haptics within confined indoor spaces.
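
    As one concrete reading of "varying frequencies and patterns", the sketch below maps distance to the interval between vibration pulses, so pulses arrive faster as the user approaches the target; all constants are assumptions, not values from this work:

```python
# Distance-to-pulse-rate sketch: a linear map from remaining distance to the
# gap between vibration pulses. All thresholds are invented for illustration.
def pulse_interval_ms(distance_m, near_m=5, far_m=200,
                      fastest_ms=200, slowest_ms=2000):
    """Gap between pulses: short when close to the target, long when far."""
    if distance_m <= near_m:
        return fastest_ms
    if distance_m >= far_m:
        return slowest_ms
    frac = (distance_m - near_m) / (far_m - near_m)
    return int(fastest_ms + frac * (slowest_ms - fastest_ms))

for d in (3, 50, 150, 300):
    print(f"{d:>3} m -> pulse every {pulse_interval_ms(d)} ms")
```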