11 research outputs found

    Sound for enhanced experiences in mobile applications

    When visiting new places you want information about restaurants, shopping, places of historic interest etc. Smartphones are perfect tools for delivering such location-based information, but the risk is that users get absorbed by texts, maps, videos etc. on the device screen and get a second-hand experience of the environment they are visiting rather than the sought-after first-hand experience. One problem is that the users’ eyes are often directed to the device screen rather than to the surrounding environment. Another problem is that interpreting more or less abstract information on maps, texts, images etc. may take up significant shares of the users’ overall cognitive resources. The work presented here tried to overcome these two problems by studying design for human-computer interaction based on the users’ everyday abilities, such as directional hearing and point and sweep gestures. Today’s smartphones know where you are and in what direction you are pointing the device, and they have systems for rendering spatial audio. These readily available technologies hold the potential to make information easier to interpret and use, demand less cognitive resources and free the users from having to look more or less constantly at a device screen.
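    The spatial-audio idea described above can be sketched in a few lines: given the bearing to a point of interest relative to the direction the device is pointing, derive stereo gains so the sound appears to come from that side. This is a crude constant-power panning approximation under assumed conventions (0° = straight ahead, +90° = full right), not the rendering the study's prototypes actually used.

    ```python
    import math

    def pan_gains(relative_bearing_deg):
        """
        Constant-power stereo gains for a sound that should appear to come
        from `relative_bearing_deg` (0 = straight ahead, +90 = full right,
        -90 = full left). Front-only approximation for illustration.
        """
        # Clamp to the frontal arc and map -90..+90 degrees onto 0..pi/2.
        b = max(-90.0, min(90.0, relative_bearing_deg))
        theta = (b + 90.0) / 180.0 * (math.pi / 2)
        # Constant-power law: left^2 + right^2 == 1 for every bearing.
        return math.cos(theta), math.sin(theta)  # (left_gain, right_gain)
    ```

    A sound straight ahead gets equal gains in both channels; as the bearing swings right, energy shifts smoothly to the right channel while total power stays constant.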

    Using Sound to Enhance Users’ Experiences of Mobile Applications

    The latest smartphones with GPS, electronic compass, directional audio, touch screens etc. hold potential for location-based services that are easier to use compared to traditional tools. Rather than interpreting maps, users may focus on their activities and the environment around them. Interfaces may be designed that let users search for information by simply pointing in a direction. Database queries can be created from GPS location and compass direction data. Users can get guidance to locations through pointing gestures, spatial sound and simple graphics. This article describes two studies testing prototypic applications with multimodal user interfaces built on spatial audio, graphics and text. Tests show that users appreciated the applications for their ease of use, for being fun and effective to use and for allowing users to interact directly with the environment rather than with abstractions of the same. The multimodal user interfaces contributed significantly to the overall user experience.
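    The point-to-search interaction described above can be illustrated with a minimal sketch: the device's GPS fix and compass heading define a wedge-shaped query region, and candidate points of interest are filtered by bearing and distance. The beam width, range, and POI list here are illustrative assumptions, not parameters from the studies.

    ```python
    import math

    def bearing_deg(lat1, lon1, lat2, lon2):
        """Initial great-circle bearing from point 1 to point 2, in degrees 0..360."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dlon = math.radians(lon2 - lon1)
        y = math.sin(dlon) * math.cos(phi2)
        x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
        return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

    def distance_m(lat1, lon1, lat2, lon2):
        """Haversine distance in metres on a spherical Earth."""
        r = 6371000.0
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlon = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    def point_query(user_lat, user_lon, heading_deg, pois, beam_deg=30.0, max_m=500.0):
        """Return names of POIs inside a wedge centred on the compass heading."""
        hits = []
        for name, lat, lon in pois:
            b = bearing_deg(user_lat, user_lon, lat, lon)
            off = (b - heading_deg + 180.0) % 360.0 - 180.0  # signed angular offset
            if abs(off) <= beam_deg / 2 and distance_m(user_lat, user_lon, lat, lon) <= max_m:
                hits.append(name)
        return hits
    ```

    Pointing the device east (heading 90°) from the origin would, for example, select a POI roughly 111 m due east while excluding one due north, since only the former falls inside the 30° wedge.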

    Testing Two Tools for Multimodal Navigation

    The latest smartphones with GPS, electronic compasses, directional audio, touch screens, and so forth, hold a potential for location-based services that are easier to use and that let users focus on their activities and the environment around them. Rather than interpreting maps, users can search for information by pointing in a direction and database queries can be created from GPS location and compass data. Users can also get guidance to locations through point and sweep gestures, spatial sound, and simple graphics. This paper describes two studies testing two applications with multimodal user interfaces for navigation and information retrieval. The applications allow users to search for information and get navigation support using combinations of point and sweep gestures, nonspeech audio, graphics, and text. Tests show that users appreciated both applications for their ease of use and for allowing users to interact directly with the surrounding environment.

    Accessibility of mobile applications for the visually impaired: problems and solution strategies

    The popularity of mobile technologies continues to grow as their fidelity improves, their processing power increases, and their costs fall. However, the growing trend of replacing physical keyboards with touch screens has not only changed the modalities of interaction but has also, for the visually impaired, increased the complexity and difficulty of using mobile applications. Since software developers are partly responsible for ensuring the accessibility of mobile applications, this paper describes the accessibility barriers that visually impaired people typically face when using software applications on mobile devices. For each accessibility problem, possible solution strategies are indicated, and mobile applications implemented with libre software are described for those problems whose solution goes beyond the application of good design practices. Sociedad Argentina de Informática e Investigación Operativa (SADIO)

    Tools in and out of sight : an analysis informed by Cultural-Historical Activity Theory of audio-haptic activities involving people with visual impairments supported by technology

    The main purpose of this thesis is to present a Cultural-Historical Activity Theory (CHAT) based analysis of the activities conducted by and with visually impaired users supported by audio-haptic technology. This thesis covers several studies conducted in two projects. The studies evaluate the use of audio-haptic technologies to support and/or mediate the activities of people with visual impairment. The focus is on activities involving access to two-dimensional information, such as pictures or maps. People with visual impairments can use commercially available solutions to explore static information (raised-line maps and pictures, for example). Solutions for dynamic access, such as drawing a picture or using a map while moving around, are scarcer. Two distinct projects were initiated to remedy the scarcity of dynamic access solutions, each focusing on a separate activity. The first project, HaptiMap, focused on pedestrian outdoor navigation through audio feedback and gestures mediated by a GPS-equipped mobile phone. The second project, HIPP, focused on drawing and learning about 2D representations in a school setting with the help of haptic and audio feedback. In both cases, visual feedback was also present in the technology, enabling sighted people to take advantage of that modality too. The research questions addressed are: How can audio and haptic interaction mediate activities for people with visual impairment? Are there features of the programming that help or hinder this mediation? How can CHAT, and specifically the Activity Checklist, be used to shape the design process when designing audio-haptic technology together with persons with visual impairments? Results show the usefulness of the Activity Checklist as a tool in the design process and provide practical application examples. A general conclusion emphasises the importance of modularity, standards, and libre software in rehabilitation technology, to support the development of the activities over time and to let the code evolve with them as a lifelong iterative development process. The research also provides specific design recommendations for the design of the type of audio-haptic systems involved.

    Augmenting the Spatial Perception Capabilities of Users Who Are Blind

    People who are blind face a series of challenges and limitations resulting from their lack of being able to see, forcing them to either seek the assistance of a sighted individual or work around the challenge by way of an inefficient adaptation (e.g. following the walls in a room in order to reach a door rather than walking in a straight line to the door). These challenges are directly related to blind users' lack of the spatial perception capabilities normally provided by the human vision system. In order to overcome these spatial perception related challenges, modern technologies can be used to convey spatial perception data through sensory substitution interfaces. This work is the culmination of several projects which address varying spatial perception problems for blind users. First we consider the development of non-visual natural user interfaces for interacting with large displays. This work explores the haptic interaction space in order to find useful and efficient haptic encodings for the spatial layout of items on large displays. Multiple interaction techniques are presented which build on prior research (Folmer et al. 2012), and the efficiency and usability of the most efficient of these encodings is evaluated with blind children. Next we evaluate the use of wearable technology in aiding navigation of blind individuals through large open spaces lacking the tactile landmarks used during traditional white cane navigation. We explore the design of a computer vision application with an unobtrusive aural interface to minimize veering of the user while crossing a large open space. Together, these projects represent an exploration into the use of modern technology in augmenting the spatial perception capabilities of blind users.

    Designing Tactile Interfaces for Abstract Interpersonal Communication, Pedestrian Navigation and Motorcyclists Navigation

    The tactile medium of communication with users is appropriate for displaying information in situations where the auditory and visual mediums are saturated. There are situations where a subject's ability to receive information through either of these channels is severely restricted by the environment they are in or by any physical impairments that the subject may have. In this project, we have focused on two groups of users who need sustained visual and auditory focus in their task: Soldiers on the battlefield and motorcyclists. Soldiers on the battlefield use their visual and auditory capabilities to maintain awareness of their environment to guard themselves from enemy assault. One of the major challenges to coordination in a hazardous environment is maintaining communication between team members while mitigating cognitive load. Compromise in communication between team members may result in mistakes that can adversely affect the outcome of a mission. We have built two vibrotactile displays, Tactor I and Tactor II, each with nine actuators arranged in a three-by-three matrix with differing contact areas that can represent a total of 511 shapes. We used two dimensions of the tactile medium, shapes and waveforms, to represent verb phrases and evaluated the ability of users to perceive verb phrases from the tactile code. We evaluated the effectiveness of communicating verb phrases while the users were performing two tasks simultaneously. The results showed that performing an additional visual task did not affect the accuracy or the time taken to perceive tactile codes. Another challenge in coordinating Soldiers on a battlefield is navigating them to their respective assembly areas. We have developed HaptiGo, a lightweight haptic vest that provides pedestrians both navigational intelligence and obstacle detection capabilities.
    HaptiGo consists of optimally-placed vibrotactile sensors that utilize natural and small form factor interaction cues, thus emulating the sensation of being passively guided towards the intended direction. We evaluated HaptiGo and found that it was able to successfully navigate users with timely alerts of incoming obstacles without increasing cognitive load, thereby increasing their environmental awareness. Additionally, we show that users are able to respond to directional information without training. The needs of motorcyclists are different from those of Soldiers. Motorcyclists' need to maintain visual and auditory situational awareness at all times is crucial since they are highly exposed on the road. Route guidance systems, such as the Garmin, have been well tested on automobilists, but remain much less safe for use by motorcyclists. Audio/visual routing systems decrease motorcyclists' situational awareness and vehicle control, and thus increase the chances of an accident. To enable motorcyclists to take advantage of route guidance while maintaining situational awareness, we created HaptiMoto, a wearable haptic route guidance system. HaptiMoto uses tactile signals to encode the distance and direction of approaching turns, thus avoiding interference with audio/visual awareness. Evaluations show that HaptiMoto is intuitive for motorcyclists, and a safer alternative to existing solutions.
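    The distance-and-direction encoding idea can be sketched as a simple mapping from an approaching turn to a tactile cue: vibrate on the side of the turn, pulsing faster as the turn gets closer. The specific range and interval scale below are hypothetical illustrations, not HaptiMoto's actual encoding scheme.

    ```python
    def turn_cue(direction, distance_m):
        """
        Map an approaching turn to a tactile cue: which side of the body to
        vibrate and how long to wait between pulses (closer turn -> faster
        pulses). Range and timing constants are illustrative assumptions.
        """
        if direction not in ("left", "right"):
            raise ValueError("direction must be 'left' or 'right'")
        # Clamp distance into the range the cue covers, then scale the
        # inter-pulse interval linearly: 2.0 s at 500 m, 0.2 s at 0 m.
        d = max(0.0, min(distance_m, 500.0))
        interval_s = 0.2 + 1.8 * (d / 500.0)
        return {"side": direction, "interval_s": round(interval_s, 2)}
    ```

    A rider approaching a left turn 500 m away would feel slow pulses on the left that accelerate continuously until the turn, conveying both direction and urgency without any visual or auditory channel.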

    Overviews and their effect on interaction in the auditory interface.

    Auditory overviews have the potential to improve the quality of auditory interfaces. However, in order to apply overviews well, we must understand them. Specifically, what are they and what is their impact? This thesis presents six characteristics that overviews should have. They should be a structured representation of the detailed information, define the scope of the material, guide the user, show context and patterns in the data, encourage exploration of the detail and represent the current state of the data. These characteristics are guided by a systematic review of visual overview research, analysis of established visual overviews and evaluation of how these characteristics fit current auditory overviews. The second half of the thesis evaluates how the addition of an overview impacts user interaction. While the overviews do not improve performance, they do change the navigation patterns from one of data exploration and discovery to guided and directed information seeking. With these two contributions, we gain a better understanding of how overviews work in an auditory interface and how they might be exploited more effectively.