622 research outputs found

    Constructing sonified haptic line graphs for the blind student: first steps

    Line graphs stand as an established information visualisation and analysis technique taught at various levels of difficulty according to standard Mathematics curricula. It has been argued that blind individuals cannot use line graphs as a visualisation and analytic tool because they currently exist primarily in the visual medium. The research described in this paper aims to make line graphs accessible to blind students through auditory and haptic media. We describe (1) our design space for representing line graphs, (2) the technology we use to develop our prototypes, and (3) the insights from our preliminary work.
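    The paper itself does not publish code, but a minimal sketch may help illustrate the kind of audio mapping such a design space includes. The Python snippet below is our own assumption, not the authors' prototype: it maps the y-values of a line graph onto pitches and writes the result as a WAV file, so an upward trend is heard as rising pitch. The frequency range and tone length are illustrative choices.

```python
# Minimal sonification sketch: map each y-value of a line graph to a pitch
# and concatenate short sine tones, so rising data is heard as rising pitch.
# The mapping range (200-1000 Hz) and tone length are illustrative choices,
# not taken from the paper.
import math
import struct
import wave

SAMPLE_RATE = 44100
TONE_SECONDS = 0.25

def y_to_frequency(y, y_min, y_max, f_min=200.0, f_max=1000.0):
    """Linearly map a data value onto an audible frequency range."""
    if y_max == y_min:
        return (f_min + f_max) / 2
    return f_min + (y - y_min) / (y_max - y_min) * (f_max - f_min)

def sonify(values, path="line_graph.wav"):
    y_min, y_max = min(values), max(values)
    frames = bytearray()
    for y in values:
        freq = y_to_frequency(y, y_min, y_max)
        for n in range(int(SAMPLE_RATE * TONE_SECONDS)):
            sample = 0.5 * math.sin(2 * math.pi * freq * n / SAMPLE_RATE)
            frames += struct.pack("<h", int(sample * 32767))
    with wave.open(path, "wb") as wav:
        wav.setnchannels(1)      # mono
        wav.setsampwidth(2)      # 16-bit samples
        wav.setframerate(SAMPLE_RATE)
        wav.writeframes(bytes(frames))

# Example: an upward trend is heard as a rising sequence of tones.
sonify([1, 2, 4, 8, 16, 32])
```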

    EyeRing: A Finger-Worn Input Device for Seamless Interactions with Our Surroundings

    Finger-worn interfaces remain a largely unexplored space for user interfaces, despite the fact that our fingers and hands are naturally used for referencing and interacting with the environment. In this paper we present design guidelines for and an implementation of a finger-worn I/O device, the EyeRing, which leverages the universal and natural gesture of pointing. We present use cases of EyeRing for both visually impaired and sighted people. We discuss initial reactions from visually impaired users, which suggest that EyeRing may indeed offer a more seamless solution for dealing with their immediate surroundings than the solutions they currently use. We also report on a user study that demonstrates how EyeRing reduces effort and disruption for a sighted user. We conclude that this highly promising form factor offers both audiences enhanced, seamless interaction with information related to objects in the environment.
    Funding: Singapore University of Technology and Design, International Design Center (IDC grants IDG31100104A and IDD41100102A).

    Virtual Reality as Navigation Tool: Creating Interactive Environments For Individuals With Visual Impairments

    Research into the creation of assistive technologies is increasingly incorporating the use of virtual reality experiments. One area of application is as an orientation and mobility assistance tool for people with visual impairments. Some of the challenges are developing useful knowledge of the user’s surroundings and effectively conveying that information to the user. This thesis examines the feasibility of using virtual environments conveyed via auditory feedback as part of an autonomous mobility assistance system. Two separate experiments were conducted to study key aspects of a potential system: navigation assistance and map generation. The results of this research include mesh models that were fitted to the walk pathways of an environment, and collected data that provide insights into the viability of virtual reality-based guidance systems.
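    As a rough illustration of the navigation-assistance idea (not the thesis implementation; the coordinate convention, thresholds, and function names below are our assumptions), one could compute the bearing and distance from the user's pose to the next waypoint in the virtual environment and turn them into a coarse auditory instruction:

```python
# Illustrative sketch: given the user's pose and the next waypoint in a
# virtual environment, compute the relative bearing and distance and turn
# them into a coarse audio instruction.
import math

def navigation_cue(user_xy, heading_deg, waypoint_xy):
    """Return (relative_bearing_deg, distance, spoken_hint)."""
    dx = waypoint_xy[0] - user_xy[0]
    dy = waypoint_xy[1] - user_xy[1]
    distance = math.hypot(dx, dy)
    # Bearing of the waypoint in world coordinates, then relative to heading.
    bearing = math.degrees(math.atan2(dx, dy))            # 0 deg = +y ("north")
    relative = (bearing - heading_deg + 180) % 360 - 180  # wrap to -180..180
    if abs(relative) < 15:
        hint = "straight ahead"
    elif relative > 0:
        hint = f"turn right about {round(abs(relative))} degrees"
    else:
        hint = f"turn left about {round(abs(relative))} degrees"
    return relative, distance, f"{hint}, {distance:.1f} metres"

# A waypoint 3 m to the right and 4 m ahead of a user facing "north":
print(navigation_cue(user_xy=(0.0, 0.0), heading_deg=0.0, waypoint_xy=(3.0, 4.0)))
# -> roughly (36.9, 5.0, 'turn right about 37 degrees, 5.0 metres')
```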

    A Systematic Review of Extended Reality (XR) for Understanding and Augmenting Vision Loss

    Over the past decade, extended reality (XR) has emerged as an assistive technology not only to augment the residual vision of people losing their sight but also to study the rudimentary vision restored to blind people by a visual neuroprosthesis. To make the best use of these emerging technologies, it is valuable and timely to understand the state of this research and identify its shortcomings. Here we present a systematic literature review of 227 publications from 106 different venues assessing the potential of XR technology to further visual accessibility. In contrast to other reviews, we sample studies from multiple scientific disciplines, focus on augmentation of a person's residual vision, and require studies to feature a quantitative evaluation with appropriate end users. We summarize prominent findings from different XR research areas, show how the landscape has changed over the last decade, and identify scientific gaps in the literature. Specifically, we highlight the need for real-world validation, the broadening of end-user participation, and a more nuanced understanding of the suitability and usability of different XR-based accessibility aids. By broadening end-user participation to early stages of the design process and shifting the focus from behavioral performance to qualitative assessments of usability, future research can develop XR technologies that not only allow for studying vision loss but also enable novel visual accessibility aids with the potential to impact the lives of millions of people living with vision loss.

    Current Use and Future Perspectives of Spatial Audio Technologies in Electronic Travel Aids

    Electronic travel aids (ETAs) have been in focus since technology allowed designing relatively small, light, and mobile devices for assisting the visually impaired. Since visually impaired persons rely on spatial audio cues as their primary means of orientation, providing an accurate virtual auditory representation of the environment is essential. This paper gives an overview of the current state of spatial audio technologies that can be incorporated into ETAs, with a focus on user requirements. Most currently available ETAs either fail to address user requirements or underestimate the potential of spatial sound itself, which may explain, among other reasons, why no single ETA has gained widespread acceptance in the blind community. We believe there is ample space for applying the technologies presented in this paper, with the aim of progressively bridging the gap between accessibility and accuracy of spatial audio in ETAs.
    Funding: European Union’s Horizon 2020 Research and Innovation Programme under Grant Agreement no. 643636.
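    To make the notion of a virtual auditory representation concrete, here is a deliberately simplified stand-in of our own (not a technology surveyed in the paper): a constant-power stereo pan plus crude distance attenuation. Real ETAs in this context would rely on full HRTF-based binaural rendering rather than this two-channel approximation.

```python
# Minimal stereo-panning sketch for rendering a sound source at a given
# azimuth. The constant-power pan law and 1/distance attenuation below are
# simplified stand-ins for proper HRTF-based binaural rendering.
import math

def pan_gains(azimuth_deg):
    """Constant-power left/right gains for a source at azimuth
    (-90 = hard left, 0 = front, +90 = hard right)."""
    azimuth = max(-90.0, min(90.0, azimuth_deg))
    angle = (azimuth + 90.0) / 180.0 * (math.pi / 2)   # 0..pi/2
    return math.cos(angle), math.sin(angle)            # (left, right)

def render_obstacle(azimuth_deg, distance_m, samples):
    """Scale a mono cue by 1/distance and spread it across two channels."""
    left_gain, right_gain = pan_gains(azimuth_deg)
    attenuation = 1.0 / max(distance_m, 1.0)           # crude distance cue
    return [(s * left_gain * attenuation, s * right_gain * attenuation)
            for s in samples]

# A cue 45 degrees to the right and 2 m away is louder in the right channel.
print(pan_gains(45.0))                    # approx (0.38, 0.92)
print(render_obstacle(45.0, 2.0, [1.0, 0.5]))
```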

    A survey of assistive technologies and applications for blind users on mobile platforms: a review and foundation for research

    This paper summarizes recent developments in audio and tactile feedback-based assistive technologies targeting the blind community. Current technology allows applications to be efficiently distributed and run on mobile and handheld devices, even in cases where computational requirements are significant. As a result, electronic travel aids, navigational assistance modules, text-to-speech applications, as well as virtual audio displays which combine audio with haptic channels are becoming integrated into standard mobile devices. This trend, combined with the appearance of increasingly user-friendly interfaces and modes of interaction, has opened a variety of new perspectives for the rehabilitation and training of users with visual impairments. The goal of this paper is to provide an overview of these developments based on recent advances in basic research and application development. Using this overview as a foundation, an agenda is outlined for future research in mobile interaction design with respect to users with special needs, as well as, ultimately, in relation to sensor-bridging applications in general.

    HapticHead - Augmenting Reality via Tactile Cues

    Information overload is increasingly becoming a challenge in today's world. Humans have only a limited amount of attention to allocate between sensory channels and tend to miss or misjudge critical sensory information when multiple activities are going on at the same time. For example, people may miss the sound of an approaching car when walking across the street while looking at their smartphones. Some sensory channels may also be impaired due to congenital or acquired conditions. Among sensory channels, touch is often experienced as obtrusive, especially when it occurs unexpectedly. Since tactile actuators can simulate touch, targeted tactile stimuli can provide users of virtual reality and augmented reality environments with important information for navigation, guidance, alerts, and notifications. In this dissertation, a tactile user interface around the head, called HapticHead, is presented to relieve or replace a potentially impaired visual channel. It is a high-resolution, omnidirectional, vibrotactile display that presents general, 3D directional, and distance information through dynamic tactile patterns. The head is well suited for tactile feedback because it is sensitive to mechanical stimuli and provides a large spherical surface area that enables the display of precise 3D information and allows the user to intuitively rotate the head in the direction of a stimulus based on natural mapping. Basic research on tactile perception on the head and studies on various use cases of head-based tactile feedback are presented in this thesis. Several investigations and user studies have been conducted on (a) the funneling illusion and localization accuracy of tactile stimuli around the head, (b) the ability of people to discriminate between different tactile patterns on the head, (c) approaches to designing tactile patterns for complex arrays of actuators, (d) increasing the immersion and presence level of virtual reality applications, and (e) assisting people with visual impairments in guidance and micro-navigation. In summary, tactile feedback around the head was found to be highly valuable as an additional information channel in various application scenarios. Most notable is the navigation of visually impaired individuals through a micro-navigation obstacle course, which is an order of magnitude more accurate than the previous state of the art, which used a tactile belt as the feedback modality. The HapticHead tactile user interface's ability to safely navigate people with visual impairments around obstacles and on stairs, with a mean deviation from the optimal path of less than 6 cm, may ultimately improve the quality of life for many people with visual impairments.
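    The dissertation does not reduce HapticHead to a few lines of code, but a hypothetical sketch can convey the core mapping: pick the actuator on a head-worn array whose direction best matches a 3D target and scale its vibration intensity with distance. The actuator layout and the linear intensity ramp below are illustrative assumptions only, not the HapticHead implementation.

```python
# Illustrative sketch: pick the actuator on a head-worn grid whose direction
# best matches the target direction and scale its vibration intensity with
# target distance.
import math

# Hypothetical actuator layout: unit vectors around the head (x right,
# y forward, z up). A real high-resolution array would be far denser.
ACTUATORS = {
    "front": (0.0, 1.0, 0.0),
    "right": (1.0, 0.0, 0.0),
    "back":  (0.0, -1.0, 0.0),
    "left":  (-1.0, 0.0, 0.0),
    "top":   (0.0, 0.0, 1.0),
}

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def select_actuator(target_direction, target_distance_m, max_distance_m=5.0):
    """Return (actuator_name, intensity in 0..1) for a 3D target."""
    direction = normalize(target_direction)
    # Cosine similarity picks the actuator pointing closest to the target.
    best = max(ACTUATORS, key=lambda name: sum(
        a * b for a, b in zip(ACTUATORS[name], direction)))
    # Nearer targets vibrate more strongly (linear ramp, an assumption).
    intensity = max(0.0, 1.0 - target_distance_m / max_distance_m)
    return best, intensity

# An obstacle ahead and slightly to the right, 1.5 m away:
print(select_actuator((0.4, 1.0, 0.0), 1.5))   # ('front', 0.7)
```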
    • 
