3,046 research outputs found

    Feeling what you hear: tactile feedback for navigation of audio graphs

    Access to digitally stored numerical data is currently very limited for sight impaired people. Graphs and visualizations are often used to analyze relationships between numerical data, but the current methods of accessing them are highly visually mediated. Representing data using audio feedback is a common method of making data more accessible, but methods of navigating and accessing the data are often serial in nature and laborious. Tactile or haptic displays could be used to provide additional feedback to support a point-and-click type interaction for the visually impaired. A requirements capture conducted with sight impaired computer users produced a review of current accessibility technologies, and guidelines were extracted for using tactile feedback to aid navigation. The results of a qualitative evaluation with a prototype interface are also presented. Providing an absolute position input device and tactile feedback allowed the users to explore the graph using tactile and proprioceptive cues in a manner analogous to point-and-click techniques.
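The interaction style described above can be sketched as a small function: an absolute pointer position is mapped to an audio pitch for the data value, plus a tactile pulse when the pointer lies on the graphed line. This is a minimal illustrative sketch, not the paper's implementation; all names and the pitch mapping are our assumptions.

```python
# Hypothetical sketch of point-and-click audio-graph exploration:
# the pointer's absolute (x, y) position yields a pitch for the data
# value and a tactile "on the line" cue. All identifiers are ours.

def explore_point(x, data, y_pointer, tolerance=0.05):
    """Return (pitch_hz, on_line) for a pointer at column x.

    `data` maps column indices to normalised y values in [0, 1].
    Pitch rises with the data value; a tactile pulse fires when the
    pointer height is within `tolerance` of the line, giving the
    proprioceptive sense of tracing the curve.
    """
    y_data = data[x]
    pitch_hz = 220.0 + 660.0 * y_data          # map [0, 1] -> 220..880 Hz
    on_line = abs(y_pointer - y_data) <= tolerance
    return pitch_hz, on_line
```

The absolute-position input matters here: because `x` comes from where the hand physically is, the user can jump directly to any part of the graph rather than scrolling serially.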

    Exploring the Use of Wearables to develop Assistive Technology for Visually Impaired People

    This thesis explores the use of two prominent wearable devices to develop assistive technology for users who are visually impaired. Specifically, the work in this thesis aims at improving the quality of life of users who are visually impaired by improving their mobility and ability to socially interact with others. We explore the use of a smartwatch for creating low-cost spatial haptic applications. This app explores the use of haptic feedback provided using a smartwatch and smartphone to provide navigation instructions that let visually impaired people safely traverse a large open space. This spatial feedback guides them to walk on a straight path from source to destination by avoiding veering. Exploring the paired interaction between a smartphone and a smartwatch helped to overcome the limitation that smart devices have only a single haptic actuator. We explore the use of a head-mounted display to enhance social interaction by helping people with visual impairments align their head towards a conversation partner as well as maintain personal space during a conversation. Audio feedback is provided to the users guiding them to achieve effective face-to-face communication. A qualitative study of this method shows the effectiveness of the application and explains how it helps visually impaired people to perceive non-verbal cues and feel more engaged and assertive in social interactions.
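The veering-avoidance idea can be sketched as simple geometry: compute the walker's lateral deviation from the straight source-to-destination line and choose which of the two paired devices should vibrate. This is an illustrative sketch under our own assumptions (watch on the left wrist, phone in the right hand); the function and device assignment are not from the thesis.

```python
# Hypothetical sketch of veering correction with a paired
# smartwatch + smartphone: drift left pulses the left-wrist watch,
# drift right pulses the phone in the right hand. Names are ours.
import math

def correction_cue(src, dst, pos, dead_zone=0.5):
    """Return 'left', 'right', or None for a walker at `pos` (metres).

    Positive signed distance from the src->dst line means the walker
    has drifted left of the path; within `dead_zone` metres of the
    line no cue is given, so small natural sway is not penalised.
    """
    (x1, y1), (x2, y2), (px, py) = src, dst, pos
    # signed perpendicular distance from the line through src -> dst
    d = ((x2 - x1) * (py - y1) - (y2 - y1) * (px - x1)) / math.hypot(x2 - x1, y2 - y1)
    if abs(d) <= dead_zone:
        return None
    return 'left' if d > 0 else 'right'
```

Splitting the two cues across two devices is what works around the single-actuator limitation noted above: each actuator only ever means one direction.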

    Expressive haptics for enhanced usability of mobile interfaces in situations of impairments

    Designing for situational awareness could lead to better solutions for disabled people; likewise, exploring the needs of disabled people could lead to innovations that can address situational impairments. This in turn can create non-stigmatising assistive technology for disabled people from which eventually everyone could benefit. In this paper, we investigate the potential for advanced haptics to complement the graphical user interface of mobile devices, thereby enhancing the user experience of all people in some situations (e.g. sunlight interfering with interaction) as well as of visually impaired people. We explore technical solutions to this problem space and demonstrate our justification for a focus on the creation of kinaesthetic force feedback. We propose initial design concepts and studies, with a view to co-create delightful and expressive haptic interactions with potential users, motivated by scenarios of situational and permanent impairments. Comment: Presented at the CHI'19 Workshop: Addressing the Challenges of Situationally-Induced Impairments and Disabilities in Mobile Interaction, 2019 (arXiv:1904.05382)

    Exploring haptic interfacing with a mobile robot without visual feedback

    Search and rescue scenarios are often complicated by low or no visibility conditions. The lack of visual feedback hampers orientation and causes significant stress for human rescue workers. The Guardians project [1] pioneered a group of autonomous mobile robots assisting a human rescue worker operating within close range. Trials were held with fire fighters of South Yorkshire Fire and Rescue. It became clear that the subjects were by no means prepared to give up their procedural routines and the feeling of security they provide: they simply ignored instructions that contradicted those routines.

    Web-based multimodal graphs for visually impaired people

    This paper describes the development and evaluation of Web-based multimodal graphs designed for visually impaired and blind people. The information in the graphs is conveyed to visually impaired people through haptic and audio channels. The motivation of this work is to address problems faced by visually impaired people in accessing graphical information on the Internet, particularly the common types of graphs for data visualization. In our work, line graphs, bar charts and pie charts are accessible through a force feedback device, the Logitech WingMan Force Feedback Mouse. Pre-recorded sound files are used to represent graph contents to users. In order to test the usability of the developed Web graphs, an evaluation was conducted with bar charts as the experimental platform. The results showed that the participants could successfully use the haptic and audio features to extract information from the Web graphs.
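The bar-chart rendering described above can be sketched as a layout step: each bar becomes an equal-width haptic region for the force feedback mouse, keyed to a pre-recorded sound clip for its value. This is a minimal sketch under our own assumptions; the region structure and clip-naming scheme are illustrative, not the authors' code.

```python
# Hypothetical sketch of a Web bar chart rendered for haptics + audio:
# each bar is an equal-width region (the force feedback mouse feels its
# edges) paired with a pre-recorded clip for the value. Names are ours.

def render_bar_chart(values, chart_width=600):
    """Lay out bars as haptic regions, each with an audio clip."""
    bar_w = chart_width // len(values)
    regions = []
    for i, v in enumerate(values):
        regions.append({
            'x_range': (i * bar_w, (i + 1) * bar_w),  # felt bar boundaries
            'clip': f'value_{round(v)}.wav',          # pre-recorded sound file
        })
    return regions
```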

    Comparing two haptic interfaces for multimodal graph rendering

    This paper describes the evaluation of two multimodal interfaces designed to provide visually impaired people with access to various types of graphs. The interfaces consist of audio and haptic feedback rendered on commercially available force feedback devices. This study compares the usability of two force feedback devices, the SensAble PHANToM and the Logitech WingMan force feedback mouse, in representing graphical data. The type of graph used in the experiment is the bar chart, under two experimental conditions: single mode and multimodal. The results show that the PHANToM provides better performance in the haptic-only condition. However, no significant difference was found between the two devices in the multimodal condition. This confirms the advantages of using a multimodal approach in our research and shows that low-cost haptic devices can be successful. This paper introduces our evaluation approach and discusses the findings of the experiment.

    Tac-tiles: multimodal pie charts for visually impaired users

    Tac-tiles is an accessible interface that allows visually impaired users to browse graphical information using tactile and audio feedback. The system uses a graphics tablet which is augmented with a tangible overlay tile to guide user exploration. Dynamic feedback is provided by a tactile pin-array at the fingertips, and through speech/non-speech audio cues. In designing the system, we seek to preserve the affordances and metaphors of traditional, low-tech teaching media for the blind, and combine this with the benefits of a digital representation. Traditional tangible media allow rapid, non-sequential access to data, promote easy and unambiguous access to resources such as axes and gridlines, allow the use of external memory, and preserve visual conventions, thus promoting collaboration with sighted colleagues. A prototype system was evaluated with visually impaired users, and recommendations for multimodal design were derived.
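The segment lookup behind a pie-chart interface of this kind can be sketched as a coordinate conversion: the fingertip's tablet position becomes an angle around the chart centre, which selects the wedge whose pin pattern and speech cue should be driven. This is an illustrative sketch, not the Tac-tiles implementation; all identifiers are ours, and we assume tablet screen coordinates (y increasing downward).

```python
# Hypothetical sketch of pie-segment lookup for a tablet + pin-array
# interface: convert the fingertip position to a clockwise angle from
# 12 o'clock and find the wedge it falls in. Identifiers are ours.
import math

def segment_at(x, y, cx, cy, shares):
    """Return the index of the pie segment under fingertip (x, y).

    `shares` are segment proportions summing to 1; wedges run
    clockwise from the top, matching the usual pie-chart convention.
    Screen coordinates are assumed, so smaller y is "up".
    """
    angle = math.degrees(math.atan2(x - cx, cy - y)) % 360
    edge = 0.0
    for i, share in enumerate(shares):
        edge += share * 360
        if angle < edge:
            return i
    return len(shares) - 1  # guard against floating-point shortfall
```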

    Two-handed navigation in a haptic virtual environment

    This paper describes the initial results from a study looking at a two-handed interaction paradigm for tactile navigation for blind and visually impaired users. Participants were set the task of navigating a virtual maze environment using their dominant hand to move the cursor, while receiving contextual information in the form of tactile cues presented to their non-dominant hand. Results suggest that most participants were comfortable with the two-handed style of interaction even with little training. Two sets of contextual cues were examined, with information presented through static patterns or tactile flow of raised pins. The initial results of this study suggest that while both sets of cues were usable, participants performed significantly better and faster with the static cues.
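The static-pattern cue set described above can be sketched as a mapping from the walls around the cursor's maze cell to a small raised-pin bitmap presented to the non-dominant hand. The pattern shapes here are our own illustrative choice, not the study's actual cue designs.

```python
# Hypothetical sketch of static tactile cues for maze navigation:
# each wall around the cursor cell raises the matching edge of a
# 3x3 pin array. Pattern shapes are ours, chosen for illustration.

def wall_pattern(walls):
    """Return a 3x3 pin bitmap (1 = pin raised) for a set of walls.

    `walls` is a subset of {'N', 'S', 'E', 'W'}: each present wall
    raises the corresponding edge row or column of the array, so the
    pattern under the fingertip mirrors the cell's local geometry.
    """
    grid = [[0] * 3 for _ in range(3)]
    if 'N' in walls:
        grid[0] = [1, 1, 1]
    if 'S' in walls:
        grid[2] = [1, 1, 1]
    for r in range(3):
        if 'W' in walls:
            grid[r][0] = 1
        if 'E' in walls:
            grid[r][2] = 1
    return grid
```

A static bitmap like this can be read at the user's own pace, which is one plausible reason the study found static cues faster than flowing ones.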