Tac-tiles: multimodal pie charts for visually impaired users
Tac-tiles is an accessible interface that allows visually impaired users to browse graphical information using tactile and audio feedback. The system uses a graphics tablet augmented with a tangible overlay tile to guide user exploration. Dynamic feedback is provided by a tactile pin-array at the fingertips and through speech and non-speech audio cues. In designing the system, we seek to preserve the affordances and metaphors of traditional, low-tech teaching media for the blind, and to combine these with the benefits of a digital representation. Traditional tangible media allow rapid, non-sequential access to data, promote easy and unambiguous access to resources such as axes and gridlines, allow the use of external memory, and preserve visual conventions, thus promoting collaboration with sighted colleagues. A prototype system was evaluated with visually impaired users, and recommendations for multimodal design were derived.
UEFI BIOS Accessibility for the Visually Impaired
People with disabilities face a high level of difficulty in everyday tasks because, in many cases, accessibility was not considered when the task or process was designed. An example of this scenario is a computer's BIOS configuration screens, which do not take into account the specific needs of visually impaired people, such as screen readers. This paper proposes the idea that it is possible to make the pre-operating-system environment accessible to visually impaired people. We report our work in progress on creating a screen reader prototype that accesses audio cards compatible with the High Definition Audio specification on systems running UEFI-compliant firmware.
BrlAPI: Simple, Portable, Concurrent, Application-level Control of Braille Terminals
Screen readers can drive braille devices to allow visually impaired users to access computer environments, providing them with the same information as sighted users. In some cases, however, this view is not easy to use on a braille device. In such cases, it would be much more useful to let applications provide their own braille feedback, specially adapted to visually impaired users. Such applications would then need the ability to output braille; however, allowing both screen readers and applications to access a wide range of braille devices is not a trivial task. We present an abstraction layer that applications may use to communicate with braille devices. They do not need to deal with the specificities of each device, but can do so if necessary. We show how several applications can communicate with one braille device concurrently, with BrlAPI making sensible choices about which application eventually gets access to the device. A description of a widely used implementation of BrlAPI is included.
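The arbitration idea described above — many clients, one physical device, the layer deciding whose output is shown — can be sketched as follows. This is a hypothetical simplification for illustration only: the names and class are invented here, and BrlAPI's actual interface is a C library (with bindings) built around tty-focus tracking, not this API.

```python
# Hypothetical sketch of focus-based arbitration between clients sharing
# one braille device. Names are illustrative, not BrlAPI's real API.

class BrailleArbiter:
    """Routes output from many clients to a single braille device.

    The client bound to the currently focused "tty" wins, mirroring the
    focus-tracking idea described in the abstract above.
    """

    def __init__(self):
        self.clients = {}        # tty id -> latest text written by that client
        self.focused_tty = None  # which tty currently owns the display

    def write_text(self, tty, text):
        # Every client may write at any time; writes are merely recorded.
        self.clients[tty] = text

    def set_focus(self, tty):
        # Focus changes (e.g. window switches) decide who is displayed.
        self.focused_tty = tty

    def device_output(self):
        # Only the focused client's text reaches the physical device.
        return self.clients.get(self.focused_tty, "")


arbiter = BrailleArbiter()
arbiter.write_text(1, "screen reader view")
arbiter.write_text(2, "app-specific braille")
arbiter.set_focus(2)
print(arbiter.device_output())  # prints "app-specific braille"
```

The design choice mirrored here is that clients never block each other: all may write, and the abstraction layer alone decides, from focus, which output is physically rendered.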
Feeling what you hear: tactile feedback for navigation of audio graphs
Access to digitally stored numerical data is currently very limited for sight-impaired people. Graphs and visualizations are often used to analyze relationships between numerical data, but the current methods of accessing them are highly visually mediated. Representing data using audio feedback is a common method of making data more accessible, but methods of navigating and accessing the data are often serial in nature and laborious. Tactile or haptic displays could be used to provide additional feedback to support a point-and-click type of interaction for the visually impaired. A requirements capture conducted with sight-impaired computer users produced a review of current accessibility technologies, and guidelines were extracted for using tactile feedback to aid navigation. The results of a qualitative evaluation with a prototype interface are also presented. Providing an absolute-position input device and tactile feedback allowed users to explore the graph using tactile and proprioceptive cues in a manner analogous to point-and-click techniques.
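The audio side of such an interface typically maps the data value under the user's finger to pitch. A minimal sketch of that mapping is shown below; the frequency range and the linear scale are illustrative assumptions, not details taken from the paper.

```python
# Minimal linear value-to-pitch mapping for an audio graph.
# The 200-1000 Hz range is an illustrative assumption.

def value_to_pitch(y, y_min, y_max, f_min=200.0, f_max=1000.0):
    """Map a data value to a frequency in Hz, clamped to [f_min, f_max]."""
    if y_max == y_min:
        return f_min                    # degenerate range: constant pitch
    t = (y - y_min) / (y_max - y_min)   # normalise to [0, 1]
    t = max(0.0, min(1.0, t))           # clamp off-range values
    return f_min + t * (f_max - f_min)


data = [3, 7, 5, 9, 1]
pitches = [value_to_pitch(y, min(data), max(data)) for y in data]
print([round(p) for p in pitches])      # -> [400, 800, 600, 1000, 200]
```

With an absolute-position tablet, a lookup like this per finger position gives non-speech pitch feedback while tactile cues carry position, which is the division of labour the abstract describes.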
Non-visual overviews of complex data sets
This paper describes the design and preliminary testing of an interface for obtaining overview information from complex numerical data tables non-visually, something that cannot be done with currently available accessibility tools for blind and visually impaired users. A sonification technique that hides detail in the data and highlights its main features, without performing any computation on the data, is combined with a graphics tablet for focus+context interactive navigation in an interface called TableVis. Results from its evaluation suggest that this technique delivers better scores than speech in time to answer overview questions, correctness of the answers, and subjective workload.
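The "no computation on the data" idea can be sketched as follows: every cell is rendered directly as a pitch in a rapid sequence, and the listener's ear does the aggregation that would otherwise require computing statistics. The note range and the direct linear mapping here are illustrative assumptions, not TableVis's documented parameters.

```python
# Sketch of overview sonification in the spirit described above:
# each table row becomes a rapid sequence of pitches, one per cell,
# with no statistics computed on the data. The MIDI-style note
# range (48-84) is an illustrative assumption.

def row_to_notes(row, lo, hi, note_min=48, note_max=84):
    """Map each cell directly to a MIDI-style note number."""
    span = (hi - lo) or 1               # avoid division by zero
    return [round(note_min + (v - lo) / span * (note_max - note_min))
            for v in row]


table = [[2, 4, 8], [1, 9, 3]]
lo = min(v for r in table for v in r)
hi = max(v for r in table for v in r)
for row in table:
    print(row_to_notes(row, lo, hi))
```

Playing each row's notes in quick succession gives a gestalt of that row's shape; sweeping across rows with a tablet then yields an overview of the whole table without any value ever being summarised numerically.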
Visually-impaired people studying via ebook: investigating current use and potential for improvement
Everyday activities and tasks should be easy to perform for everyone, especially in an educational context, in order to foster inclusivity and ensure equal opportunities for all. In this paper, we investigate strategies and issues experienced by visually impaired people when studying via eBook. An online survey was designed to investigate preferences regarding the different formats and to understand what types of actions are possible and desirable when using eBooks in an educational context. We collected the views and experiences of 75 visually impaired people, which revealed the need to develop tools that provide both full accessibility and high usability when reading for study. Visually impaired people would like to rely on the same widely used strategies that sighted people use when studying a text. In addition, 92% of the visually impaired people participating in the online survey declared they were interested in a (new) reading app. The results could inform the design of new digital reading tools and functionalities that improve interaction.