    Head-Tracking Haptic Computer Interface for the Blind

    In today’s heavily technology-dependent society, blind and visually impaired people are becoming increasingly disadvantaged in terms of access to media, information, electronic commerce, communications and social networks. Not only are computers becoming more widely used in general, but their dependence on visual output is increasing, extending the technology further out of reach for those without sight. For example, blindness was less of an obstacle for programmers when command-line interfaces were more commonplace, but with the introduction of Graphical User Interfaces (GUIs) for both development and final applications, many blind programmers were made redundant (Alexander, 1998; Siegfried et al., 2004). Not only are images, video and animation heavily entrenched in today’s interfaces, but the visual layout of the interfaces themselves holds important information which is inaccessible to sightless users with existing accessibility technology.

    Improving the Graphical User Interface (GUI) for the Dynamic Feedback Signal Set (DyFSS): Increasing Accessibility for the Neurodiverse

    Peripheral biofeedback is an explicit learning tool that allows for real-time evaluation and control of physiological proxies by means of computerized signals. Its integration into health practice allows users to calibrate self-awareness and self-regulation, then apply these skills to everyday life. People with neurodevelopmental differences encounter limitations when using commercially available clinical biofeedback due to variation in their autonomic response. Principles of Universal Design dictate that biofeedback inputs and displays allow effective access and benefit for as many individuals as possible. Our Dynamic Feedback Signal Set (DyFSS, nonprovisional patent in process) algorithm adjusts signal processing by dynamically weighting feedback signals to the best abilities of the user, increasing the efficacy of biofeedback for the neurodiverse. The software includes an interactive graphical tutorial and quiz, a variety of graphical user interfaces to honor individual preferences and abilities, and a game that can be played by blind and hard-of-hearing individuals alike.
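    The abstract describes the DyFSS algorithm only at a high level (dynamically weighting feedback signals toward the channels a user can actually modulate). The sketch below is one plausible reading of that idea, not the patented algorithm; the channel names, the responsiveness heuristic, and the moving-average update are all assumptions for illustration.

```python
# Minimal sketch of dynamically weighted biofeedback combination.
# Channel names and the adaptation rule are illustrative assumptions,
# not the DyFSS algorithm itself (which is patent-in-process).
from dataclasses import dataclass, field

@dataclass
class WeightedFeedback:
    # Start with equal weights for each physiological channel.
    weights: dict = field(default_factory=lambda: {
        "heart_rate_variability": 1.0,
        "skin_conductance": 1.0,
        "respiration_rate": 1.0,
    })

    def update_weights(self, responsiveness: dict) -> None:
        """Shift weight toward channels the user can actually modulate.

        `responsiveness` maps each channel to a 0..1 score of how much
        the user's recent effort changed that signal.
        """
        for channel, score in responsiveness.items():
            if channel in self.weights:
                # Exponential moving average keeps adaptation gradual.
                self.weights[channel] = 0.8 * self.weights[channel] + 0.2 * score

    def feedback_value(self, normalized_signals: dict) -> float:
        """Combine normalized (0..1) signals into a single feedback value."""
        total = sum(self.weights.get(c, 0.0) for c in normalized_signals)
        if total == 0:
            return 0.0
        return sum(self.weights[c] * v for c, v in normalized_signals.items()
                   if c in self.weights) / total
```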

    Tactons: structured tactile messages for non-visual information display

    Tactile displays are now becoming available in a form that can be easily used in a user interface. This paper describes a new form of tactile output. Tactons, or tactile icons, are structured, abstract messages that can be used to communicate information non-visually. A range of different parameters can be used for Tacton construction, including the frequency, amplitude and duration of a tactile pulse, plus other parameters such as rhythm and location. Tactons have the potential to improve interaction in a range of different areas, particularly where the visual display is overloaded, limited in size or not available, such as interfaces for blind people or in mobile and wearable devices. This paper describes Tactons, the parameters used to construct them and some possible ways to design them. Examples of where Tactons might prove useful in user interfaces are given.
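    Since the abstract enumerates the parameters a Tacton is built from (frequency, amplitude, duration, rhythm, location), a small data-structure sketch can make the idea concrete. The concrete representation below is an assumption, not the paper's encoding.

```python
# Hypothetical encoding of a Tacton as a structured tactile message.
# Parameter names follow the abstract; the representation is illustrative.
from dataclasses import dataclass
from typing import List

@dataclass
class TactilePulse:
    frequency_hz: float   # vibration frequency of the pulse
    amplitude: float      # relative intensity, 0..1
    duration_ms: int      # length of the pulse

@dataclass
class Tacton:
    pulses: List[TactilePulse]   # rhythm emerges from the pulse sequence
    inter_pulse_gap_ms: int      # silence between pulses (also part of the rhythm)
    body_location: str           # e.g. "left_wrist", "right_wrist"

# Example: a two-pulse "new message" Tacton delivered to the left wrist.
new_message = Tacton(
    pulses=[TactilePulse(250, 0.8, 120), TactilePulse(250, 0.8, 120)],
    inter_pulse_gap_ms=80,
    body_location="left_wrist",
)
```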

    Rotate-and-Press: A Non-Visual Alternative to Point-and-Click

    Most computer applications manifest visually rich and dense graphical user interfaces (GUIs) that are primarily tailored for easy and efficient sighted interaction using a combination of two default input modalities, namely the keyboard and the mouse/touchpad. However, blind screen-reader users rely predominantly on the keyboard alone, and therefore struggle to interact with these applications, since it is both arduous and tedious to perform visual 'point-and-click' tasks such as accessing the various application commands/features using just the keyboard shortcuts supported by screen readers. In this paper, we investigate the suitability of a 'rotate-and-press' input modality as an effective non-visual substitute for the visual mouse for easily interacting with computer applications, with a specific focus on word processing applications serving as the representative case study. In this regard, we designed and developed bTunes, an add-on for Microsoft Word that customizes an off-the-shelf Dial input device so that it serves as a surrogate mouse for blind screen-reader users to quickly access various application commands and features using a set of simple rotate and press gestures supported by the Dial. Therefore, with bTunes, blind users too can enjoy the benefits of two input modalities, just as their sighted counterparts do. A user study with 15 blind participants revealed that bTunes significantly reduced both the time and the number of user actions for doing representative tasks in a word processing application, by as much as 65.1% and 36.09% respectively. The participants also stated that they did not face any issues switching between keyboard and Dial, and furthermore gave bTunes a high usability rating (84.66 avg. SUS score).
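    The core interaction the abstract describes is a rotate-and-press menu: rotating the Dial moves focus through application commands with spoken feedback, and pressing activates the focused command. The sketch below illustrates that pattern in isolation; it is not the bTunes implementation, and the command names, event hooks, and screen-reader callback are assumptions (the real add-on integrates with Microsoft Word and the Surface Dial APIs, which are not shown).

```python
# Illustrative rotate-and-press command menu, loosely in the spirit of bTunes.
class RotateAndPressMenu:
    def __init__(self, commands, announce):
        self.commands = commands      # ordered list of (label, action) pairs
        self.announce = announce      # callback that speaks the label aloud
        self.index = 0

    def on_rotate(self, clockwise: bool) -> None:
        """Each rotation detent moves the focus one command forward or back."""
        step = 1 if clockwise else -1
        self.index = (self.index + step) % len(self.commands)
        label, _ = self.commands[self.index]
        self.announce(label)

    def on_press(self) -> None:
        """Pressing the Dial activates the currently focused command."""
        label, action = self.commands[self.index]
        action()
        self.announce(f"{label} activated")

# Usage with placeholder actions and print() standing in for speech output:
menu = RotateAndPressMenu(
    commands=[("Bold", lambda: None), ("Italic", lambda: None), ("Find", lambda: None)],
    announce=print,
)
menu.on_rotate(clockwise=True)   # announces "Italic"
menu.on_press()                  # runs the focused action and announces it
```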

    Web-based haptic applications for blind people to create virtual graphs

    Haptic technology has great potential in many applications. This paper introduces our work on delivering haptic information via the Web. A multimodal tool has been developed to allow blind people to create virtual graphs independently. Multimodal interactions in the process of graph creation and exploration are provided by using a low-cost haptic device, the Logitech WingMan Force Feedback Mouse, and Web audio. The Web-based tool also provides blind people with the convenience of receiving information at home. In this paper, we present the development of the tool and evaluation results. Discussions on the issues related to the design of similar Web-based haptic applications are also given.

    Music and Speech in Auditory Interfaces: When is One Mode More Appropriate Than the Other?

    A number of experiments carried out using non-speech auditory interfaces are reviewed, and the advantages and disadvantages of each are discussed. The possible advantages of using non-speech audio media such as music are also discussed: the richness of the representations possible, their aesthetic appeal, and the potential of such interfaces to handle abstraction and consistency across the interface.

    Impact of haptic 'touching' technology on cultural applications

    No abstract available

    Web-based multimodal graphs for visually impaired people

    This paper describes the development and evaluation of Web-based multimodal graphs designed for visually impaired and blind people. The information in the graphs is conveyed to visually impaired people through haptic and audio channels. The motivation of this work is to address the problems faced by visually impaired people in accessing graphical information on the Internet, particularly the common types of graphs used for data visualization. In our work, line graphs, bar charts and pie charts are made accessible through a force feedback device, the Logitech WingMan Force Feedback Mouse. Pre-recorded sound files are used to represent graph contents to users. In order to test the usability of the developed Web graphs, an evaluation was conducted with bar charts as the experimental platform. The results showed that the participants could successfully use the haptic and audio features to extract information from the Web graphs.
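    The abstract pairs haptic exploration (force feedback keeps the pointer on a bar) with pre-recorded audio that conveys the bar's content. The sketch below shows one way such a pairing could be wired up; the value-to-clip mapping, file names, and callback signatures are assumptions for illustration and are not taken from the paper.

```python
# Minimal sketch of a haptic + audio cue when the pointer enters a bar region.
# The clip selection thresholds and file paths are hypothetical.
def audio_clip_for_bar(value: float, max_value: float) -> str:
    """Pick a pre-recorded clip describing roughly how tall the bar is."""
    ratio = value / max_value if max_value else 0.0
    if ratio > 0.75:
        return "sounds/very_high.wav"
    elif ratio > 0.5:
        return "sounds/high.wav"
    elif ratio > 0.25:
        return "sounds/medium.wav"
    return "sounds/low.wav"

def on_bar_entered(bar_value: float, max_value: float, play, apply_force) -> None:
    """Called when the force-feedback mouse pointer enters a bar region."""
    apply_force("attract_to_bar")                     # haptic cue holds the pointer on the bar
    play(audio_clip_for_bar(bar_value, max_value))    # audio cue conveys the bar's magnitude
```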