
    mBrailler: Multimodal Braille Keyboard for Android

    Touchscreen devices have paved their way into the mobile scene, presenting a wide set of possibilities but a comparable number of new challenges, particularly for people who are blind. While these devices offer few tactile cues, such as buttons, they provide the opportunity to create novel interaction techniques. In this paper, we present mBrailler, a mobile Braille keyboard that combines the benefits of physical keyboards (speed and accuracy) and gestural interfaces (flexibility and personalization). We built an 8-button Braille keyboard that can be attached to the back of mainstream smartphones, allowing fast and familiar chorded input. The touchscreen, in turn, enables thumb-entered gestures for more complex text-editing operations, such as caret movement, text selection, copy, and paste. This project combines the tactile benefits of Braille typewriters with the customization of smartphone applications. We aim to provide a more efficient and effective typing experience for blind users, thus increasing their productivity with current mobile devices.

    Survey of Eye-Free Text Entry Techniques of Touch Screen Mobile Devices Designed for Visually Impaired Users

    Nowadays, touchscreen mobile devices are becoming more popular among sighted as well as visually impaired people due to their simple interfaces and efficient interaction techniques. Most touchscreen devices designed for visually impaired users are based on screen readers, haptics, and different user interfaces (UI). In this paper we present a critical review of different keypad layouts designed for visually impaired users and their effect on text entry speed, and we list key issues in extending the accessibility and text entry rate of touchscreen devices. Keywords: text entry rate, touch screen mobile devices, visually impaired users

    Making Spatial Information Accessible on Touchscreens for Users who are Blind and Visually Impaired

    Touchscreens have become a de facto standard of input for mobile devices, as they make optimal use of the limited input and output space imposed by their form factor. In recent years, people who are blind and visually impaired have been increasing their usage of smartphones and touchscreens. Although basic access is available, there are still many accessibility issues to address in order to bring full inclusion to this population. One of the important challenges lies in accessing and creating spatial information on touchscreens. The work presented here provides three new techniques, using three different modalities, for accessing spatial information on touchscreens. The first system makes geometry and diagram creation accessible on a touchscreen through the use of text-to-speech and gestural input. This first study is informed by a qualitative study of how people who are blind and visually impaired currently access and create graphs and diagrams. The second system makes directions through maps accessible using multiple vibration sensors without any sound or visual output. The third system investigates the use of binaural sound on a touchscreen to make various types of applications accessible, such as physics simulations, astronomy, and video games.
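    One common way to realize sound-based spatial guidance like the third system describes is to pan an audio cue between the ears according to where the finger is relative to a target. A minimal sketch of that idea, assuming simple linear stereo panning (the thesis's actual binaural rendering is not specified here):

```python
# Hypothetical sketch: pan a cue between the left and right channels based
# on the horizontal offset from the finger to the target, so the sound
# "points" toward the target.

def stereo_gains(finger_x, target_x, screen_width):
    """Return (left, right) channel gains in [0, 1].

    offset < 0 means the target is left of the finger, so the left
    channel gets the larger gain; offset > 0 boosts the right channel.
    """
    offset = (target_x - finger_x) / screen_width  # roughly -1..1
    pan = max(-1.0, min(1.0, offset))              # clamp to [-1, 1]
    # equal-sum linear pan: the two gains always total 1
    return (1 - pan) / 2, (1 + pan) / 2

# Finger at x=800, target at x=200 on a 1000-px screen: target is to the
# left, so the left channel dominates.
print(stereo_gains(800, 200, 1000))  # → (0.8, 0.2)
```

    True binaural rendering would also apply interaural time differences and head-related transfer functions, but level panning alone already conveys usable left/right direction.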

    Quick-Glance and In-Depth exploration of a tabletop map for visually impaired people

    Interactive tactile maps provide visually impaired people with accessible geographic information. However, when these maps are presented on large tabletops, tactile exploration without sight is long and tedious due to the size of the surface. In this paper we present a novel approach to speed up the process of exploring tabletop maps in the absence of vision. Our approach mimics the visual processing of a map and consists of two steps. First, the Quick-Glance step allows users to create a global mental representation of the map by using mid-air gestures. Second, the In-Depth step allows users to reach Points of Interest with appropriate hand guidance on the map. In this paper we present the design and development of a prototype combining a smartwatch and a tactile surface for Quick-Glance and In-Depth interactive exploration of a map.
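    The In-Depth step hinges on steering the hand toward a Point of Interest. An illustrative sketch of one such guidance policy, under the assumption (ours, not the authors') that the smartwatch renders one of four directional vibration cues until the hand is within reach of the POI:

```python
# Hypothetical guidance loop: compare the tracked hand position against the
# POI in table coordinates (x grows rightward, y grows away from the user)
# and emit the dominant compass direction as a cue, or "reached" when close.

import math

def guidance_cue(hand, poi, reach_radius=20.0):
    """Return 'reached' inside reach_radius, else the dominant direction
    ('left', 'right', 'up', or 'down') from the hand to the POI."""
    dx, dy = poi[0] - hand[0], poi[1] - hand[1]
    if math.hypot(dx, dy) <= reach_radius:
        return "reached"
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "up" if dy > 0 else "down"

print(guidance_cue((100, 100), (400, 150)))  # → right
print(guidance_cue((395, 148), (400, 150)))  # → reached
```

    Mapping each cue to a distinct vibration pattern on the watch keeps the guidance entirely non-visual, in keeping with the paper's goal.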

    Evaluation of the Accessibility of Touchscreens for Individuals who are Blind or have Low Vision: Where to go from here

    Touchscreen devices are well integrated into daily life and can be found in both personal and public spaces, but the inclusion of accessible features and interfaces continues to lag behind technology’s exponential advancement. This thesis aims to explore the experiences of individuals who are blind or have low vision (BLV) while interacting with non-tactile touchscreens, such as smartphones, tablets, smartwatches, coffee machines, smart home devices, kiosks, ATMs, and more. The goal of this research is to create a set of recommended guidelines that can be used in designing and developing either personal devices or shared public technologies with accessible touchscreens. This study consists of three phases: first, an exploration of existing research related to the accessibility of non-tactile touchscreens; then semi-structured interviews with 20 BLV individuals to address accessibility gaps in previous work; and finally a survey to better understand the experiences, thoughts, and barriers of BLV individuals while interacting with touchscreen devices. Common themes found include loss of independence, lack or uncertainty of accessibility features, and the need and desire for improvements. Common approaches for interaction were the use of high markings, asking for sighted assistance, and avoiding touchscreen devices. These findings were used to create a set of recommended guidelines, which include a universal feature setup, the setup of accessibility settings, a universal headphone jack position, tactile feedback, an ask-for-help button, situational lighting, and the consideration of time.

    Tactile Data Entry for Extravehicular Activity

    In the task-saturated environment of extravehicular activity (EVA), an astronaut's ability to leverage suit-integrated information systems is limited by a lack of options for data entry. In particular, bulky gloves inhibit the ability to interact with standard computing interfaces such as a mouse or keyboard. This paper presents the results of a preliminary investigation into a system that permits the space suit gloves themselves to be used as data entry devices. Hand motion tracking is combined with simple finger gesture recognition to enable use of a virtual keyboard, while tactile feedback provides touch-based context to the graphical user interface (GUI) and positive confirmation of keystroke events. In human subject trials, conducted with twenty participants using a prototype system, participants entered text significantly faster with tactile feedback than without (p = 0.02). The results support incorporation of vibrotactile information in a future system that will enable full touch typing and general mouse interactions using instrumented EVA gloves.
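    The pairing of gesture recognition with positive tactile confirmation can be sketched as follows. This is an illustrative sketch under our own assumptions, not the paper's implementation: the gesture recognizer is reduced to candidate keystrokes with confidence scores, and the glove's vibrotactile motor is a stand-in class.

```python
# Hypothetical sketch: accept each recognized finger-tap gesture whose
# confidence clears a threshold, and fire a short vibrotactile pulse so the
# wearer gets positive confirmation of the keystroke.

from dataclasses import dataclass

@dataclass
class Keystroke:
    key: str
    confidence: float  # gesture-recognizer score in [0, 1]

class HapticActuator:
    """Placeholder for a glove-mounted vibrotactile motor."""
    def __init__(self):
        self.pulses = []
    def pulse(self, duration_ms):
        self.pulses.append(duration_ms)

def confirm_keystrokes(strokes, actuator, threshold=0.8):
    """Accept keystrokes above the confidence threshold, pulsing the
    actuator once per accepted key; reject the rest silently."""
    accepted = []
    for s in strokes:
        if s.confidence >= threshold:
            accepted.append(s.key)
            actuator.pulse(40)  # brief tick on each confirmed key
    return accepted

act = HapticActuator()
text = confirm_keystrokes(
    [Keystroke("E", 0.95), Keystroke("V", 0.91), Keystroke("A", 0.55)], act)
print("".join(text), len(act.pulses))  # → EV 2
```

    The silent rejection path is what makes the tactile channel informative: the absence of a pulse tells the wearer a gesture did not register, without requiring them to look at the GUI.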