
    Tactons: structured tactile messages for non-visual information display

    Tactile displays are now becoming available in a form that can be easily used in a user interface. This paper describes a new form of tactile output. Tactons, or tactile icons, are structured, abstract messages that can be used to communicate information non-visually. A range of parameters can be used in Tacton construction, including the frequency, amplitude, and duration of a tactile pulse, as well as rhythm and body location. Tactons have the potential to improve interaction in a range of areas, particularly where the visual display is overloaded, limited in size, or unavailable, such as interfaces for blind people or on mobile and wearable devices. This paper describes Tactons, the parameters used to construct them, and some possible ways to design them. Examples of where Tactons might prove useful in user interfaces are given.
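
    As a rough sketch of how these parameters might combine in software, the following Python fragment models a Tacton as an ordered pulse sequence; all class and field names here are hypothetical illustrations, not taken from the paper.

        # Hypothetical sketch of a Tacton as a parameterised data structure;
        # the classes and fields are illustrative, not from the paper.
        from dataclasses import dataclass
        from typing import List

        @dataclass
        class TactilePulse:
            frequency_hz: float   # vibration frequency of the pulse
            amplitude: float      # normalised intensity, 0.0-1.0
            duration_ms: int      # how long the pulse lasts

        @dataclass
        class Tacton:
            pulses: List[TactilePulse]  # the rhythm: an ordered pulse sequence
            body_location: str          # where on the body the actuator sits

        # A two-pulse "message received" Tacton on the wrist (illustrative values).
        message_received = Tacton(
            pulses=[TactilePulse(250.0, 0.8, 100), TactilePulse(250.0, 0.8, 100)],
            body_location="wrist",
        )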

    A survey on hardware and software solutions for multimodal wearable assistive devices targeting the visually impaired

    The market penetration of user-centric assistive devices has rapidly increased in the past decades. Growth in computational power, accessibility, and cognitive device capabilities has been accompanied by significant reductions in weight, size, and price, as a result of which mobile and wearable equipment is becoming part of our everyday life. In this context, a key focus of development has been rehabilitation engineering and assistive technologies targeting people with various disabilities, including hearing loss and visual impairments. Applications range from simple health monitoring, such as sports activity trackers, through medical applications including sensory (e.g. hearing) aids and real-time monitoring of vital functions, to task-oriented tools such as navigational devices for the blind. This paper provides an overview of recent trends in software- and hardware-based signal processing relevant to the development of wearable assistive solutions.

    Testing Two Tools for Multimodal Navigation

    The latest smartphones, with GPS, electronic compasses, directional audio, touch screens, and so forth, hold the potential for location-based services that are easier to use and that let users focus on their activities and the environment around them. Rather than interpreting maps, users can search for information by pointing in a direction, and database queries can be created from GPS location and compass data. Users can also get guidance to locations through point and sweep gestures, spatial sound, and simple graphics. This paper describes two studies testing two applications with multimodal user interfaces for navigation and information retrieval. The applications allow users to search for information and get navigation support using combinations of point and sweep gestures, non-speech audio, graphics, and text. Tests show that users appreciated both applications for their ease of use and for allowing them to interact directly with the surrounding environment.
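
    As an illustration of the kind of query such a pointing interface implies, the sketch below filters points of interest by the bearing a user is pointing along; the function names, the place list, and the 15-degree sector width are assumptions for illustration, not details from the paper.

        # Illustrative sketch of turning a GPS fix plus a compass bearing into
        # a "what am I pointing at?" query; names and thresholds are assumed.
        import math

        def bearing_to(lat1, lon1, lat2, lon2):
            """Initial great-circle bearing from point 1 to point 2, in degrees."""
            phi1, phi2 = math.radians(lat1), math.radians(lat2)
            dlon = math.radians(lon2 - lon1)
            y = math.sin(dlon) * math.cos(phi2)
            x = (math.cos(phi1) * math.sin(phi2)
                 - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
            return math.degrees(math.atan2(y, x)) % 360

        def places_in_pointed_sector(lat, lon, heading_deg, places, half_angle_deg=15):
            """Return names of places whose bearing lies inside the pointed sector."""
            hits = []
            for name, plat, plon in places:
                delta = abs(bearing_to(lat, lon, plat, plon) - heading_deg)
                if min(delta, 360 - delta) <= half_angle_deg:  # handle wraparound
                    hits.append(name)
            return hits

        # Example: standing in central Stockholm, pointing roughly south.
        landmarks = [("library", 59.3434, 18.0551), ("museum", 59.3275, 18.0710)]
        print(places_in_pointed_sector(59.3326, 18.0649, 170.0, landmarks))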

    Synchronizing Audio and Haptic to Read Webpage

    Constantly emerging technologies present new interactive ways to convey information on the Web. New and enhanced website designs have gradually improved sighted users' understanding of Web content, but on the other hand they create more obstacles for the visually impaired. The significant technological gap between assistive technology and the Web presents ongoing challenges to maintaining web accessibility, especially for disabled users. The limitations of current assistive technology in conveying non-textual information, including text attributes such as bold, underline, and italic, further restrict the visually impaired from acquiring a comprehensive understanding of Web content. This project addresses these issues by investigating the problems faced by the visually impaired when using current assistive technology. The significance of text attributes in supporting accessibility and improving understanding of Web content is also studied. For this purpose, several qualitative and quantitative data collection methods are adopted to test the hypotheses. The project also examines the relationship between multimodal technology using audio and haptic modalities and the mental model generated by the visually impaired while accessing a webpage. The findings are then used as a framework to develop a system that synchronizes audio and haptic output to read webpages and represent text attributes to visually impaired users. Pilot testing and user testing are conducted on the prototype built to evaluate the system. Results and recommendations for future enhancement are shared at the end of the project.
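
    As a minimal sketch of the attribute-to-cue idea, the fragment below pairs a spoken text segment with a haptic cue derived from its text attributes; the speak and vibrate interfaces are hypothetical stand-ins, and the cue values are illustrative rather than the project's actual design.

        # Hypothetical mapping from text attributes to synchronized audio and
        # haptic cues; values are illustrative, not the project's design.
        ATTRIBUTE_CUES = {
            "bold":      {"audio_pitch_shift": +4, "vibration_amplitude": 0.9},
            "italic":    {"audio_pitch_shift": +2, "vibration_amplitude": 0.5},
            "underline": {"audio_pitch_shift":  0, "vibration_amplitude": 0.7},
        }

        def render_segment(text, attributes, speak, vibrate):
            """Speak a text segment while driving a haptic cue for its attributes.

            `speak` and `vibrate` stand in for a text-to-speech engine and a
            haptic actuator; both are assumed interfaces, not real APIs.
            """
            cue = {"audio_pitch_shift": 0, "vibration_amplitude": 0.0}
            for attr in attributes:
                if attr in ATTRIBUTE_CUES:
                    cue = ATTRIBUTE_CUES[attr]   # last matching attribute wins
            vibrate(cue["vibration_amplitude"])  # haptic cue starts with speech
            speak(text, pitch_shift=cue["audio_pitch_shift"])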

    A survey of assistive technologies and applications for blind users on mobile platforms: a review and foundation for research

    This paper summarizes recent developments in audio- and tactile-feedback-based assistive technologies targeting the blind community. Current technology allows applications to be efficiently distributed and run on mobile and handheld devices, even in cases where computational requirements are significant. As a result, electronic travel aids, navigational assistance modules, text-to-speech applications, as well as virtual audio displays which combine audio with haptic channels, are becoming integrated into standard mobile devices. This trend, combined with the appearance of increasingly user-friendly interfaces and modes of interaction, has opened a variety of new perspectives for the rehabilitation and training of users with visual impairments. The goal of this paper is to provide an overview of these developments based on recent advances in basic research and application development. Using this overview as a foundation, an agenda is outlined for future research in mobile interaction design with respect to users with special needs, as well as ultimately in relation to sensor-bridging applications in general.

    The Graphical Access Challenge for People with Visual Impairments: Positions and Pathways Forward

    Graphical access is one of the most pressing challenges for individuals who are blind or visually impaired. This chapter discusses some of the factors underlying the graphical access challenge, reviews prior approaches to addressing this long-standing information access barrier, and describes some promising new solutions. We specifically focus on touchscreen-based smart devices, a relatively new class of information access technologies, which our group believes represent an exemplary model of user-centered, needs-based design. We highlight both the challenges and the vast potential of these technologies for alleviating the graphics accessibility gap, and we share the latest results in this line of research. We close with recommendations on ideological shifts in mindset about how we approach solving this vexing access problem, which will complement both the technological and perceptual advancements that are rapidly being uncovered by a growing research community in this domain.

    A Support System for Graphics for Visually Impaired People

    As the Internet plays an important role in today's society, graphics are widely used to present, convey, and communicate information in many different areas. Complex information is often easier to understand and analyze through graphics. Even though graphics play such an important role, accessibility support for web graphics is very limited. Web graphics accessibility matters not only for people with disabilities, but also for people who want to obtain and use information in ways different from those originally intended. One problem regarding graphics for blind people is that we have little data on how a blind person draws or how he or she receives graphical information. Based on research with Katz's pupils, one can conclude that blind people can draw in outline and that they have a good sense of three-dimensional shape and space. In this thesis, I propose and develop a system that can serve as a tool for researchers investigating these and related issues. Our support system is built to collect drawings made by visually impaired people through finger movement on Braille devices or touch devices such as tablets. Once the drawing data are collected, the system automatically generates graphical XML data, which can easily be accessed by applications and web services. The graphical XML data are stored locally or remotely. Compared to other support systems, ours is the first automatic system to provide web services to collect and access such data. The system can also be integrated with cloud computing so that people can use it anywhere to collect and access the data.
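
    A minimal sketch of the stroke-to-XML step, assuming drawings arrive as lists of (x, y) finger positions; the XML schema shown is invented for illustration and is not the thesis's actual format.

        # Minimal sketch: serialise captured finger strokes to a graphical XML
        # string; the <drawing>/<stroke>/<point> schema is assumed.
        import xml.etree.ElementTree as ET

        def strokes_to_xml(strokes):
            """Serialise strokes (lists of (x, y) points) to an XML string."""
            drawing = ET.Element("drawing")
            for stroke_points in strokes:
                stroke = ET.SubElement(drawing, "stroke")
                for x, y in stroke_points:
                    ET.SubElement(stroke, "point", x=str(x), y=str(y))
            return ET.tostring(drawing, encoding="unicode")

        # A single diagonal stroke captured from a touch device (sample data).
        print(strokes_to_xml([[(0, 0), (10, 10), (20, 20)]]))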