
    Instructional eLearning technologies for the vision impaired

    The principal sensory modality employed in learning is vision, which not only makes it difficult for vision impaired students to access existing educational media but also bars them from much of the new, largely visiocentric learning material offered through on-line delivery mechanisms. Using the Cisco Certified Network Associate (CCNA) and IT Essentials courses as a reference, a study has been made of tools that can access such on-line systems and transcribe the materials into a form suitable for vision impaired learning. Modalities employed included haptic, tactile, audio and descriptive text. The study demonstrates how such a multi-modal approach can achieve equivalent success for the vision impaired. However, it also shows the limits of the current understanding of human perception, especially with respect to comprehending two and three dimensional objects and spaces when there is no recourse to vision.

    Tactons: structured tactile messages for non-visual information display

    Tactile displays are now becoming available in a form that can be easily used in a user interface. This paper describes a new form of tactile output. Tactons, or tactile icons, are structured, abstract messages that can be used to communicate information non-visually. A range of different parameters can be used for Tacton construction, including the frequency, amplitude and duration of a tactile pulse, plus other parameters such as rhythm and location. Tactons have the potential to improve interaction in a range of different areas, particularly where the visual display is overloaded, limited in size or not available, such as interfaces for blind people or in mobile and wearable devices. This paper describes Tactons, the parameters used to construct them and some possible ways to design them. Examples of where Tactons might prove useful in user interfaces are given.
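
    As a rough illustration of how the construction parameters above might be organised in software, the Python sketch below models a Tacton as a small data structure; the class names, field names and example cues are hypothetical and are not taken from the paper.

        from dataclasses import dataclass
        from typing import List

        @dataclass
        class TactilePulse:
            frequency_hz: float   # vibration frequency of the pulse
            amplitude: float      # normalised intensity, 0.0 to 1.0
            duration_ms: int      # length of the pulse

        @dataclass
        class Tacton:
            # A structured, abstract tactile message built from pulse parameters.
            name: str
            rhythm: List[TactilePulse]          # ordered pulses form the rhythm
            body_location: str = "right wrist"  # where the actuator is worn

        # Two hypothetical Tactons told apart by rhythm and frequency, e.g. to
        # signal "new message" versus "low battery" without using the display.
        new_message = Tacton("new message",
                             [TactilePulse(250, 0.8, 120), TactilePulse(250, 0.8, 120)])
        low_battery = Tacton("low battery",
                             [TactilePulse(80, 1.0, 400)],
                             body_location="left wrist")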

    Synchronizing Audio and Haptic to Read Webpage

    Constantly emerging technologies present new interactive ways to convey information on the Web. New and enhanced website designs have gradually improved sighted users' understanding of Web content, but they create more obstacles for the visually impaired. The significant technological gap between assistive technology and the Web presents on-going challenges to maintaining web accessibility, especially for disabled users. The limitations of current assistive technology in conveying non-textual information, including text attributes such as bold, underline, and italic, further restrict the visually impaired from acquiring a comprehensive understanding of Web content. This project addresses these issues by investigating the problems faced by the visually impaired when using current assistive technology. The significance of text attributes in supporting accessibility and improving understanding of Web content is also studied. For this purpose several qualitative and quantitative data collection methods are adopted to test the hypotheses. The project also examines the relationship between multimodal technology using audio and haptic modalities and the mental model generated by the visually impaired while accessing webpages. The findings are then used as a framework to develop a system that synchronizes audio and haptic output to read webpages and represent text attributes to visually impaired users. From the prototype built, pilot testing and user testing are conducted to evaluate the system. Results and recommendations for future enhancement are shared at the end of the project.
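
    The Python sketch below illustrates one way audio and haptic output could be kept in step while reading a webpage, assuming a hypothetical text-to-speech object and haptic device; the attribute-to-cue mapping and the speak/vibrate calls are illustrative and are not the project's actual interface.

        # Hypothetical mapping from webpage text attributes to haptic cues so
        # that speech and vibration change together for bold, italic, underline.
        ATTRIBUTE_CUES = {
            "bold":      {"vibration_hz": 250, "amplitude": 0.9},
            "italic":    {"vibration_hz": 150, "amplitude": 0.5},
            "underline": {"vibration_hz": 200, "amplitude": 0.7},
        }

        def read_fragment(text, attributes, tts, haptic_device):
            # Speak one fragment of webpage text while the haptic device
            # signals any attribute the fragment carries.
            cue = next((ATTRIBUTE_CUES[a] for a in attributes if a in ATTRIBUTE_CUES), None)
            if cue:
                haptic_device.vibrate(frequency=cue["vibration_hz"],
                                      amplitude=cue["amplitude"])
            tts.speak(text)              # blocks until the words are spoken
            if cue:
                haptic_device.stop()     # end the tactile cue with the speech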

    Augmenting Graphical User Interfaces with Haptic Assistance for Motion-Impaired Operators

    Haptic assistance is an emerging field of research that is designed to improve human-computer interaction (HCI) by reducing error rates and targeting times through the use of force feedback. Haptic feedback has previously been investigated to assist motion-impaired computer users; however, limitations such as target distracters have hampered its integration with graphical user interfaces (GUIs). In this paper two new haptic assistive techniques are presented that utilise the 3DOF capabilities of the Phantom Omni. These are referred to as deformable haptic cones and deformable virtual switches. The assistance is designed specifically to enable motion-impaired operators to use existing GUIs more effectively. Experiment 1 investigates the performance benefits of the new haptic techniques when used in conjunction with the densely populated Windows on-screen keyboard (OSK). Experiment 2 utilises the ISO 9241-9 point-and-click task to investigate the effects of target size and shape. The results of the study show that the newly proposed techniques improve interaction rates and can be integrated with existing software without many of the drawbacks of traditional haptic assistance. Deformable haptic cones and deformable virtual switches were shown to reduce the mean number of missed clicks by at least 75% and reduce targeting times by at least 25%.
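
    As a sketch of how an attractive assistance force of this general kind might be computed each servo cycle, the Python function below pulls the cursor toward a target centre once it enters a fixed radius; the gain, radius and linear force profile are assumptions for illustration, not the paper's actual deformable cone or virtual switch model.

        import numpy as np

        def cone_assist_force(cursor_pos, target_pos, radius=0.02, max_force=1.5):
            # Return a 3-D force vector (in newtons) to send to the haptic device.
            offset = np.asarray(target_pos, dtype=float) - np.asarray(cursor_pos, dtype=float)
            distance = np.linalg.norm(offset)
            if distance == 0.0 or distance > radius:
                return np.zeros(3)              # outside the cone: no assistance
            # Spring-like pull proportional to the distance from the target
            # centre, vanishing once the cursor settles on the target.
            magnitude = max_force * (distance / radius)
            return magnitude * (offset / distance)

        # Cursor 1 cm to the left of the target: the force points toward it.
        print(cone_assist_force([0.01, 0.0, 0.0], [0.02, 0.0, 0.0]))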

    An investigation into virtual objects learning by using haptic interface for visually impaired children

    Children play, touch, see and listen in order to build the foundation for the later learning stages of solving problems and understanding themselves within the world around them. Visually impaired children, however, have limited opportunities to learn new things compared to sighted children. Children gain knowledge through learning, playing, touching, seeing, listening and interacting with things that interest them; for visually impaired children, learning differs in that they cannot go out and explore objects without guidance, nor can they see pictures or videos of those objects as sighted children can. A computer-simulated virtual reality environment can provide better opportunities for visually impaired children, especially in learning the shapes of new objects. An application utilising force feedback (haptic) technology together with audio has been developed in this research project. Seven different objects are modelled as haptic shapes for this application, which gives visually impaired users a better learning environment and assists them in learning and memorising the shapes of different objects together with their names. The application is deployed on a computer equipped with a stylus-based haptic device and a set of speakers. The new architecture can provide an alternative learning environment for visually impaired children, especially in learning the shapes of new objects. Based on the findings of this research, in which 79% of the users agreed that virtual reality learning is useful for learning the shapes of new objects, the new architecture makes a significant contribution in a novel research area and assists visually impaired children in continuing their learning process.
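
    A minimal sketch of how such an application might pair each modelled shape with the audio that speaks its name is given below; the object names, file paths and scene/player objects are hypothetical stand-ins for whatever the actual system uses.

        # Hypothetical catalogue pairing haptic shape models with spoken names.
        SHAPES = {
            "cube":    {"model": "models/cube.obj",    "audio": "audio/cube.wav"},
            "sphere":  {"model": "models/sphere.obj",  "audio": "audio/sphere.wav"},
            "pyramid": {"model": "models/pyramid.obj", "audio": "audio/pyramid.wav"},
            # ... remaining objects in the set of seven
        }

        def explore(shape_name, haptic_scene, audio_player):
            # Load one shape into the haptic scene and announce its name, so a
            # child can trace the surface with the stylus while hearing the word.
            shape = SHAPES[shape_name]
            haptic_scene.load(shape["model"])   # stylus now renders contact forces
            audio_player.play(shape["audio"])   # speak the object's name aloud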
