
    Constructing sonified haptic line graphs for the blind student: first steps

    Line graphs are an established information visualisation and analysis technique, taught at various levels of difficulty in standard Mathematics curricula. It has been argued that blind individuals cannot use line graphs as a visualisation and analytic tool because such graphs currently exist primarily in the visual medium. The research described in this paper aims to make line graphs accessible to blind students through auditory and haptic media. We describe (1) our design space for representing line graphs, (2) the technology we use to develop our prototypes and (3) the insights from our preliminary work.
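
    The abstract does not detail the sonification mapping itself; purely as a hedged illustration of one point in such a design space (not the authors' prototype), the sketch below maps the y-values of a data series to tone pitch and writes the result to a WAV file. The frequency range, tone length and sample data are assumptions.

        # Hypothetical sketch: sonify a line graph by mapping y-values to pitch.
        # The frequency range, tone duration and data are illustrative assumptions.
        import math, struct, wave

        def sonify(values, wav_path="line_graph.wav", rate=44100,
                   tone_s=0.25, f_lo=220.0, f_hi=880.0):
            lo, hi = min(values), max(values)
            span = (hi - lo) or 1.0
            samples = []
            for v in values:
                freq = f_lo + (v - lo) / span * (f_hi - f_lo)  # higher y -> higher pitch
                n = int(rate * tone_s)
                samples += [math.sin(2 * math.pi * freq * i / rate) for i in range(n)]
            with wave.open(wav_path, "w") as w:
                w.setnchannels(1)
                w.setsampwidth(2)
                w.setframerate(rate)
                w.writeframes(b"".join(struct.pack("<h", int(s * 32767 * 0.8)) for s in samples))

        # Played left to right, a rising or falling line becomes an audible pitch contour.
        sonify([1, 3, 2, 5, 4, 6])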

    Computer Entertainment Technologies for the Visually Impaired: An Overview

    In recent years, work on accessible technologies has increased in both quantity and quality. This paper reviews a series of articles that explore different trends in the field of accessible video games for blind or visually impaired players. The reviewed articles are grouped into four categories covering the following subjects: (1) video game design and architecture, (2) video game adaptations, (3) accessible games as learning tools or treatments and (4) navigation and interaction in virtual environments. Current trends in accessible game design are also analysed, and data are presented on keyword use and thematic evolution over time. In conclusion, a relative stagnation is detected in the field of human-computer interaction for the blind. However, as the video game industry becomes increasingly interested in accessibility, new research opportunities are starting to appear.

    Enabling audio-haptics

    This thesis deals with possible solutions to facilitate orientation, navigation and overview of non-visual interfaces and virtual environments with the help of sound in combination with force-feedback haptics. Applications with haptic force-feedback, s

    Towards Understanding and Developing Virtual Environments to Increase Accessibilities for People with Visual Impairments

    The primary goal of this research is to investigate the possibilities of using audio feedback to support effective Human-Computer Interaction in Virtual Environments (VEs) without visual feedback for people with Visual Impairments (VI). Efforts have been made to apply virtual reality (VR) technology to training and educational applications for diverse population groups, such as children and stroke patients. Those applications have already been shown to increase motivation, provide safer training environments and offer more training opportunities. However, they are all based on visual feedback. With head-related transfer functions (HRTFs), it is possible to design and develop considerably safer and more diverse training environments that might greatly benefit individuals with VI. To explore this, I ran three studies sequentially, examining: 1) whether and how users could navigate with different types of 3D auditory feedback in the same VE; 2) whether users could effectively recognise the distance and direction of a virtual sound source in the VE; and 3) whether participants with and without VI could recognise the positions and distinguish the moving directions of 3D sound sources in the VE. The results showed some possibilities for designing effective Human-Computer Interaction methods and provided some understanding of how the participants with VI experienced the scenarios differently from the participants without VI. This research therefore contributes new knowledge on how a visually impaired person interacts with computer interfaces, which can be used to derive guidelines for the design of effective VEs for rehabilitation and exercise.
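
    The studies above do not publish their rendering code; as a rough, assumed sketch of the general idea behind direction cues (a crude stand-in for true HRTF convolution, not the author's method), the snippet below approximates a source's azimuth with interaural level and time differences. The constants and function names are illustrative.

        # Hypothetical sketch: approximate a sound source's azimuth with interaural
        # level and time differences -- a crude stand-in for full HRTF rendering.
        import math

        SPEED_OF_SOUND = 343.0   # m/s
        HEAD_RADIUS = 0.0875     # m, average head radius (assumption)

        def binaural_params(azimuth_deg, rate=44100):
            """Return (left_gain, right_gain, delay_samples) for a source at azimuth_deg
            (0 = straight ahead, +90 = hard right)."""
            az = math.radians(azimuth_deg)
            # Interaural time difference (Woodworth approximation).
            itd = HEAD_RADIUS / SPEED_OF_SOUND * (az + math.sin(az))
            delay = int(abs(itd) * rate)           # samples to delay the far ear
            # Simple constant-power panning for the level difference.
            pan = (math.sin(az) + 1) / 2           # 0 = hard left, 1 = hard right
            left_gain = math.cos(pan * math.pi / 2)
            right_gain = math.sin(pan * math.pi / 2)
            return left_gain, right_gain, delay

        # Source to the front-right: right ear louder, left ear slightly delayed.
        print(binaural_params(45))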

    An investigation into virtual objects learning by using haptic interface for visually impaired children

    Children play, touch, see and listen in order to build the foundation for the later learning stages of solving problems and understanding their place in the world around them. However, visually impaired children have fewer opportunities to learn new things than sighted children, who have the benefit of one of the most important human senses. Children gain knowledge by learning, playing, touching, seeing, listening and interacting with things that interest them. For visually impaired children, learning differs from that of sighted children: they cannot go out and play with things without guidance, and they cannot see pictures or videos of objects as sighted children can. A computer-simulated virtual reality environment can provide better opportunities for visually impaired children, especially for learning the shapes of new objects. In this research project, an application was developed that uses force-feedback (haptic) technology together with audio. Seven different objects were modelled as haptic shapes for the application, which gives visually impaired users a better learning environment and helps them learn and memorize the shapes of different objects together with their names. The application is deployed on a fully equipped computer with a stylus-based haptic device and a set of speakers. This architecture provides an alternative learning environment for visually impaired children, especially for learning the shapes of new objects. Since 79% of the users agreed that virtual reality learning is useful for learning the shapes of new objects, the new architecture makes a significant contribution to a novel research area and helps visually impaired children continue their learning process.
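
    The paper does not spell out its haptic rendering algorithm; the following is only an assumed sketch of the basic force-feedback principle that stylus-based devices commonly use: when the virtual stylus tip penetrates an object, a spring force pushes it back out along the surface normal. The stiffness value and sphere geometry are invented for illustration.

        # Hypothetical sketch of penalty-based haptic rendering for a virtual sphere:
        # when the stylus tip penetrates the surface, push it back out along the
        # surface normal with a spring force. Stiffness and geometry are assumptions.
        import math

        def sphere_force(tip, center, radius, stiffness=800.0):
            """Return the (fx, fy, fz) force to send to the haptic device, in newtons."""
            dx, dy, dz = (tip[i] - center[i] for i in range(3))
            dist = math.sqrt(dx*dx + dy*dy + dz*dz) or 1e-9
            penetration = radius - dist
            if penetration <= 0:
                return (0.0, 0.0, 0.0)                       # tip is outside: no force
            nx, ny, nz = dx / dist, dy / dist, dz / dist     # outward surface normal
            f = stiffness * penetration                      # Hooke's law: F = k * depth
            return (f * nx, f * ny, f * nz)

        # Tip 5 mm inside a 30 mm sphere centred at the origin:
        print(sphere_force((0.025, 0.0, 0.0), (0.0, 0.0, 0.0), 0.030))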

    Virtual Reality Games for Motor Rehabilitation

    This paper presents a fuzzy logic based method to track user satisfaction without the need for devices that monitor users' physiological conditions. User satisfaction is the key to any product's acceptance; computer applications and video games provide a unique opportunity to offer a tailored environment that better suits each user's needs. We have implemented a non-adaptive fuzzy logic model of emotion, based on the emotional component of the Fuzzy Logic Adaptive Model of Emotion (FLAME) proposed by El-Nasr, to estimate player emotion in Unreal Tournament 2004. In this paper we describe the implementation of this system and present the results of one of several play tests. Our research contradicts the current literature suggesting that physiological measurements are needed; we show that it is possible to use a software-only method to estimate user emotion.
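
    Neither FLAME nor the authors' Unreal Tournament 2004 integration is reproduced here; the toy sketch below only illustrates the kind of fuzzy inference such a model builds on, mapping a normalized "recent deaths" signal to a frustration estimate via triangular membership functions. All rules, labels and numbers are invented for illustration.

        # Hypothetical toy sketch of fuzzy inference (not FLAME itself): map a
        # normalized "recent deaths" signal to a frustration estimate using
        # triangular membership functions and weighted-average defuzzification.
        def tri(x, a, b, c):
            """Triangular membership function peaking at b."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x < b else (c - x) / (c - b)

        def frustration(deaths_norm):
            """deaths_norm in [0, 1]; returns an estimated frustration level in [0, 1]."""
            low    = tri(deaths_norm, -0.5, 0.0, 0.5)   # few recent deaths
            medium = tri(deaths_norm,  0.0, 0.5, 1.0)
            high   = tri(deaths_norm,  0.5, 1.0, 1.5)   # many recent deaths
            # Rule consequents (assumed output centroids): low -> 0.1, medium -> 0.5, high -> 0.9
            num = low * 0.1 + medium * 0.5 + high * 0.9
            den = (low + medium + high) or 1e-9
            return num / den

        for d in (0.1, 0.5, 0.9):
            print(d, round(frustration(d), 2))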

    AUXie: Initial evaluation of a blind-accessible virtual museum tour

    Remotely accessible audio-based virtual tours can offer great utility for blind or vision impaired persons, eliminating the difficulties posed by travel to unfamiliar locations, and allowing truly independent exploration. This paper draws upon sonification techniques used in previous implementations of audio-based 3D environments to develop a prototype of blind-accessible virtual tours specifically tailored to the needs of cultural sites. A navigable 3D world is presented using spatially positioned musical earcons, accompanied by synthesised speech descriptions and navigation aids. The worlds are read from X3D models enhanced with metadata to identify and describe the rooms and exhibits, thus enabling an audio modality for existing 3D worlds and simplifying the tour creation process. The prototype, named AUXie, was evaluated by 11 volunteers with total blindness to establish a proof of concept and identify the problematic aspects of the interface. The positive response obtained confirmed the validity of the approach and yielded valuable insight into how such tours can be further improved.
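
    The abstract does not specify AUXie's metadata schema; as a hedged sketch of how exhibit descriptions might be attached to an X3D scene and extracted for speech output, the snippet below reads standard X3D MetadataString nodes. The "exhibit_description" naming convention and the file name are assumptions, not AUXie's actual format.

        # Hypothetical sketch: extract exhibit descriptions from MetadataString
        # nodes in an X3D (XML) scene so they can be handed to a speech synthesiser.
        # The "exhibit_description" name is an assumed convention, not AUXie's schema.
        import xml.etree.ElementTree as ET

        def exhibit_descriptions(x3d_path):
            tree = ET.parse(x3d_path)
            exhibits = []
            for node in tree.iter("MetadataString"):
                if node.get("name") == "exhibit_description":
                    # MFString values in X3D are quoted; strip the quotes for speech output.
                    exhibits.append(node.get("value", "").strip('"'))
            return exhibits

        for text in exhibit_descriptions("museum.x3d"):
            print("SPEAK:", text)   # hand off to a TTS engine in a real tour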

    Memory for sounds: novel technological solutions for the evaluation of mnestic skills

    Working memory (WM) plays a crucial role in helping individuals perform everyday activities. The neural structures underlying this system continue to develop during infancy and reach maturity only late in development. Despite useful insights into visual memory mechanisms, audio-spatial memory has not been thoroughly investigated, especially in children and congenitally blind individuals. The main scientific objective of this thesis was to increase knowledge of spatial WM and imagery abilities in the auditory modality. We focused on how these skills change during typical development and on the consequences of early visual deprivation. Our first hypothesis was that the changes in WM functionality and spatial skills occurring in the early years of life influence the ability to remember and associate spatialized sounds or to explore and learn acoustic spatial layouts. Since vision plays a crucial role in spatial cognition (Thinus-Blanc and Gaunet, 1997), we expected blind individuals to encounter specific difficulties when asked to process and manipulate spatial information retained in memory, as already observed in the haptic modality (Cattaneo et al., 2008; Vecchi, 1998). Although some studies have demonstrated superior performance of the blind in various verbal-memory tasks (Amedi et al., 2003; Požár, 1982; Röder et al., 2001), very little is known about how they remember and manipulate acoustic spatial information. The investigation of auditory cognition often requires specially adapted hardware and software solutions rarely available on the market. For example, studying cognitive functions that involve auditory spatial information requires multiple acoustic spatial locations, such as numerous speakers or dedicated virtual acoustics. Thus, for the purposes of this thesis, we took advantage of novel technological solutions developed explicitly for delivering non-visual spatialized stimuli. We worked on the software development of a vertical array of speakers (ARENA2D) and an audio-tactile tablet (Audiobrush), and we designed a system based on an acoustic virtual reality (VR) simulation. These novel solutions were used to adapt validated clinical procedures (the Corsi-Block test) and games (the card game Memory) to the auditory domain, so that they could also be performed by visually impaired individuals. Thanks to the technologies developed in these years, we could investigate these topics and observed that audio-spatial memory abilities are strongly affected by the developmental stage and by the lack of visual experience.
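
    ARENA2D and Audiobrush themselves are not documented in the abstract; the sketch below only illustrates, under assumed conventions, the control logic of an auditory Corsi-block-style span task over a hypothetical speaker array: play a spatial sequence, collect the recalled order, and lengthen the sequence until recall fails. All device I/O is stubbed out with print/input.

        # Hypothetical sketch of an auditory Corsi-block-style span task over a
        # speaker array: play a spatial sequence, collect the recalled order, and
        # grow the sequence length until recall fails. Hardware I/O is stubbed.
        import random

        N_SPEAKERS = 8   # assumed size of the speaker array

        def play_sequence(seq):
            for speaker in seq:
                print(f"[tone from speaker {speaker}]")   # stand-in for real audio output

        def collect_response(length):
            raw = input(f"Repeat the {length} speaker numbers in order (space-separated): ")
            return [int(tok) for tok in raw.split()]

        def run_span_task(max_len=9):
            span = 0
            for length in range(2, max_len + 1):
                seq = random.sample(range(N_SPEAKERS), k=min(length, N_SPEAKERS))
                play_sequence(seq)
                if collect_response(length) != seq:
                    break                                  # first failure ends the task
                span = length
            return span                                    # longest correctly recalled sequence

        if __name__ == "__main__":
            print("Auditory span:", run_span_task())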