
    AudioFunctions.web: Multimodal Exploration of Mathematical Function Graphs

    We present AudioFunctions.web, a web app that uses sonification, earcons and speech synthesis to enable blind people to explore mathematical function graphs. The system is designed for personalized access through different interfaces (touchscreen, keyboard, touchpad and mouse) on both mobile and traditional devices, in order to better adapt to different user abilities and preferences. It is also publicly available as a web service and can be accessed directly from teaching material through a hypertext link. An experimental evaluation with 13 visually impaired participants highlights that, while the usability of all the presented interaction modalities is high, users with different abilities prefer different interfaces to interact with the system. It is also shown that users with a higher level of mathematical education are better able to adapt to interaction modalities that others consider more difficult.
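    To make the idea of graph sonification concrete, the following is a minimal sketch of a common pitch-mapping scheme: the x-axis is swept left to right over a fixed duration and the function value y = f(x) controls the pitch of a sine tone (higher y, higher pitch). This is a generic illustration, not the actual AudioFunctions.web implementation; the frequency range and sweep duration are arbitrary choices.

```python
# Pitch-based sonification of a function graph (illustrative sketch only).
import math
import wave
import struct

SAMPLE_RATE = 44100
DURATION_S = 3.0                 # time to sweep the whole x range
F_LOW, F_HIGH = 200.0, 1200.0    # pitch range in Hz (assumed, not from the paper)

def sonify(f, x_min, x_max, path="graph.wav"):
    n = int(SAMPLE_RATE * DURATION_S)
    xs = [x_min + (x_max - x_min) * i / (n - 1) for i in range(n)]
    ys = [f(x) for x in xs]
    y_min, y_max = min(ys), max(ys)
    span = (y_max - y_min) or 1.0

    samples, phase = [], 0.0
    for y in ys:
        # normalize y into [0, 1] and map it linearly onto the pitch range
        freq = F_LOW + (F_HIGH - F_LOW) * (y - y_min) / span
        phase += 2.0 * math.pi * freq / SAMPLE_RATE  # phase accumulation avoids clicks
        samples.append(int(32767 * 0.5 * math.sin(phase)))

    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)          # 16-bit PCM
        w.setframerate(SAMPLE_RATE)
        w.writeframes(struct.pack("<%dh" % n, *samples))

# Example: sonify y = x^2 on [-2, 2]; the pitch falls toward x = 0, then rises again.
sonify(lambda x: x * x, -2.0, 2.0)
```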

    Increasing STEM Accessibility in Students with Print Disabilities through MathSpeak

    Individuals with print disabilities have difficulty processing information through visual means and rely heavily on spoken input. Mathematics, and fields that place a heavy emphasis on mathematics, are difficult for these individuals because of the ambiguity inherent in typical everyday spoken renderings of mathematical expressions. MathSpeak is a set of rules for speaking mathematical expressions in an unambiguous manner. The present study tested the efficacy of MathSpeak rules for disambiguating auditory renderings of spoken mathematics. Findings suggest that MathSpeak is efficacious for disambiguating spoken mathematics.
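    The toy renderer below illustrates the ambiguity problem the abstract refers to: the usual reading "x plus 1 over x minus 1" could mean (x+1)/(x-1) or x + 1/x - 1. Walking a small expression tree and emitting explicit begin/end markers for fractions, in the spirit of MathSpeak's StartFraction/Over/EndFraction wording, removes the ambiguity. The vocabulary and rules here are heavily simplified for illustration and are not the official MathSpeak rule set.

```python
# Toy unambiguous speech renderer for a nested-tuple expression tree.

def speak(node):
    """Render an expression tree (op, left, right) or a leaf as unambiguous speech."""
    if isinstance(node, (int, float, str)):
        return str(node)
    op, left, right = node
    if op == "+":
        return f"{speak(left)} plus {speak(right)}"
    if op == "-":
        return f"{speak(left)} minus {speak(right)}"
    if op == "/":
        return f"StartFraction {speak(left)} Over {speak(right)} EndFraction"
    raise ValueError(f"unknown operator: {op}")

# (x + 1) / (x - 1)  ->  "StartFraction x plus 1 Over x minus 1 EndFraction"
print(speak(("/", ("+", "x", 1), ("-", "x", 1))))
# x + 1/x - 1        ->  "x plus StartFraction 1 Over x EndFraction minus 1"
print(speak(("-", ("+", "x", ("/", 1, "x")), 1)))
```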

    Accessible Mathematics on Touchscreen Devices: New Opportunities for People with Visual Impairments

    In recent years, educational applications for touchscreen devices (e.g., tablets) have become widespread all over the world. While these devices are accessible to people with visual impairments, educational applications to support learning of STEM subjects are often not accessible to visually impaired people due to inaccessible graphics. This contribution addresses the problem of conveying graphics to visually impaired users. Two approaches are considered: audio icons and image sonification. In order to evaluate the applicability of these approaches, we report our experience in developing two didactic applications for touchscreen devices, specifically designed to support people with visual impairments or blindness while studying STEM subjects: Math Melodies and Audio Functions. The former is a commercial application that supports primary-school children in an inclusive class and adopts an interaction paradigm based on audio icons. The latter is a prototype application aimed at enabling visually impaired students to explore function diagrams, and it adopts an image sonification approach.
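    As a rough illustration of what "image sonification" can mean in practice (not the applications' actual code), the sketch below sweeps an image column by column; each row is assigned a fixed sine frequency (top rows high, bottom rows low) and a pixel's darkness controls the amplitude of that partial, so a dark diagonal line is heard as a gliding tone. The column duration, frequency spacing, and synthetic test image are arbitrary assumptions.

```python
# Column-scan image sonification (illustrative sketch only).
import math
import wave
import struct

SAMPLE_RATE = 22050
COL_MS = 80                     # playback time per image column (assumed)
ROWS, COLS = 16, 16

# synthetic test image: 0.0 = white, 1.0 = black; a dark diagonal line
image = [[1.0 if r == c else 0.0 for c in range(COLS)] for r in range(ROWS)]

# bottom rows map to low pitch, top rows to high pitch
freqs = [200.0 * (2 ** ((ROWS - 1 - r) / 4.0)) for r in range(ROWS)]

samples = []
n_col = int(SAMPLE_RATE * COL_MS / 1000)
for c in range(COLS):
    for i in range(n_col):
        t = i / SAMPLE_RATE
        s = sum(image[r][c] * math.sin(2 * math.pi * freqs[r] * t) for r in range(ROWS))
        samples.append(int(32767 * 0.2 * max(-1.0, min(1.0, s))))

with wave.open("image.wav", "wb") as w:
    w.setnchannels(1)
    w.setsampwidth(2)
    w.setframerate(SAMPLE_RATE)
    w.writeframes(struct.pack("<%dh" % len(samples), *samples))
```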

    ASSISTIVE TECHNOLOGIES ON MOBILE DEVICES FOR PEOPLE WITH VISUAL IMPAIRMENTS

    Spatial understanding and cognitive mapping are challenging tasks for people with visual impairments. The goal of this work is to leverage computer vision and spatial understanding techniques, along with audio-haptic proprioceptive interaction paradigms, to assist people with visual impairments in spatial comprehension and memorization. Abstract space exploration in the field of assistive didactics is tackled through tactile exploration and audio feedback, resulting in two solutions: the first focuses on math learning in primary education, while the second focuses on tactile exploration and sonification of function graphs. For spatial comprehension during way-finding, computer vision and spatial reasoning techniques are used to detect visual cues such as zebra pedestrian crossings and to guide the user safely with respect to the detected elements. Suitable interaction paradigms based on sonification and haptic feedback are designed to assist the user efficiently and quickly during navigation.
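    One common low-level cue for zebra-crossing detection is sketched below: a vertical scanline through a crossing shows a regular alternation of bright (paint) and dark (asphalt) runs, so counting sufficiently long alternating runs in the intensity profile gives a cheap candidate test. This is a generic illustration, not the thesis's actual computer-vision pipeline; a real system would add perspective correction, edge grouping, and temporal filtering, and the thresholds here are arbitrary.

```python
# Candidate test for a zebra crossing along one intensity scanline (sketch only).

def looks_like_crossing(profile, threshold=128, min_run=5, min_stripes=4):
    """profile: list of 0-255 intensities sampled along a vertical scanline."""
    runs = []
    current, length = profile[0] >= threshold, 0
    for v in profile:
        bright = v >= threshold
        if bright == current:
            length += 1
        else:
            runs.append((current, length))
            current, length = bright, 1
    runs.append((current, length))
    # keep only runs long enough to be a stripe, then count bright/dark alternations
    stripes = [b for b, n in runs if n >= min_run]
    alternations = sum(1 for a, b in zip(stripes, stripes[1:]) if a != b)
    return alternations >= min_stripes

# toy example: four white stripes separated by asphalt
scanline = ([30] * 10 + [220] * 10) * 4 + [30] * 10
print(looks_like_crossing(scanline))   # True
```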

    MOBILE ASSISTIVE TECHNOLOGIES FOR PEOPLE WITH VISUAL IMPAIRMENT: SENSING AND CONVEYING INFORMATION TO SUPPORT ORIENTATION, MOBILITY AND ACCESS TO IMAGES

    Smartphones are accessible to persons with visual impairment or blindness (VIB): screen reader technologies, integrated with mobile operating systems, enable non-visual interaction with the device. In addition, features like GPS receivers, inertial sensors and cameras enable the development of Mobile Assistive Technologies (MATs) to support people with VIB. A preliminary analysis, conducted with a user-centric approach, highlighted issues experienced by people with VIB in everyday activities in three main fields: orientation, mobility and access to images. Traditional approaches to these issues, based on assistive tools and technologies, have limitations. In the field of mobility, for example, existing navigation support solutions (e.g. the white cane) cannot be used to perceive some environmental features, like crosswalks or the current state of traffic lights. In the field of orientation, tactile maps adopted to develop cognitive maps of the environment are limited in the amount of information that can be represented on a single surface and by their lack of interactivity, two issues also experienced in other fields where access to graphical information is of paramount importance, such as the didactics of STEM subjects. This work presents new MATs that deal with these limitations by introducing novel solutions in different fields of Computer Science. Original computer vision techniques, designed to detect the presence of pedestrian crossings and the state of traffic lights, are used to sense information from the environment and support the mobility of people with VIB. Novel sonification techniques are introduced to efficiently convey information with three different goals: first, to convey guidance information at urban crossings; second, to enhance the development of cognitive maps by augmenting tactile surfaces; third, to enable quick access to images. The experience reported in this dissertation shows that the proposed MATs are effective in supporting people with VIB and, in general, that mobile devices are a versatile platform for affordable and pervasive access to assistive technologies. Involving target users in the evaluation of MATs emerged as a major challenge in this work; however, it is shown how this challenge can be addressed by adopting the large-scale evaluation techniques typical of HCI research.
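    For the first of the three sonification goals (guidance at urban crossings), the sketch below shows one plausible mapping, assumed purely for illustration and not taken from the dissertation: the signed heading deviation from the crossing direction is mapped to stereo panning (negative deviations pan the cue left, positive pan it right) and to beep rate (faster beeps as the deviation grows).

```python
# Hypothetical crossing-guidance sonification parameters (sketch only).
import math

def guidance_cue(deviation_deg, max_dev=45.0):
    """Return (left_gain, right_gain, beep_period_s) for a signed heading error."""
    d = max(-max_dev, min(max_dev, deviation_deg)) / max_dev   # clamp to -1 .. 1
    angle = (d + 1.0) * math.pi / 4.0   # constant-power pan: 0 = hard left, pi/2 = hard right
    left_gain, right_gain = math.cos(angle), math.sin(angle)
    beep_period = 0.8 - 0.6 * abs(d)    # 0.8 s when aligned, 0.2 s at the limit
    return round(left_gain, 3), round(right_gain, 3), round(beep_period, 2)

print(guidance_cue(0.0))     # centered cue, slow beeps: (0.707, 0.707, 0.8)
print(guidance_cue(-30.0))   # cue panned to the left, faster beeps
```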

    Multimodales kollaboratives Zeichensystem für blinde Benutzer (Multimodal Collaborative Drawing System for Blind Users)

    Pictures and graphical data are common communication media for conveying information and knowledge. However, these media might exclude large user groups, for instance visually impaired people, if they are offered in visual form only. Textual descriptions as well as tactile graphics may offer access to graphical information, but they have to be adapted to the special needs of visually impaired and blind readers. The translation from visual into tactile graphics is usually carried out by sighted graphic authors, some of whom have little experience in creating proper tactile graphics. Applying only recommendations and best practices for preparing tactile graphics does not seem sufficient to provide intelligible, high-quality tactile materials. Including a visually impaired person in the process of creating a tactile graphic should prevent such quality and intelligibility issues. Large dynamic tactile displays offer non-visual access to graphics; even dynamic changes can be conveyed. As part of this thesis, a collaborative drawing workstation was developed. The workstation utilizes a tactile display as well as auditory output to actively involve a blind person as a lector in the drawing process. The evaluation demonstrates that inexperienced sighted graphic authors, in particular, can benefit from the knowledge of a blind person who is accustomed to handling tactile media. Furthermore, inexperienced visually impaired people may be trained in reading tactile graphics with the help of the collaborative drawing workstation. In addition to exploring and manipulating existing graphics, the accessible drawing workstation offers four different modalities to create tactile shapes: text-based shape-palette menus, gestural drawing, freehand drawing using a wireless stylus, and scanning object silhouettes with a ToF camera. The evaluation confirms that even untrained blind users can create drawings of good quality using the accessible drawing workstation. However, users seem to prefer robust, reliable modalities for drawing, such as text menus, over modalities which require a certain level of skill or additional technical effort.
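    The camera-based silhouette modality could be implemented along the lines of the OpenCV sketch below, which is an assumed illustration rather than the workstation's actual code: a depth image from a ToF camera is thresholded so that everything closer than the table plane counts as "object", and the outer contour of that mask is extracted and simplified so it can be rendered as a tactile outline. The table depth, morphology kernel, and simplification tolerance are arbitrary choices.

```python
# Silhouette extraction from a ToF depth frame (illustrative sketch only).
import cv2
import numpy as np

def object_outline(depth_mm: np.ndarray, table_depth_mm: float = 800.0):
    """Return the simplified outer contour of the closest object, or None."""
    # pixels closer than the table surface belong to the object lying on it
    mask = ((depth_mm > 0) & (depth_mm < table_depth_mm)).astype(np.uint8) * 255
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    outline = max(contours, key=cv2.contourArea)      # keep the largest blob
    eps = 0.01 * cv2.arcLength(outline, True)         # tolerance for polygon simplification
    return cv2.approxPolyDP(outline, eps, True)       # polygon to draw as a tactile outline

# usage with a synthetic depth frame: a 100x100 object at 500 mm on a table at 900 mm
frame = np.full((240, 320), 900, dtype=np.uint16)
frame[70:170, 110:210] = 500
print(object_outline(frame.astype(np.float32)))
```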