
    User Evaluation of the Smartphone Screen Reader VoiceOver with Visually Disabled Participants

    Touchscreen assistive technology is designed to support speech interaction between visually disabled people and mobile devices, allowing users to interact with a touch user interface through hand gestures. From a global perspective, the World Health Organisation estimates that around 285 million people are visually disabled, with two thirds of them over 50 years old. This paper presents a user evaluation of VoiceOver, the built-in screen reader in Apple Inc. products, with a detailed analysis of gesture interaction, familiarity and training by visually disabled users, and of the system response. Six participants with a prescribed visual disability took part in the tests in a usability laboratory under controlled conditions. Data were collected and analysed using a mixed methods approach, with quantitative and qualitative measures. The results showed that the participants found most of the hand gestures easy to perform, although they reported inconsistent responses and a lack of information associated with several functionalities. User training on each gesture was reported as key to enabling participants to perform certain difficult or unknown gestures. The paper also reports on how to perform mobile device user evaluations in a laboratory environment and provides recommendations on technical and physical infrastructure.

    Recommendations on a Test Infrastructure for Evaluation of Touchscreen Assistive Technology for Visually Impaired Users

    Published version of a paper from the 13th Scandinavian Conference on Health Informatics, Tromsø. Mobile technologies' touchscreens allow a choreography of gestures to be used to interact with the user interface. Relevant aspects of mobile technology design become crucial when targeting users with disabilities. For instance, when assistive technology is designed to support speech interaction between visually impaired users and a system, the accessibility and ease of use of such technology should be included in the usability and technical evaluation of its effectiveness. This paper presents an analysis of the technical and physical infrastructure of a controlled laboratory environment for the user evaluations made in the research project “Visually impaired users touching the screen - A user evaluation of assistive technology”, in which VoiceOver, a screen reader in Apple Inc. products, was tested. The paper reports on challenges related to the use of the test infrastructure, such as how to obtain valuable data when interactive high-speed gestures are performed and how to optimise the recording and synchronisation of audio and video data. The lessons learned by the research group showed that there are effective alternatives for each challenge, and that these should be customised for each particular test, type of participant and device.

    Mobile recommender apps with privacy management for accessible and usable technologies

    The paper presents the preliminary results of an ongoing survey of the use of computers and mobile devices, interest in recommender apps, and knowledge and concerns about privacy issues among English- and Italian-speaking disabled people. Participants were found to be regular users of computers and mobile devices for a range of applications. They were interested in recommender apps for household items, computer software and apps that met their accessibility and other requirements. They were more concerned about controlling access to different types of personal data than about this data being retained on the computer or mobile device. They were also willing to make tradeoffs to improve device performance.

    Accessibility of Mobile Devices for Visually Impaired Users: An Evaluation of the Screen-reader VoiceOver

    A mobile device's touchscreen allows users to use a choreography of hand gestures to interact with the user interface. A screen reader on a mobile device is designed to support the interaction of visually disabled users while using gestures. This paper presents an evaluation of VoiceOver, a screen reader in Apple Inc. products. The evaluation was part of the research project "Visually impaired users touching the screen - a user evaluation of assistive technology".

    Video conferencing tools: comparative study of the experiences of screen reader users and the development of more inclusive design guidelines

    Since the first lockdown in 2020, video conferencing tools have become increasingly important for employment, education, and social interaction, making them essential tools in everyday life. This study investigates the accessibility and usability of the desktop and mobile versions of three popular video conferencing tools, Zoom, Google Meet and MS Teams, for visually impaired people interacting via screen readers and keyboard or gestures. It involved two inspection evaluations to test the most important features of the desktop and mobile device versions, and two surveys of visually impaired users to obtain information about the accessibility of the selected video conferencing tools. The surveys for the desktop and mobile platforms received 65 and 94 responses respectively. The results showed that Zoom was preferred to Google Meet and MS Teams, but that none of the tools was fully accessible via screen reader and keyboard or gestures. Finally, the results of this empirical study were used to develop a set of guidelines for designers of video conferencing tools and assistive technology.

    Making Spatial Information Accessible on Touchscreens for Users who are Blind and Visually Impaired

    Touchscreens have become a de facto standard of input for mobile devices, as they make the most of the limited input and output space imposed by their form factor. In recent years, people who are blind and visually impaired have been increasing their usage of smartphones and touchscreens. Although basic access is available, many accessibility issues remain to be addressed before this population is fully included. One of the important challenges lies in accessing and creating spatial information on touchscreens. The work presented here provides three new techniques, using three different modalities, for accessing spatial information on touchscreens. The first system makes geometry and diagram creation accessible on a touchscreen through the use of text-to-speech and gestural input; this first study is informed by a qualitative study of how people who are blind and visually impaired currently access and create graphs and diagrams. The second system makes directions through maps accessible using multiple vibration sensors without any sound or visual output. The third system investigates the use of binaural sound on a touchscreen to make various types of applications accessible, such as physics simulations, astronomy applications, and video games.
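
    A minimal illustrative sketch (my own, not code from the work described above): it shows one way binaural or stereo cues could convey spatial information on a touchscreen, mapping the horizontal offset between the finger and a hidden target to constant-power stereo gains, and the remaining distance to overall volume. The screen width, target position and gain formulas are assumptions for illustration.

```python
import math

SCREEN_WIDTH = 1080  # assumed screen width in pixels (illustrative)


def pan_gains(touch_x: float, target_x: float, width: float = SCREEN_WIDTH) -> tuple[float, float]:
    """Return (left_gain, right_gain) for a sound cue at target_x heard from touch_x."""
    # Signed horizontal offset, normalised to [-1, 1]: negative means the target is to the left.
    offset = max(-1.0, min(1.0, (target_x - touch_x) / width))
    # Map the offset to a pan angle in [0, pi/2]; constant-power panning keeps loudness steady.
    angle = (offset + 1.0) * math.pi / 4.0
    return math.cos(angle), math.sin(angle)


def distance_volume(touch: tuple[float, float], target: tuple[float, float], max_dist: float = 800.0) -> float:
    """Overall volume grows as the finger approaches the target."""
    dist = math.hypot(target[0] - touch[0], target[1] - touch[1])
    return max(0.0, 1.0 - dist / max_dist)


if __name__ == "__main__":
    target = (800, 400)
    for touch in [(100, 400), (500, 400), (800, 400)]:
        left, right = pan_gains(touch[0], target[0])
        vol = distance_volume(touch, target)
        print(f"touch={touch} left={left:.2f} right={right:.2f} volume={vol:.2f}")
```

    In a real prototype these gains would feed an audio engine (ideally with head-related transfer functions for true binaural output); the sketch only shows the touch-to-audio mapping.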

    Designing assistive technology for getting more independence for blind people when performing everyday tasks: an auditory-based tool as a case study

    Everyday activities and tasks should, in theory, be easily carried out by everyone, including blind people. Information and Communication Technology (ICT) has been widely used to provide supporting solutions. However, such solutions can be problematic for the visually impaired, since familiarity with digital devices is often required, and the procedure can be perceived as fiddly or impractical, particularly for repetitive tasks, because of the number and type of steps required to complete them. This paper introduces a simple audio-based tool aimed at supporting visually impaired people in the seemingly simple activity of checking whether the light in a room is on or off. It is an example of a potential low-tech device that can be designed to work in a practical way without requiring specific skills or knowledge from the user. In this context, we discuss the main issues and considerations for totally blind users in identifying whether a light is switched on. The proposed prototype is based on a simple circuit and a form of auditory feedback which informs the user whether they are switching the light on or off. Two prototypes have been designed and built for two different kinds of installation. For the subsequent second prototype, three different versions are proposed to provide a blind person with further support in easily identifying the light status at home; the new design includes enhanced auditory feedback and modifications to the dimensions. An evaluation involving various groups of end users confirmed the usefulness of the proposed tool. In addition, a survey conducted with 100 visually impaired people reported the limitations and difficulties encountered by blind people when using existing devices. Moreover, 94% of the participants expressed interest in a potential new basic tool that could be integrated with the existing lighting system. This study contributes to the ambient intelligence field by (1) showing how an auditory-based tool can be used to support totally blind people in checking the lights in an autonomous and relatively simple way; (2) proposing an idea that can be exploited in other application cases that use light feedback; and (3) proposing seven potential recommendations for designing assistive technology tools and common everyday devices, based on information gathered from the online survey.
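
    The light-status checker described above is a hardware prototype; purely as an illustrative sketch under assumed names and calibration values (not the authors' design), the following shows the core logic of mapping a brightness reading to one of two clearly distinct auditory cues.

```python
import random

LIGHT_ON_THRESHOLD = 400  # assumed calibration value on a 0-1023 sensor scale


def read_brightness() -> int:
    """Stand-in for a real sensor read (e.g. an ADC channel); returns a fake value here."""
    return random.randint(0, 1023)


def feedback_for(brightness: int) -> str:
    # Two clearly distinct auditory cues: a high beep for "on", a low beep for "off".
    return "high beep: light is ON" if brightness >= LIGHT_ON_THRESHOLD else "low beep: light is OFF"


if __name__ == "__main__":
    for _ in range(3):
        level = read_brightness()
        print(level, "->", feedback_for(level))
```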

    Concurrent speech feedback for blind people on touchscreens

    Master's thesis, Informatics Engineering, 2023, Universidade de Lisboa, Faculdade de Ciências. Smartphone interactions are demanding. Most smartphones come with limited physical buttons, so users cannot rely on touch to guide them. Smartphones come with built-in accessibility mechanisms, for example screen readers, that make interaction accessible for blind users. However, some tasks are still inefficient or cumbersome: when scanning through a document, users are limited by the single sequential audio channel provided by screen readers, and tasks are interrupted in the presence of other actions. In this work, we explored alternatives to optimise smartphone interaction by blind people by leveraging simultaneous audio feedback with different configurations, such as different voices and spatialization. We researched five scenarios: task interruption, where we use concurrent speech to reproduce a notification without interrupting the current task; faster information consumption, where we leverage concurrent speech to announce up to four different contents simultaneously; text properties, where the textual formatting is announced; the map scenario, where spatialization provides feedback on how close or distant a user is from a particular location; and the smartphone interactions scenario, where there is a corresponding sound for each gesture and, instead of reading the screen elements (e.g., a button), a corresponding sound is played. We conducted a study with 10 blind participants whose smartphone usage experience ranged from novice to expert. During the study, we asked about participants' perceptions and preferences for each scenario, what could be improved, and in what situations these extra capabilities would be valuable to them. Our results suggest that the extra capabilities we presented are helpful for users, especially if they can be turned on and off according to the user's needs and situation. Moreover, we found that concurrent speech works best for announcing short messages while the user is listening to longer content, rather than for having lengthy content announced simultaneously.
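
    A minimal sketch of the spatialization idea behind concurrent speech (an illustration under my own assumptions, not the thesis implementation): two placeholder "voices" are panned to opposite ears with constant-power gains and mixed into one stereo buffer, so both streams remain separable by ear. A real system would replace the sine tones with synthesised speech from a TTS engine.

```python
import math

SAMPLE_RATE = 16_000  # assumed sample rate for the sketch


def tone(freq_hz: float, seconds: float) -> list[float]:
    """Placeholder for a synthesised voice: a plain sine tone."""
    n = int(SAMPLE_RATE * seconds)
    return [math.sin(2 * math.pi * freq_hz * i / SAMPLE_RATE) for i in range(n)]


def mix_spatialised(voice_a: list[float], voice_b: list[float],
                    pan_a: float = -0.8, pan_b: float = 0.8) -> list[tuple[float, float]]:
    """Return (left, right) frames with each voice panned to its own side of the stereo field."""
    def gains(pan: float) -> tuple[float, float]:
        # Constant-power panning; pan in [-1, 1], -1 = hard left, +1 = hard right.
        angle = (pan + 1.0) * math.pi / 4.0
        return math.cos(angle), math.sin(angle)

    la, ra = gains(pan_a)
    lb, rb = gains(pan_b)
    n = max(len(voice_a), len(voice_b))
    a = voice_a + [0.0] * (n - len(voice_a))  # zero-pad the shorter voice
    b = voice_b + [0.0] * (n - len(voice_b))
    # Halve the mix so two full-scale voices cannot clip.
    return [(0.5 * (a[i] * la + b[i] * lb), 0.5 * (a[i] * ra + b[i] * rb)) for i in range(n)]


if __name__ == "__main__":
    stereo = mix_spatialised(tone(440, 0.5), tone(220, 0.75))
    print(f"{len(stereo)} stereo frames; first frame = {stereo[0]}")
```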
