
    Feeling what you hear: tactile feedback for navigation of audio graphs

    Access to digitally stored numerical data is currently very limited for sight-impaired people. Graphs and visualizations are often used to analyse relationships between numerical data, but current methods of accessing them are highly visually mediated. Representing data using audio feedback is a common way of making data more accessible, but methods of navigating and accessing the data are often serial in nature and laborious. Tactile or haptic displays could provide additional feedback to support a point-and-click style of interaction for the visually impaired. A requirements capture conducted with sight-impaired computer users produced a review of current accessibility technologies, and guidelines were extracted for using tactile feedback to aid navigation. The results of a qualitative evaluation with a prototype interface are also presented. Providing an absolute-position input device and tactile feedback allowed users to explore a graph using tactile and proprioceptive cues in a manner analogous to point-and-click techniques.
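As a rough illustration of the point-and-click decision described above (not taken from the thesis; the coordinate range, tolerance value, and function names are all assumptions), the interface essentially needs a proximity test that decides, for each absolute cursor position, whether the tactile display should fire:

```python
def on_line(cursor_x, cursor_y, graph, tolerance=0.02):
    """Return True if the cursor is within `tolerance` of the plotted line.

    `graph` maps sampled x positions to y values. Because the tablet
    reports absolute (x, y) positions, the user can jump directly to any
    region of the graph, analogous to point-and-click interaction.
    """
    # Find the sampled x position nearest to the cursor.
    nearest_x = min(graph, key=lambda x: abs(x - cursor_x))
    # Fire the tactor only when the cursor is close to the line.
    return abs(graph[nearest_x] - cursor_y) <= tolerance

# Hypothetical sampled line y = x over normalized [0, 1] coordinates.
graph = {x / 10: x / 10 for x in range(11)}
print(on_line(0.5, 0.51, graph))  # near the line: vibrate
print(on_line(0.5, 0.90, graph))  # far from the line: no feedback
```

In a real device the boolean would gate a vibration actuator; here it simply stands in for that tactile cue.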

    Evaluation of the Accessibility of Touchscreens for Individuals who are Blind or have Low Vision: Where to go from here

    Touchscreen devices are well integrated into daily life and can be found in both personal and public spaces, but the inclusion of accessible features and interfaces continues to lag behind technology's exponential advancement. This thesis explores the experiences of individuals who are blind or have low vision (BLV) while interacting with non-tactile touchscreens, such as smartphones, tablets, smartwatches, coffee machines, smart home devices, kiosks, ATMs, and more. The goal of this research is to create a set of recommended guidelines for designing and developing personal devices or shared public technologies with accessible touchscreens. The study consists of three phases: an exploration of existing research on the accessibility of non-tactile touchscreens; semi-structured interviews with 20 BLV individuals to address accessibility gaps in previous work; and a survey to better understand the experiences, thoughts, and barriers of BLV individuals interacting with touchscreen devices. Common themes found include loss of independence, lack or uncertainty of accessibility features, and the need and desire for improvements. Common approaches to interaction were the use of high markings, asking for sighted assistance, and avoiding touchscreen devices. These findings were used to create a set of recommended guidelines, which include a universal feature setup, the setup of accessibility settings, a universal headphone jack position, tactile feedback, an ask-for-help button, situational lighting, and the consideration of time.

    Designing usable mobile interfaces for spatial data

    2010-2011. This dissertation deals mainly with the discipline of Human-Computer Interaction (HCI), with particular attention to the role it plays in the domain of modern mobile devices. Mobile devices today offer crucial support for a plethora of daily activities for nearly everyone. Ranging from checking business mail while travelling, to accessing social networks in a mall, to carrying out business transactions while out of the office, to using all kinds of online public services, mobile devices play the important role of connecting people while physically apart. Modern mobile interfaces are therefore expected to improve the user's interaction experience with the surrounding environment and to offer different adaptive views of the real world. The goal of this thesis is to enhance the usability of mobile interfaces for spatial data, i.e. data in which the spatial component plays an important role in clarifying the meaning of the data themselves. Nowadays this kind of data is widespread in mobile applications: spatial data are present in games, map applications, mobile community applications, and office automation. To enhance the usability of spatial data interfaces, my research investigates two major issues: 1. enhancing the visualization of spatial data on small screens, and 2. enhancing text-input methods. I selected the Design Science Research approach to investigate these research questions. The idea underlying this approach is that "you build an artifact to learn from it"; in other words, researchers clarify what is new in their design. The new knowledge derived from the artifact is presented in the form of interaction design patterns, in order to support developers in dealing with issues of mobile interfaces. The thesis is organized as follows. Initially I present the broader context, the research questions, and the approaches I used to investigate them. The results are then split into two main parts.
    In the first part I present a visualization technique called Framy, designed to support users in visualizing geographical data in mobile map applications. I also introduce a multimodal extension of Framy obtained by adding sounds and vibrations, and then present the process that turned the multimodal interface into a means for visually impaired users to interact with Framy. Some projects involving the design principles of Framy are shown in order to demonstrate the adaptability of the technique to different contexts. The second part concerns text-input methods; in particular I focus on work done in the area of virtual keyboards for mobile devices. A new kind of virtual keyboard called TaS provides users with an input system that is more efficient and effective than the traditional QWERTY keyboard. Finally, in the last chapter, the knowledge acquired is formalized in the form of interaction design patterns. [edited by author]
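Framy's central idea, as described above, is to color segments of a frame along the screen borders so that each segment's intensity reflects how much data lies off-screen in that direction. A minimal sketch of that computation (the viewport representation, direction priority for corner points, and normalization are illustrative assumptions, not Framy's actual implementation):

```python
def frame_intensities(points, viewport):
    """Count off-screen points per direction and normalize to [0, 1].

    `viewport` is (left, right, bottom, top) in map coordinates.
    Each returned intensity would drive the color saturation of the
    corresponding border segment of the on-screen frame.
    """
    left, right, bottom, top = viewport
    counts = {"left": 0, "right": 0, "up": 0, "down": 0}
    for x, y in points:
        # Horizontal overflow takes priority for corner points (a
        # simplification; a real implementation might split corners).
        if x < left:
            counts["left"] += 1
        elif x > right:
            counts["right"] += 1
        elif y > top:
            counts["up"] += 1
        elif y < bottom:
            counts["down"] += 1
    total = sum(counts.values()) or 1  # avoid division by zero
    return {d: c / total for d, c in counts.items()}
```

A multimodal extension along the lines the abstract mentions could map these same intensities to vibration strength or volume instead of color.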

    Developing an interactive overview for non-visual exploration of tabular numerical information

    This thesis investigates the problem of obtaining overview information from complex tabular numerical data sets non-visually. Blind and visually impaired people need to access and analyse numerical data, both in education and in professional occupations. Obtaining an overview is a necessary first step in data analysis, for which current non-visual data accessibility methods offer little support. This thesis describes a new interactive parametric sonification technique called High-Density Sonification (HDS), which facilitates the process of extracting overview information from the data easily and efficiently by rendering multiple data points as single auditory events. Beyond obtaining an overview of the data, experimental studies showed that the capabilities of human auditory perception and cognition to extract meaning from HDS representations could be used to reliably estimate relative arithmetic mean values within large tabular data sets. Following a user-centred design methodology, HDS was implemented as the primary form of overview information display in a multimodal interface called TableVis. This interface supports the active process of interactive data exploration non-visually, making use of proprioception to maintain contextual information during exploration (non-visual focus+context), vibrotactile data annotations (EMA-Tactons) that can be used as external memory aids to prevent high mental workload levels, and speech synthesis to access detailed information on demand. A series of empirical studies was conducted to quantify the performance attained in the exploration of tabular data sets for overview information using TableVis. 
    This was done by comparing HDS with the main current non-visual accessibility technique (speech synthesis) and by quantifying the effect of data set size on user performance; HDS resulted in better performance than speech, and this performance was not heavily dependent on the size of the data set. In addition, levels of subjective workload during exploration tasks with TableVis were investigated, leading to the proposal of EMA-Tactons, vibrotactile annotations that the user can add to the data to prevent working-memory saturation in the most demanding data exploration scenarios. An experimental evaluation found that EMA-Tactons significantly reduced mental workload in data exploration tasks. The work described in this thesis thus provides a basis for interactive non-visual exploration of numerical data tables across a broad range of sizes, offering techniques to extract overview information quickly, to perform perceptual estimations of data descriptors (relative arithmetic mean), and to manage mental workload through vibrotactile data annotations, while seamlessly linking to explorations at different levels of detail and preserving spatial data representation metaphors to support collaboration with sighted users.
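The core of High-Density Sonification as summarized above is rendering many data points as one auditory event, so that two rows or columns can be compared by ear. A hypothetical sketch of that mapping (the pitch range, scaling, and function name are assumptions for illustration, not the thesis's parameters):

```python
def hds_pitch(row, lo, hi, midi_lo=48, midi_hi=84):
    """Render a whole row of a table as ONE auditory event.

    The row's arithmetic mean is scaled linearly from the data range
    [lo, hi] into a MIDI pitch range; comparing two such events by ear
    then gives a relative estimate of the rows' means, which is the
    kind of perceptual estimation HDS supports.
    """
    mean = sum(row) / len(row)
    frac = (mean - lo) / (hi - lo)          # position within data range
    return round(midi_lo + frac * (midi_hi - midi_lo))

# Two hypothetical table rows over a shared data range [0, 40]:
high_row = hds_pitch([15, 20, 25], 0, 40)   # higher mean -> higher pitch
low_row = hds_pitch([0, 5, 10], 0, 40)
```

A single pitch per row is the coarsest rendering; an interface like TableVis would let the user drill down from this overview to individual cells via speech on demand.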

    PapierCraft: a paper-based interface to support interaction with digital documents

    Many researchers interact extensively with documents using both computers and paper printouts, which offer complementary strengths. Paper is comfortable to read from and write on, and is flexible to arrange in space; computers provide efficient ways to archive, transfer, search, and edit information. However, due to the gap between the two media, it is difficult to integrate them seamlessly to optimize the user's experience of document interaction. Existing solutions either sacrifice paper's inherent flexibility or support very limited digital functionality on paper. In response, we have proposed PapierCraft, a novel paper-based interface that supports rich digital facilities on paper without sacrificing paper's flexibility. Employing emerging digital pen technology and multimodal pen-top feedback, PapierCraft allows people to use a digital pen to draw gesture marks on a printout, which are captured, interpreted, and applied to the corresponding digital copy. Conceptually, the pen and the paper form a paper-based computer, able to interact with other paper sheets and computing devices for operations like copy/paste, hyperlinking, and web searches. Furthermore, it retains the full range of paper's advantages through a lightweight, pen-and-paper-only interface. By combining the advantages of paper and digital media and by supporting a smooth transition between them, PapierCraft bridges the paper-computer gap. The contributions of this dissertation fall into four areas. First, to accommodate the static nature of paper, we proposed a pen-gesture command system that relies not on screen-rendered feedback but on the self-explanatory pen ink left on the paper. Second, for more interactive tasks, such as searching for keywords on paper, we explored pen-top multimodal (e.g. auditory, visual, and tactile) feedback that enhances the command system without sacrificing paper's inherent flexibility.
    Third, we designed and implemented a multi-tier distributed infrastructure to map pen-paper interactions to digital operations and to unify document interaction on paper and on computers. Finally, we systematically evaluated PapierCraft through three lab experiments and two application deployments in the areas of field biology and e-learning. Our research has demonstrated the feasibility, usability, and potential applications of the paper-based interface, shedding light on the design of future interfaces for digital document interaction. More generally, our research also contributes to ubiquitous computing, mobile interfaces, and pen computing.
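The pipeline described above, in which gesture marks on paper are interpreted and replayed against the digital copy, can be sketched at its simplest as a dispatch from recognized gestures to operations on a document model. This is a hypothetical illustration only; the document model, gesture representation, and operation names are assumptions, not PapierCraft's actual command set:

```python
def apply_gesture(doc, clipboard, gesture):
    """Apply one interpreted pen gesture to the digital copy.

    `doc` maps page numbers to named regions (a stand-in for the
    digital document); `gesture` is the recognizer's output: an
    operation plus the page and region the marks selected.
    """
    if gesture["op"] == "copy":
        # A scope mark on paper selects a region; hold it on a clipboard.
        clipboard = doc[gesture["page"]][gesture["region"]]
    elif gesture["op"] == "paste":
        # A paste mark elsewhere transfers the copied content.
        doc[gesture["page"]][gesture["region"]] = clipboard
    return doc, clipboard

# Copy a figure from page 1 of one printout and paste it into page 2.
doc = {1: {"fig": "diagram"}, 2: {"slot": ""}}
doc, clip = apply_gesture(doc, None, {"op": "copy", "page": 1, "region": "fig"})
doc, clip = apply_gesture(doc, clip, {"op": "paste", "page": 2, "region": "slot"})
```

Because paper gives no screen feedback, the ink of the gesture marks themselves is the only visible record of these commands until the digital copy is synchronized.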

    Taux: a system for evaluating sound feedback in navigational tasks

    This thesis presents the design and development of an evaluation system for generating audio displays that provide feedback to persons performing navigation tasks. It first motivates the need for such a system by describing existing wayfinding solutions, investigating new electronic location-based methods that have the potential to change these solutions, and examining research on relevant audio information representation techniques. An evaluation system that supports the manipulation of two basic classes of audio display is then described. Based on prior work on wayfinding with audio display, research questions are developed that investigate the viability of different audio displays. These are used to generate hypotheses and to develop an experiment that evaluates four variations of audio display for wayfinding; questions are also formulated to evaluate a baseline condition using visual feedback. An experiment testing these hypotheses on sighted users is then described. Its results suggest that spatial audio combined with spoken hints performed best among the spatial audio approaches compared, and that muting a varying audio signal while the subject is on course did not improve performance. The system and method are then refined, and a second experiment is conducted with improved displays and an improved methodology. After adding blindfolds for sighted subjects and increasing the difficulty of the navigation tasks by reducing the arrival radius, similar results were observed. Overall, the two experiments demonstrate the viability of the prototyping tool for testing and refining multiple audio display combinations for navigational tasks. The detailed contributions of this work and future research opportunities conclude the thesis.

    Tablet PCs in schools: case study report


    Teaching Visually Impaired College Students in Introductory Statistics

    Instructors of postsecondary classes in statistics rely heavily on visuals in their teaching, both within the classroom and in resources like textbooks, handouts, and software, but this information is often inaccessible to students who are blind or visually impaired (BVI). The unique challenges involved in adapting both pedagogy and course materials to accommodate a BVI student may provoke anxiety among instructors teaching a BVI student for the first time, and instructors may end up feeling unprepared or "reinventing the wheel." We discuss a wide variety of accommodations inside and outside of the classroom, grounded in the empirical literature on cognition and learning and informed by our own experience teaching a blind student in an introductory statistics course.

    New knowledge and methods for mitigating driver distraction

    Driver distraction is the diversion of attention to a non-driving-related activity and has been identified as a major cause of accidents. Even as we move away from the traditional 'driver' and towards highly automated vehicles, distraction remains an important issue: a distracted driver could still miss a handover-of-control message from the car, or have reduced awareness of the traffic environment. With the increasing number and complexity of new features being introduced in vehicles, it is becoming more important to understand how drivers interact with them, to understand the benefit they offer in helping the driver focus on the road, but also to identify their limitations and risks. The interaction between human and technology, including driver distraction, can be described by many measures; to learn the most about it, it is important to select a suitable measure and to apply it according to best practice, which can be hard to find in the literature. This research project is divided into two research streams: one investigates the opportunities of new in-vehicle interfaces to mitigate driver distraction, and the other investigates how to efficiently identify measures for the ergonomic evaluation of in-vehicle interfaces. Research stream one, comprising four studies, evaluated tactile information as a new interface technology to mitigate distraction in manual and automated cars. Tactile perception requires physical contact between the driver and the device delivering the feedback, and it can be attenuated by clothing. The first user trial evaluated, for the first time, how shoe type, gender, and age influence the driver's perception of a tactile pedal: shoe type did not influence perception, but gender, age, and the feedback's duration and amplitude did.
    At some durations and amplitudes, the feedback was recognised by all participants and rated as highly intense, both properties a warning should have. Next, it was evaluated how quickly people react to a tactile warning compared to a traditional auditory warning and an auditory-tactile warning. Participants reacted significantly more slowly to the tactile warning, so a tactile warning alone might not be suitable as an in-vehicle warning; however, adding an auditory component to the tactile warning increases its efficiency, and participants missed fewer auditory-tactile warnings than auditory-only warnings. Newly introduced interfaces, such as tactile interfaces, require drivers to adjust to them and might lead to unsafe interactions. The third and fourth studies therefore investigated how a driver's trust affects reaction time and glance behaviour. Trust was not associated with reaction time to a tactile warning signal, but it did influence glances at a voice-navigation interface that was new to the majority of participants. These findings can be used to increase trust in the interface dialogue and thereby decrease the time a driver spends glancing off-road. Research stream two investigated how Human-Machine Interface (HMI) engineers can be supported in comparing and selecting measures (e.g. a usability score) for evaluating the ergonomics of in-vehicle devices, for example to measure driver distraction. Industry projects are often constrained by tight deadlines and limited availability of equipment, so measure selection can become time-critical, yet no guidelines existed in the published literature to support this task. Across four rapid prototyping evaluations, an interface was developed to aid HMI engineers in the comparison and selection of measures for an ergonomic evaluation.
    The tool functions as a knowledge-management aid, informing users about best practice for utilising a measure, providing tips for setting up the required equipment, and supplying templates for the measure, for example analysis templates or electronic versions of questionnaires.