
    Review of Machine Vision-Based Electronic Travel Aids

    Visually impaired people face navigation and mobility problems on the road. Many approaches have been developed to help them navigate using different sensing techniques. This paper reviews several machine vision-based Electronic Travel Aids (ETAs) and compares them with those using other sensing techniques. The functionalities of machine vision-based ETAs are classified from low-level image processing, such as detecting road regions and obstacles, to high-level functionalities, such as recognizing digital tags and text. In addition, the characteristics of ETA systems for blind people are discussed in detail.

    A Formal Approach to Computer Aided 2D Graphical Design for Blind People

    The potential of computer-aided drawing systems for blind people (CADB) has long been recognised, and interest in them has grown within the assistive technology research area. Research and development on the representation of pictorial data by blind and visually impaired (BVI) people has recently gathered momentum; however, a survey of the published literature on CADB reveals that only marginal research has focused on a formal approach to on-screen spatial orientation and the creation and reuse of graphics artefacts. To realise the full potential of CADB, such systems should offer usability, spatial navigation and shape-creation features, without which blind users' drawing activities are unlikely to succeed. As a result, usable, effective and self-reliant CADB have arisen from new assistive technology (AT) research. This thesis contributes a novel, abstract, formal approach that enables BVI users to navigate on screen and create computer graphics/diagrams using 2D shapes and user-defined images. Moreover, the research addresses the specific issues of user language by formulating rules that make BVI users' interaction with the drawing easier and more effective. The formal approach proposed here is descriptive and is specified at a level of abstraction above concrete system technologies. The proposed approach is unique in its problem modelling and in its synthesis of abstract computer-based graphics/drawings from a formal set of user interaction commands. This technology has been applied to enable blind users to construct drawings independently, satisfying their specific needs without recourse to a specific technology and without the intervention of support workers. The specification aims to be the foundation for a system scope, investigation guidelines and user-initiated, command-driven interaction. Such an approach allows system designers and developers to proceed with greater conceptual clarity than is possible with current technologies built on concrete system-driven prototypes. The proposed model has also been verified by various types of blind users, who independently constructed drawings to satisfy their specific needs without the intervention of support workers. The effectiveness and usability of the proposed approach were compared against conventional, non-command-driven drawing systems by different types of blind users. The results confirm that the command-driven abstract formal approach proposed here enables greater comprehension by BVI users in the context of CADB, and they show how the specification aids the design of such a system. The innovation can be used for both educational and training purposes.
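
    The abstract above describes a formal, command-driven way of constructing 2D drawings but does not give the command syntax itself. The sketch below is therefore only a hypothetical illustration of command-driven drawing interaction: the command names (circle, line, move) and their argument forms are invented for this example and are not the thesis's specification.

```python
# Hypothetical illustration of command-driven drawing interaction.
# The command names and syntax below are NOT taken from the thesis;
# they only sketch how a formal command set could map typed or spoken
# commands onto 2D graphics primitives.
import re

COMMANDS = {
    # command name -> (regex for its arguments, argument labels)
    "circle": (r"radius (\d+) at (\d+) (\d+)", ("radius", "x", "y")),
    "line":   (r"from (\d+) (\d+) to (\d+) (\d+)", ("x1", "y1", "x2", "y2")),
    "move":   (r"to (\d+) (\d+)", ("x", "y")),
}

def parse_command(text):
    """Parse one drawing command into a structured action, or None if invalid."""
    name, _, rest = text.strip().partition(" ")
    if name not in COMMANDS:
        return None
    pattern, labels = COMMANDS[name]
    match = re.fullmatch(pattern, rest.strip())
    if match is None:
        return None
    return {"action": name, **{k: int(v) for k, v in zip(labels, match.groups())}}

drawing = [parse_command(c) for c in (
    "circle radius 50 at 100 100",
    "line from 0 0 to 200 200",
)]
print(drawing)
```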

    Developing an interactive overview for non-visual exploration of tabular numerical information

    This thesis investigates the problem of obtaining overview information from complex tabular numerical data sets non-visually. Blind and visually impaired people need to access and analyse numerical data, both in education and in professional occupations. Obtaining an overview is a necessary first step in data analysis, for which current non-visual data accessibility methods offer little support. This thesis describes a new interactive parametric sonification technique called High-Density Sonification (HDS), which facilitates the process of extracting overview information from the data easily and efficiently by rendering multiple data points as single auditory events. Beyond obtaining an overview of the data, experimental studies showed that the capabilities of human auditory perception and cognition to extract meaning from HDS representations could be used to reliably estimate relative arithmetic mean values within large tabular data sets. Following a user-centred design methodology, HDS was implemented as the primary form of overview information display in a multimodal interface called TableVis. This interface supports the active process of interactive data exploration non-visually, making use of proprioception to maintain contextual information during exploration (non-visual focus+context), vibrotactile data annotations (EMA-Tactons) that can be used as external memory aids to prevent high mental workload levels, and speech synthesis to access detailed information on demand. A series of empirical studies was conducted to quantify the performance attained in the exploration of tabular data sets for overview information using TableVis, by comparing HDS with the main current non-visual accessibility technique (speech synthesis) and by quantifying the effect of different data set sizes on user performance. These studies showed that HDS resulted in better performance than speech, and that this performance was not heavily dependent on the size of the data set. In addition, levels of subjective workload during exploration tasks using TableVis were investigated, resulting in the proposal of EMA-Tactons, vibrotactile annotations that the user can add to the data in order to prevent working memory saturation in the most demanding data exploration scenarios. An experimental evaluation found that EMA-Tactons significantly reduced mental workload in data exploration tasks. Thus, the work described in this thesis provides a basis for the interactive non-visual exploration of numerical data tables of a broad range of sizes, offering techniques to extract overview information quickly, perform perceptual estimations of data descriptors (relative arithmetic mean) and manage demands on mental workload through vibrotactile data annotations, while seamlessly linking with explorations at different levels of detail and preserving spatial data representation metaphors to support collaboration with sighted users.
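
    As a rough illustration of the idea behind High-Density Sonification, the sketch below renders each row of a numerical table as one short tone whose pitch encodes the row's arithmetic mean, so that a whole row becomes a single auditory event. The frequency range, tone duration and output format are assumptions made for this example and are not the parameters used in TableVis.

```python
# Minimal sketch of High-Density Sonification: each table row becomes a
# single tone whose pitch encodes the row's arithmetic mean.  Frequency
# range, tone length and sample rate are illustrative assumptions, not
# the parameters used in TableVis.
import numpy as np
import wave

RATE = 44100
TONE_SEC = 0.25
F_LOW, F_HIGH = 200.0, 1200.0   # pitch range for the lowest/highest row mean

def sonify_rows(table):
    """Return one concatenated audio signal with one tone per row."""
    means = table.mean(axis=1)
    lo, hi = means.min(), means.max()
    t = np.linspace(0.0, TONE_SEC, int(RATE * TONE_SEC), endpoint=False)
    tones = []
    for m in means:
        # map the row mean linearly onto the frequency range
        frac = 0.0 if hi == lo else (m - lo) / (hi - lo)
        freq = F_LOW + frac * (F_HIGH - F_LOW)
        tones.append(0.3 * np.sin(2.0 * np.pi * freq * t))
    return np.concatenate(tones)

table = np.random.default_rng(0).integers(0, 100, size=(10, 12))
signal = sonify_rows(table)
with wave.open("overview.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(2)          # 16-bit samples
    f.setframerate(RATE)
    f.writeframes((signal * 32767).astype(np.int16).tobytes())
```

    Listening to the resulting sequence gives a quick impression of which rows have higher or lower means, which is the kind of overview judgement the thesis measures.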

    Exploring the Use of Wearables to Enable Indoor Navigation for Blind Users

    One of the challenges that people with visual impairments (VI) have to confront daily is navigating independently through foreign or unfamiliar spaces. Navigating through unfamiliar spaces without assistance is very time-consuming and leads to lower mobility. Especially in indoor environments, where the use of GPS is impossible, this task becomes even harder. However, advances in mobile and wearable computing pave the way to new, inexpensive assistive technologies that can make the lives of people with VI easier. Wearable devices have great potential for assistive applications for users who are blind, as they typically feature a camera and support hands-free and eye-free interaction. Smart watches and heads-up displays (HUDs), in combination with smartphones, can provide a basis for developing advanced algorithms capable of providing inexpensive solutions for navigation in indoor spaces. New interfaces are also introduced, making the interaction between users who are blind and mobile devices more intuitive. This work presents a set of new systems and technologies created to help users with VI navigate indoor environments. The first system presented is an indoor navigation system for people with VI that operates using sensors found in mobile devices and virtual maps of the environment. The second system helps users navigate large open spaces with minimum veering. Next, a study is conducted to determine the accuracy of pedometry based on different body placements of the accelerometer sensors. Finally, a gesture detection system is introduced that supports communication between the user and mobile devices by using sensors in wearable devices.
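
    The abstract mentions a study of pedometry accuracy for different accelerometer placements but does not describe the step-detection method. The sketch below shows one common, minimal approach, counting peaks in the acceleration magnitude above a threshold; the threshold and timing values are assumptions for this illustration, not those evaluated in the thesis.

```python
# Minimal sketch of accelerometer-based pedometry: count steps as peaks in
# the acceleration magnitude that exceed a threshold, with a refractory
# period between steps.  Threshold and timing values are illustrative
# assumptions, not those evaluated in the thesis.
import numpy as np

GRAVITY = 9.81           # m/s^2
THRESHOLD = 11.0         # magnitude a peak must exceed to count as a step
MIN_STEP_INTERVAL = 0.3  # seconds between consecutive steps

def count_steps(acc_xyz, sample_rate):
    """acc_xyz: (N, 3) array of accelerometer samples in m/s^2."""
    magnitude = np.linalg.norm(acc_xyz, axis=1)
    min_gap = int(MIN_STEP_INTERVAL * sample_rate)
    steps, last_step = 0, -min_gap
    for i in range(1, len(magnitude) - 1):
        is_peak = magnitude[i - 1] < magnitude[i] >= magnitude[i + 1]
        if is_peak and magnitude[i] > THRESHOLD and i - last_step >= min_gap:
            steps += 1
            last_step = i
        # samples below the threshold or too close to the last step are ignored
    return steps

# Synthetic walking signal: ~2 steps per second for 10 seconds at 50 Hz.
rate = 50
t = np.arange(0, 10, 1 / rate)
acc = np.zeros((len(t), 3))
acc[:, 2] = GRAVITY + 2.5 * np.sin(2 * np.pi * 2.0 * t)
print(count_steps(acc, rate))   # roughly 20 steps expected
```

    Comparing counts like this against ground-truth step counts for each body placement is one straightforward way to quantify pedometry accuracy.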

    Schematisation in Hard-copy Tactile Orientation Maps

    This dissertation investigates the schematisation of computer-generated tactile orientation maps that support the mediation of spatial knowledge of unknown urban environments. Computer-generated tactile orientation maps are designed to provide blind readers with an overall impression of their surroundings. Their details are displayed by means of elevated features that are created by embossers and can be distinguished by touch. The initial observation of this dissertation is that only very little information is actually transported through tactile maps, owing to the coarse resolution of the tactual senses and the cognitive effort involved in the serial exploration of tactile maps. However, the differences between computer-generated, embossed tactile maps and manufactured, deep-drawn tactile maps are significant. Therefore, the possibilities and confines of communicating information through tactile maps produced with embossers are a primary area of research. This dissertation demonstrates that the quality of embossed prints makes them an almost equal alternative to traditionally manufactured deep-drawn maps. Their great advantages are fast and individual production, (apart from the initial procurement costs for the printer) low price, accessibility, and easy understanding without the need for prior time-consuming training. Simplification of tactile maps is essential, even more so than in other maps. It can be achieved by selecting a limited number of all available map elements. Qualitative simplification through schematisation may present an additional option to simplification through quantitative selection. In this context, schematisation is understood as cognitively motivated simplification of geometry with synchronised maintenance of topology. Rather than further reducing the number of displayed objects, the investigation concentrates on how the presentation of different forms of streets (natural vs. straightened) and junctions (natural vs. prototypical) affects the transfer of knowledge. In a second area of research, the thesis establishes that qualitative simplification of tactile orientation maps through schematisation can enhance their usability and make them easier to understand than maps that have not been schematised. The dissertation shows that simplifying street forms and limiting junctions to prototypical forms not only accelerates map exploration but also has a beneficial influence on retention performance. The majority of participants in the investigation selected a combination of both as their preferred display option. Tactile maps that have to be tediously explored through touch, uncovering every detail, complicate attaining a first impression or an overall perception. A third area of research therefore examines which aids could help map readers discover certain objects on the map quickly, without possessing a complete overview. Three types of aids are examined: guiding lines leading from the frame of the map to the object, position indicators represented by position markers at the frame of the map, and coordinate specifications within a grid on the map. The dissertation shows that all three varieties can be realised with embossers. Although a guiding line proves to be fast in A4-size tactile maps containing only one target object and few distracting objects, it also impedes further exploration of the map (similar to the grid). The advantages and drawbacks of the various aids in this and other applications are then discussed. In conclusion, the dissertation elaborates on how the three examinations connect and argues that cognitively motivated simplification should be a construction principle for embossed tactile orientation maps in order to support their use and comprehension. A summary presents the recommendations resulting from this dissertation for the construction of tactile orientation maps, taking the limitations imposed by embosser constraints into account. I then consider how to adapt the schematisation of other maps according to their intended function, the previous knowledge of the map reader, and the relation between the time at which knowledge is acquired and the time at which it is employed. The dissertation closes with an insight into its limitations and conclusions and with a prospective view of possible transfers of the findings to other applications, e.g. multimedia or interactive maps on pin-matrix displays and devices.
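
    To make the straightening idea concrete, the sketch below redraws a street polyline so that each segment keeps its length while its bearing snaps to the nearest prototypical direction (a multiple of 45 degrees). This is only one simple way to picture schematisation; it omits the topology checks the dissertation requires and is not the dissertation's algorithm.

```python
# Illustrative sketch of schematisation: redraw a street polyline so that
# each segment keeps its length but its bearing snaps to the nearest
# multiple of 45 degrees (prototypical directions).  Topology preservation,
# which the dissertation requires, is not handled here.
import math

def schematise(polyline, step_deg=45.0):
    """polyline: list of (x, y) points; returns the schematised points."""
    out = [polyline[0]]
    for (x0, y0), (x1, y1) in zip(polyline, polyline[1:]):
        length = math.hypot(x1 - x0, y1 - y0)
        bearing = math.degrees(math.atan2(y1 - y0, x1 - x0))
        snapped = math.radians(round(bearing / step_deg) * step_deg)
        px, py = out[-1]
        out.append((px + length * math.cos(snapped),
                    py + length * math.sin(snapped)))
    return out

# A slightly wiggly street becomes a sequence of straight, prototypical legs.
street = [(0, 0), (10, 1), (19, 12), (30, 14)]
print(schematise(street))
```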

    Teaching Learners with Visual Impairment

    This book, Teaching Learners with Visual Impairment, focuses on holistic support for learners with visual impairment in and beyond the classroom and school context. Special attention is given to classroom practice, learning support, curriculum differentiation and assessment practices, to mention but a few of the areas covered in the book. In this way, the book makes a significant contribution to the existing body of knowledge on the implementation of inclusive education policy with learners affected by visual impairment.

    HapticHead - Augmenting Reality via Tactile Cues

    Information overload is increasingly becoming a challenge in today's world. Humans have only a limited amount of attention to allocate between sensory channels and tend to miss or misjudge critical sensory information when multiple activities are going on at the same time. For example, people may miss the sound of an approaching car when walking across the street while looking at their smartphones. Some sensory channels may also be impaired due to congenital or acquired conditions. Among sensory channels, touch is often experienced as obtrusive, especially when it occurs unexpectedly. Since tactile actuators can simulate touch, targeted tactile stimuli can provide users of virtual reality and augmented reality environments with important information for navigation, guidance, alerts, and notifications. In this dissertation, a tactile user interface around the head, called HapticHead, is presented to relieve or replace a potentially impaired visual channel. It is a high-resolution, omnidirectional, vibrotactile display that presents general, 3D directional, and distance information through dynamic tactile patterns. The head is well suited for tactile feedback because it is sensitive to mechanical stimuli and provides a large spherical surface area that enables the display of precise 3D information and allows the user to intuitively rotate the head in the direction of a stimulus based on natural mapping. Basic research on tactile perception on the head and studies on various use cases of head-based tactile feedback are presented in this thesis. Several investigations and user studies have been conducted on (a) the funneling illusion and localization accuracy of tactile stimuli around the head, (b) the ability of people to discriminate between different tactile patterns on the head, (c) approaches to designing tactile patterns for complex arrays of actuators, (d) increasing the immersion and presence level of virtual reality applications, and (e) assisting people with visual impairments in guidance and micro-navigation. In summary, tactile feedback around the head was found to be highly valuable as an additional information channel in various application scenarios. Most notable is the navigation of visually impaired individuals through a micro-navigation obstacle course, which is an order of magnitude more accurate than the previous state of the art, which used a tactile belt as the feedback modality. The HapticHead tactile user interface's ability to safely guide people with visual impairments around obstacles and on stairs, with a mean deviation from the optimal path of less than 6 cm, may ultimately improve the quality of life for many people with visual impairments.
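
    As an illustration of how 3D directional and distance information might be turned into a dynamic tactile pattern, the sketch below weights a small set of head-mounted actuators by the cosine of the angle between each actuator's direction and the target direction, and scales the result by proximity. The actuator layout and the weighting scheme are assumptions for this example, not the actual HapticHead hardware or rendering method.

```python
# Illustrative sketch of mapping a 3D target direction to vibration
# intensities on a head-mounted actuator array.  The actuator layout and
# the cosine weighting are assumptions for illustration, not the actual
# HapticHead hardware or rendering scheme.
import numpy as np

# Hypothetical actuator directions on a unit sphere around the head.
ACTUATORS = {
    "front": np.array([1.0, 0.0, 0.0]),
    "right": np.array([0.0, 1.0, 0.0]),
    "back":  np.array([-1.0, 0.0, 0.0]),
    "left":  np.array([0.0, -1.0, 0.0]),
    "top":   np.array([0.0, 0.0, 1.0]),
}

def intensities(target, distance, max_distance=5.0):
    """Per-actuator intensity in [0, 1] for a target direction and distance."""
    direction = target / np.linalg.norm(target)
    closeness = max(0.0, 1.0 - distance / max_distance)  # nearer -> stronger
    out = {}
    for name, axis in ACTUATORS.items():
        alignment = max(0.0, float(np.dot(direction, axis)))  # cosine weighting
        out[name] = round(alignment * closeness, 2)
    return out

# Target slightly to the front-right of the head, 2 m away.
print(intensities(np.array([1.0, 0.5, 0.0]), distance=2.0))
```

    Driving neighbouring actuators with graded intensities like this is also the kind of stimulus pattern that can exploit the funneling illusion studied in the thesis.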
