
    Radar Like Device Using Sonar Technology for Object Detection and Analysis

    The system uses an Arduino coupled with an ultrasonic sensor to detect targets in its surroundings. Ultrasonic sensors are currently under-utilised, and this device tests the possibilities and limitations of using sound waves for object detection and analysis. Waves are emitted into the environment and, on collision with external objects, are reflected back; these reflected waves and their properties are analysed to detect objects
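The echo-ranging principle this abstract describes can be sketched as follows. This is an illustrative sketch of the physics, not the project's actual firmware; the constant and function name are assumptions:

```python
# Speed of sound in air at roughly 20 °C; the device's actual calibration
# value is not given in the abstract.
SPEED_OF_SOUND_M_S = 343.0

def echo_distance_m(round_trip_s: float) -> float:
    """Distance to a reflecting object, given the echo round-trip time.

    The ultrasonic pulse travels to the object and back, so the
    one-way distance is half of the total path length.
    """
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

# A 2 ms round trip corresponds to an object about 0.343 m away.
print(echo_distance_m(0.002))
```

On an actual Arduino the round-trip time would come from timing the sensor's echo pulse; the calculation itself is the same.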

    Towards a Multimodal Adaptive Lighting System for Visually Impaired Children

    Visually impaired children often have difficulty with everyday activities such as locating items, e.g. favourite toys, and moving safely around the home. It is important to assist them during activities like these because it can promote independence from adults and help them develop skills. Our demonstration shows our work towards a multimodal sensing and output system that adapts the lighting conditions at home to help visually impaired children with such tasks

    Automatically Adapting Home Lighting to Assist Visually Impaired Children

    For visually impaired children, activities like finding everyday items, locating favourite toys and moving around the home can be challenging. Assisting them during these activities is important because it promotes independence and encourages them to use and develop their remaining visual function. We describe our work towards a system that adapts the lighting conditions at home to help visually impaired children with everyday tasks. We discuss scenarios that show how they may benefit from adaptive lighting, report on our progress and describe our planned future work and evaluation

    Blindsight: Proximity Sensing and Wireless Haptic Feedback

    A staggering 98 percent of blind people have experienced head-level accidents, some requiring professional medical assistance. Current aids such as the white cane and guide dogs share a significant flaw: the inability to detect objects at head level. The objective of this project was therefore to create a more effective and affordable means of detecting and preventing head-level injuries in blind people. Blindsight uses ultrasonic sensors to detect obstacles and transmits this information to haptic vibration motors, which then alert the user to incoming obstacles
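A common way to drive haptic feedback from a distance sensor, sketched here as an illustration (the abstract does not specify Blindsight's actual mapping; the range limit and names are assumptions), is to scale motor intensity inversely with obstacle distance:

```python
def vibration_strength(distance_m: float, max_range_m: float = 2.0) -> float:
    """Map an obstacle distance to a motor intensity in [0, 1].

    Obstacles at or beyond max_range_m produce no vibration;
    closer obstacles produce proportionally stronger vibration.
    """
    if distance_m >= max_range_m:
        return 0.0
    return 1.0 - distance_m / max_range_m

# An obstacle at 0.5 m with a 2 m range gives 75% intensity.
print(vibration_strength(0.5))
```

The returned value could then be used, for example, as a PWM duty cycle for the vibration motor.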

    How much spatial information is lost in the sensory substitution process? Comparing visual, tactile, and auditory approaches

    Sensory substitution devices (SSDs) can convey visuospatial information through spatialised auditory or tactile stimulation using wearable technology. However, the level of information loss associated with this transformation is unknown. In this study novice users discriminated the location of two objects at 1.2 m using devices that transformed a 16×8 depth map into spatially distributed patterns of light, sound, or touch on the abdomen. Results showed that through active sensing, participants could discriminate the vertical position of objects to a visual angle of 1°, 14°, and 21°, and their distance to 2 cm, 8 cm, and 29 cm using these visual, auditory, and haptic SSDs respectively. Visual SSDs significantly outperformed auditory and tactile SSDs on vertical localisation, whereas for depth perception, all devices significantly differed from one another (visual > auditory > haptic). Our findings highlight the high level of acuity possible for SSDs even with low spatial resolutions (e.g. 16×8) and quantify the level of information loss attributable to this transformation for the SSD user. Finally, we discuss ways of closing this ‘modality gap’ found in SSDs and conclude that this process is best benchmarked against performance with SSDs that return to their primary modality (e.g. visuospatial into visual)
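The core transformation these SSDs perform, a low-resolution depth map converted into spatially distributed stimulus intensities, can be sketched as follows. This is an illustrative assumption about the encoding, not the study's actual algorithm; the clipping depth and names are invented:

```python
def depth_to_stimulus(depth_map, max_depth_m=3.0):
    """Convert a depth map (metres, rows of floats) to stimulus
    intensities in [0, 1], nearer surfaces stimulating more strongly.

    Depths at or beyond max_depth_m are clipped and produce no
    stimulation; depth 0 produces full-strength stimulation.
    """
    return [
        [max(0.0, 1.0 - min(d, max_depth_m) / max_depth_m) for d in row]
        for row in depth_map
    ]

# A toy 1x3 "depth map": near, at the limit, and beyond it.
print(depth_to_stimulus([[0.0, 3.0, 6.0]]))
```

With a 16×8 map, each of the 128 cells would drive one light, sound, or tactile element on the abdomen.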

    Auditory Displays and Assistive Technologies: the use of head movements by visually impaired individuals and their implementation in binaural interfaces

    Visually impaired people rely upon audition for a variety of purposes; among these is the use of sound to identify the position of objects in their surrounding environment. This is not limited to localising sound-emitting objects, but extends to obstacles and environmental boundaries, thanks to their ability to extract information from reverberation and sound reflections, all of which can contribute to effective and safe navigation, as well as serving a function in certain assistive technologies thanks to the advent of binaural auditory virtual reality. It is known that head movements in the presence of sound elicit changes in the acoustical signals which arrive at each ear, and these changes can improve common auditory localisation problems in headphone-based auditory virtual reality, such as front-to-back reversals. The goal of the work presented here is to investigate whether the visually impaired naturally engage head movement to facilitate auditory perception, and to what extent it may be applicable to the design of virtual auditory assistive technology. Three novel experiments are presented: a field study of head movement behaviour during navigation, a questionnaire assessing the self-reported use of head movement in auditory perception by visually impaired individuals (each comparing visually impaired and sighted participants), and an acoustical analysis of interaural differences and cross-correlations as a function of head angle and sound source distance. It is found that visually impaired people self-report using head movement for auditory distance perception. This is supported by head movements observed during the field study, whilst the acoustical analysis showed that interaural correlations for sound sources within 5 m of the listener were reduced as head angle or distance to the sound source increased, and that interaural differences and correlations in reflected sound were generally lower than those of direct sound. Subsequently, relevant guidelines for designers of assistive auditory virtual reality are proposed
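The interaural differences analysed above vary with head angle. As a hypothetical illustration (not the thesis's own analysis), the classic Woodworth spherical-head approximation gives the interaural time difference (ITD) for a distant source as a function of azimuth; the head radius and names here are assumed defaults:

```python
import math

def itd_woodworth(azimuth_deg: float,
                  head_radius_m: float = 0.0875,
                  speed_of_sound_m_s: float = 343.0) -> float:
    """Interaural time difference in seconds for a far-field source.

    Woodworth's spherical-head model: ITD = (r / c) * (sin θ + θ),
    where θ is the source azimuth in radians relative to straight ahead.
    A source directly ahead (0°) gives zero ITD; ITD grows as the
    source (or an equivalent head turn) moves toward the side.
    """
    theta = math.radians(azimuth_deg)
    return (head_radius_m / speed_of_sound_m_s) * (math.sin(theta) + theta)

# A source 90° to the side yields an ITD of roughly 0.66 ms.
print(itd_woodworth(90.0))
```

Such a model only captures the direct-sound time difference; the thesis's analysis additionally considers interaural correlations and reflected sound, which a point formula like this does not model.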

    Virtual Reality as Navigation Tool: Creating Interactive Environments For Individuals With Visual Impairments

    Research into the creation of assistive technologies is increasingly incorporating the use of virtual reality experiments. One area of application is as an orientation and mobility assistance tool for people with visual impairments. Some of the challenges are developing useful knowledge of the user’s surroundings and effectively conveying that information to the user. This thesis examines the feasibility of using virtual environments conveyed via auditory feedback as part of an autonomous mobility assistance system. Two separate experiments were conducted to study key aspects of a potential system: navigation assistance and map generation. The results of this research include mesh models that were fitted to the walk pathways of an environment, and collected data that provide insights on the viability of virtual reality based guidance systems

    BlueEyes: assistive technology for visually impaired and blind people - a bluetooth

    This report presents a “people to people” (P2P) solution, delivered through mobile technology, that promotes change in the field of sustainability in relation to the application system. The field of human-computer interaction (HCI), the basis for the study in this project, is a multidisciplinary field of knowledge focused on the design of computer technology and, in particular, on the interaction between humans and computers. Developing this project required substantial research into the technologies needed to create a mobile application. All of this research and design work belongs to just one of the several stages of this project, which has its base of operations at ESEC

    A Sound Approach Toward a Mobility Aid for Blind and Low-Vision Individuals

    Reduced independent mobility of blind and low-vision individuals (BLVIs) causes considerable societal cost, burden on relatives, and reduced quality of life for the individuals, including increased anxiety, depression symptoms, need of assistance, risk of falls, and mortality. Despite the numerous electronic travel aids proposed since at least the 1940s, along with ever-advancing technology, the mobility issues persist. A substantial reason for this is likely several severe shortcomings of the field, with regard to both aid design and evaluation.
In this work, these shortcomings are addressed with a generic design model called Desire of Use (DoU), which describes the desire of a given user to use an aid for a given activity. It is then applied to the mobility of BLVIs (DoU-MoB), to systematically illuminate and structure possibly all related aspects that such an aid needs to aptly deal with in order for it to become an adequate aid for the objective. These aspects can then guide both user-centred design and the choice of test methods and measures.
One such measure is then demonstrated in the Desire of Use Questionnaire for Mobility of Blind and Low-Vision Individuals (DoUQ-MoB), an aid-agnostic and comprehensive patient-reported outcome measure. The question construction originates from the DoU-MoB to ensure an encompassing focus on the mobility of BLVIs, something that has been missing in the field. Since it is aid-agnostic, it facilitates aid comparison, which it also actively promotes. To support its reliability, the DoUQ-MoB utilizes the best known practices of questionnaire design and has been validated once with eight orientation and mobility professionals and six BLVIs. Based on this, the questionnaire has also been revised once.
To allow for relevant and reproducible methodology, another tool presented herein is a portable virtual reality (VR) system called the Parrot-VR. It uses a hybrid control scheme: absolute rotation, tracked from the user's head in reality, affords intuitive turning, while relative movement, where simple button presses on a controller move the virtual avatar forward and backward, allows large-scale traversal without physical walking. VR provides excellent reproducibility, making various aggregate movement analyses feasible, and it is also inherently safe. Meanwhile, the portability of the system facilitates testing near the participants, substantially increasing the number of potential blind and low-vision recruits for user tests.
The thesis also gives a short account of the state of long-term testing in the field; it being short is mainly due to there not being much to report. It then provides an initial investigation into possible outcome measures for such tests, taking instruments in use by Swedish orientation and mobility professionals as a starting point. Two of these were also piloted in an initial single-session trial with 19 BLVIs, and could plausibly be used for long-term tests after further evaluation.
Finally, a discussion is presented regarding the Audomni project, the development of a primary mobility aid for BLVIs. Audomni is a visuo-auditory sensory supplementation device, which aims to take visual information and translate it to sound. A wide field-of-view, 3D-depth camera records the environment, which is then transformed to audio through the sonification algorithms of Audomni, and finally presented in a pair of open-ear headphones that do not block out environmental sounds. The design of Audomni leverages the DoU-MoB to ensure user-centric development and evaluation, with the aim of reaching an aid with such form and function that it grants users better mobility while they still want to use it.
Audomni has been evaluated in user tests twice: once in pilot tests with two BLVIs, and once in VR with a heterogeneous set of 19 BLVIs, utilizing the Parrot-VR and the DoUQ-MoB. 76% of respondents (13/17) answered that it was very or extremely likely that they would want to use Audomni alongside their current aid. This might be the first result in the field demonstrating a majority of blind and low-vision participants reporting that they actually want to use a new electronic travel aid. This shows promise that eventual long-term tests will demonstrate increased mobility of blind and low-vision users, the overarching project aim. Such results would ultimately mean that Audomni can become an aid that alleviates societal cost, reduces burden on relatives, and improves users' quality of life and independence
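A depth-to-sound mapping of the general kind the Audomni abstract describes can be sketched as follows. The actual sonification algorithms of Audomni are not given here; this mapping, its parameters, and all names are illustrative assumptions only:

```python
def sonify_column(x_index: int, width: int, depth_m: float,
                  max_depth_m: float = 5.0):
    """Map one depth-image column to a (pan, amplitude) pair.

    pan:       stereo position from -1.0 (far left) to +1.0 (far right),
               following the column's horizontal position in the image.
    amplitude: loudness in [0, 1], growing as the surface gets closer;
               surfaces at or beyond max_depth_m are silent.
    """
    pan = 2.0 * x_index / (width - 1) - 1.0
    amplitude = max(0.0, 1.0 - min(depth_m, max_depth_m) / max_depth_m)
    return pan, amplitude

# Leftmost column with nothing in range: silent, panned hard left.
print(sonify_column(0, 9, 5.0))
# Rightmost column with a surface touching the camera: loud, hard right.
print(sonify_column(8, 9, 0.0))
```

A full sonifier would apply such a mapping per column (and, for example, encode elevation as pitch), then mix the resulting sources into the open-ear headphone signal.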