    Active Object Search with a Mobile Device for People with Visual Impairments

    Modern smartphones can provide a multitude of services to assist people with visual impairments, and their cameras in particular can be useful for tasks such as reading signs or searching for objects in unknown environments. Previous research has looked at ways to solve these problems by processing the camera's video feed, but very little work has been done on actively guiding the user towards specific points of interest to maximise the effectiveness of the underlying visual algorithms. In this paper, we propose a control algorithm based on a Markov Decision Process that uses a smartphone's camera to generate real-time instructions to guide a user towards a target object. The solution is part of a more general active vision application for people with visual impairments. An initial implementation of the system on a smartphone was experimentally evaluated with sighted participants to determine the performance of the control algorithm. The results show the effectiveness of our solution and its potential to help people with visual impairments find objects in unknown environments.
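
    The abstract does not spell out the controller itself. As a rough, hypothetical illustration of the kind of MDP-based guidance policy it describes, the sketch below runs value iteration over a coarse ring of discretised camera headings and converts the greedy action into a spoken instruction; the state/action discretisation, the reward, and all names are assumptions, not the authors' implementation.

    ```python
    # Hypothetical sketch of an MDP guidance policy (not the paper's code).
    # States: coarse horizontal headings around the user; actions steer left/right.
    import numpy as np

    N_STATES = 16          # discretised headings on a circle
    ACTIONS = [-1, 0, +1]  # turn left, hold, turn right (one heading step)
    GAMMA = 0.9            # discount factor

    def value_iteration(target_state, n_iters=100):
        """Compute state values for reaching `target_state`."""
        V = np.zeros(N_STATES)
        for _ in range(n_iters):
            for s in range(N_STATES):
                if s == target_state:
                    V[s] = 1.0  # unit reward for facing the target
                    continue
                # deterministic transitions on the circular heading space
                V[s] = GAMMA * max(V[(s + a) % N_STATES] for a in ACTIONS)
        return V

    def instruction(state, target_state):
        """Greedy action for the current heading, rendered as a spoken cue."""
        V = value_iteration(target_state)
        best = max(ACTIONS, key=lambda a: V[(state + a) % N_STATES])
        return {-1: "turn left", 0: "hold", +1: "turn right"}[best]

    print(instruction(state=3, target_state=10))  # -> "turn right"
    ```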

    Active Vision-Based Guidance with a Mobile Device for People with Visual Impairments

    The aim of this research is to determine whether an active-vision system with a human in the loop can be implemented to guide a user with visual impairments in finding a target object. Active vision techniques have successfully been applied to various electro-mechanical object search and exploration systems to boost their effectiveness at a given task. However, despite the potential of intelligent visual sensor arrays to enhance a user's vision capabilities and alleviate some of the impacts that visual deficiencies have on their day-to-day lives, active vision with a human in the loop remains an open research topic. In this thesis, an active guidance system is presented that uses visual input from an object detector and an initial understanding of a typical room layout to generate navigation cues that assist a user with visual impairments in finding a target object. A complete guidance system prototype, together with a new audio-based interface and a state-of-the-art object detector, is implemented on a mobile device and evaluated with a set of users in real environments. The results show that the active guidance approach performs well compared to unguided solutions. This research highlights the potential benefits of the proposed active guidance controller and audio interface, which could enhance current vision-based guidance systems and travel aids for people with visual impairments.
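
    The thesis describes a far more complete pipeline than an abstract can convey. As a minimal sketch of how a single object detection might be turned into a navigation cue, the snippet below maps a normalised bounding box to a coarse instruction; the Detection structure and both thresholds are illustrative assumptions.

    ```python
    # Hypothetical sketch: one object detection -> one navigation cue.
    # Assumes a detector returning boxes in normalised image coordinates.
    from dataclasses import dataclass

    @dataclass
    class Detection:
        label: str
        x_center: float  # 0.0 (left edge of frame) .. 1.0 (right edge)
        area: float      # fraction of the frame covered by the box

    def navigation_cue(det: Detection, centre_band=0.1, near_area=0.25) -> str:
        """Coarse cue logic: first centre the object, then approach it."""
        offset = det.x_center - 0.5
        if abs(offset) > centre_band:
            return f"{det.label}: turn {'left' if offset < 0 else 'right'}"
        if det.area < near_area:
            return f"{det.label}: move forward"
        return f"{det.label}: reach out"  # centred and close

    print(navigation_cue(Detection("mug", x_center=0.72, area=0.05)))
    # -> "mug: turn right"
    ```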

    Experimental Analysis of a Spatialised Audio Interface for People with Visual Impairments

    Sound perception is a fundamental skill for many people with severe sight impairments. The research presented in this paper is part of an ongoing project that aims to create a mobile guidance aid to help people with vision impairments find objects within an unknown indoor environment. This system requires an effective non-visual interface and uses bone-conduction headphones to transmit audio instructions to the user. It has been implemented and tested with spatialised audio cues, which convey the direction of a predefined target in 3D space. We present an in-depth evaluation of the audio interface through several experiments involving a large number of participants, both blindfolded and with actual visual impairments, and analyse the pros and cons of our design choices. In addition to producing results comparable to the state of the art, we found that Fitts’s Law (a predictive model for human movement) provides a suitable metric that can be used to improve and refine the quality of the audio interface in future mobile navigation aids.
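
    Fitts's Law itself is standard: in the common Shannon formulation the index of difficulty is ID = log2(D/W + 1) bits and movement time is modelled as MT = a + b·ID. The sketch below computes both; the coefficients a and b are placeholders that would be fitted by regression on experimental data, and reading D and W as angular quantities is an assumption about how the metric might transfer to an audio-guided pointing task.

    ```python
    # Fitts's Law, Shannon formulation. The coefficients a and b are
    # placeholder values; in practice they are fitted from user data.
    import math

    def index_of_difficulty(distance: float, width: float) -> float:
        """ID = log2(D/W + 1), in bits."""
        return math.log2(distance / width + 1.0)

    def predicted_movement_time(distance, width, a=0.2, b=0.1):
        """MT = a + b * ID, with a in seconds and b in seconds per bit."""
        return a + b * index_of_difficulty(distance, width)

    # Example: 60 degrees of angular distance to the audio target, with a
    # 5-degree acquisition tolerance (illustrative numbers only).
    print(predicted_movement_time(distance=60.0, width=5.0))  # ~0.57 s
    ```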

    Towards human technology symbiosis in the haptic mode

    Search and rescue operations are often undertaken in dark and noisy environments in which rescue teams must rely on haptic feedback for exploration and safe exit. However, little attention has been paid specifically to haptic sensitivity in such contexts or to the possibility of enhancing communicative proficiency in the haptic mode as a life-preserving measure. Here we discuss the design of a haptic guide robot, inspired by careful study of the communication between a blind person and a guide dog. In the case of this partnership, the development of a symbiotic relationship between person and dog, based on mutual trust and confidence, is a prerequisite for successful task performance. We argue that a human-technology symbiosis is equally necessary and possible in the case of the robot guide, but this depends on the robot becoming 'transparent technology' in Andy Clark's sense. We report on initial haptic mode experiments in which a person uses a simple mobile mechanical device (a metal disk fitted with a rigid handle) to explore the immediate environment. These experiments demonstrate the extreme sensitivity and trainability of haptic communication and the speed with which users develop and refine their haptic proficiency in using the device, permitting reliable and accurate discrimination between objects of different weights. We argue that such trials show the transformation of the mobile device into a transparent information appliance and the beginnings of a symbiotic relationship between device and human user. We discuss how these initial explorations may shed light on the more general question of how a human mind, on being exposed to an unknown environment, may enter into collaboration with an external information source in order to learn about, and navigate, that environment.

    Evaluation studies of robotic rollators by the user perspective: A systematic review

    Background: Robotic rollators enhance the basic functions of established devices with technically advanced physical, cognitive, or sensory support to increase autonomy in persons with severe impairment. In the evaluation of such Ambient Assisted Living solutions, both the technical and the user perspectives are important to prove usability, effectiveness, and safety, and to ensure adequate device application.
    Objective: The aim of this systematic review is to summarize the methodology of studies evaluating robotic rollators, with a focus on the user perspective, and to give recommendations for future evaluation studies.
    Methods: A systematic literature search up to December 31, 2014 was conducted based on the Cochrane Review methodology using the electronic databases PubMed and IEEE Xplore. Articles were selected according to the following inclusion criteria: evaluation studies of robotic rollators documenting human-robot interaction, no case reports, published in English.
    Results: Twenty-eight studies met the predefined inclusion criteria. Large heterogeneity in the definitions of the target user group, study populations, study designs, and assessment methods was found across the included studies. No generic methodology for evaluating robotic rollators could be identified. We found major methodological shortcomings related to insufficient sample descriptions and sample sizes, and a lack of appropriate, standardized, and validated assessment methods. Long-term use in the habitual environment was also not evaluated.
    Conclusions: Beyond this heterogeneity, methodological deficits in most of the identified studies became apparent. Recommendations for future evaluation studies include: a clear definition of the target user group, adequate selection of subjects, inclusion of other assistive mobility devices for comparison, evaluation of habitual use of advanced prototypes, an adequate assessment strategy with established, standardized, and validated methods, and statistical analysis of study results. Assessment strategies may additionally focus on specific functionalities of the robotic rollators, allowing an individually tailored assessment of innovative features to document their added value.