
    Empowering and assisting natural human mobility: The simbiosis walker

    This paper presents the complete development of the Simbiosis Smart Walker. The device is equipped with a set of sensor subsystems that acquire user-machine interaction forces and the temporal evolution of the user's feet during gait. The authors present an adaptive filtering technique for identifying and separating the different components found in the human-machine interaction forces. This technique made it possible to isolate the components related to navigational commands and to develop a fuzzy logic controller to guide the device. The Smart Walker was clinically validated at the Spinal Cord Injury Hospital of Toledo, Spain, showing high acceptability among spinal cord injury patients and clinical staff.
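    The abstract does not spell out the filter, so as a hedged illustration (the function name, the sinusoidal reference at the step cadence, and the gain mu are assumptions of mine, not from the paper), a least-mean-squares Fourier linear combiner of this kind can cancel the periodic gait component of the handle forces, leaving a slow residual suitable for driving a steering controller:

```python
import numpy as np

def lms_cancel_cadence(force, fs, cadence_hz, mu=0.01, n_harmonics=2):
    """Hypothetical sketch: remove a periodic gait-cadence component from a
    1-D handle-force signal with an LMS-adapted Fourier linear combiner.
    `force` is the sampled interaction force, `fs` the sample rate in Hz,
    `cadence_hz` an estimate of the user's step frequency (all assumed)."""
    t = np.arange(len(force)) / fs
    # Reference inputs: sines/cosines at the cadence frequency and harmonics.
    ref = np.vstack([f(2 * np.pi * k * cadence_hz * t)
                     for k in range(1, n_harmonics + 1)
                     for f in (np.sin, np.cos)])        # shape (2H, N)
    w = np.zeros(ref.shape[0])                          # adaptive weights
    command = np.empty_like(force)
    for n in range(len(force)):
        x = ref[:, n]
        cadence_est = w @ x              # current estimate of the gait ripple
        e = force[n] - cadence_est       # residual = slow navigational intent
        w += 2 * mu * e * x              # LMS weight update
        command[n] = e
    return command
```

    The residual `command` would then serve as the input to a fuzzy logic controller mapping sustained forces and torques to the walker's linear and angular velocities.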

    Exploring the Use of Wearables to Enable Indoor Navigation for Blind Users

    One of the challenges that people with visual impairments (VI) confront daily is navigating independently through foreign or unfamiliar spaces. Navigating through unfamiliar spaces without assistance is very time consuming and leads to lower mobility, and in indoor environments, where GPS is unavailable, the task becomes even harder. However, advancements in mobile and wearable computing pave the way to new, inexpensive assistive technologies that can make the lives of people with VI easier. Wearable devices have great potential for assistive applications for users who are blind, as they typically feature a camera and support hands-free and eyes-free interaction. Smart watches and heads-up displays (HUDs), in combination with smartphones, can provide a basis for developing advanced algorithms capable of providing inexpensive solutions for navigation in indoor spaces. New interfaces are also introduced, making the interaction between users who are blind and mobile devices more intuitive. This work presents a set of new systems and technologies created to help users with VI navigate indoor environments. The first system presented is an indoor navigation system for people with VI that operates using sensors found in mobile devices and virtual maps of the environment. The second system helps users navigate large open spaces with minimal veering. Next, a study is conducted to determine the accuracy of pedometry for different body placements of the accelerometer sensors. Finally, a gesture detection system is introduced that facilitates communication between the user and mobile devices by using sensors in wearable devices.
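    As a hedged sketch of the kind of pedometry the study evaluates (the thresholds, smoothing window, and function name below are illustrative assumptions, not taken from the work), steps can be counted by peak detection on the smoothed acceleration magnitude, which is insensitive to sensor orientation and therefore comparable across body placements:

```python
import numpy as np

def count_steps(acc, fs, thresh=1.2, min_step_s=0.3):
    """Hedged sketch of accelerometer-based step counting.
    `acc` is an (N, 3) array of accelerations in g; `fs` is the
    sample rate in Hz; `thresh` and `min_step_s` are assumed values."""
    mag = np.linalg.norm(acc, axis=1)   # placement-independent magnitude
    # Moving-average smoothing to suppress high-frequency noise.
    win = max(1, int(0.1 * fs))
    mag = np.convolve(mag, np.ones(win) / win, mode="same")
    steps, last = 0, -np.inf
    for i in range(1, len(mag) - 1):
        is_peak = mag[i] > mag[i - 1] and mag[i] >= mag[i + 1]
        if is_peak and mag[i] > thresh and (i - last) / fs >= min_step_s:
            steps += 1                  # count one step per qualifying peak
            last = i
    return steps
```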

    Human-Robot-Environment Interaction Strategies For Walker-assisted Gait

    Smart Walkers (SWs) are robotic devices that may be used to improve the balance and locomotion stability of people with lower-limb weakness or poor balance. Such devices may also offer support for cognitive disabilities and for people who cannot safely use conventional walkers, as well as allow interaction with other individuals and with the environment. In this context, there is a significant need to incorporate environment information into the SW's control strategies. In this Ph.D. thesis, the concept of Human-Robot-Environment Interaction (HREI) for human locomotion assistance is explored with a smart walker developed at UFES/Brazil (termed the UFES Smart Walker, USW). Two control strategies and one social navigation strategy are presented. The first control strategy is an admittance controller that generates haptic signals to induce the tracking of a predetermined path. When the user deviates from this path, the proposed method varies the damping parameter of the admittance controller by means of a spatial modulation technique, producing haptic feedback that the user perceives as increased resistance to locomotion in the undesired direction. The second strategy also uses an admittance controller to generate haptic signals that guide the user along a predetermined path; in this case, however, the angular velocity of the smart walker is implemented as a function of a virtual torque, defined using two virtual forces that depend on the angular orientation error between the walker and the desired path. The navigation strategy involves social conventions defined by proxemics, together with haptic signals generated through the spatial modulation of the admittance controller, for safe navigation within confined spaces. The USW uses a multimodal cognitive interaction composed of haptic feedback and a visual interface with two LEDs to indicate the correct/desired direction when necessary. The proposed control strategies are suitable for a natural HREI, as demonstrated in the experimental validation. Moreover, this Ph.D. thesis presents a strategy for obtaining navigation commands for the USW based on multi-axial force sensors, together with a study of the admittance control parameters and their influence on the maneuverability of the USW, conducted in order to improve its HREI.
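    A minimal sketch of the two mechanisms described above, under assumed parameter values and names (the thesis's actual gains and discretization are not given here): a first-order admittance law whose damping grows with the lateral path error, and an angular velocity derived from a virtual torque built from two heading-error-dependent virtual forces:

```python
def admittance_step(v, force, path_error, dt, m=10.0, b0=5.0, k_mod=8.0):
    """One discrete step of an admittance controller with spatially
    modulated damping (all parameter values are illustrative assumptions).
    Model: m * dv/dt + b(e) * v = F  ->  next velocity command.
    `path_error` is the lateral deviation from the desired path; damping
    grows with it, which the user feels as harder pushing off-path."""
    b = b0 + k_mod * abs(path_error)   # spatial modulation of the damping
    dv = (force - b * v) / m           # first-order admittance dynamics
    return v + dv * dt                 # explicit Euler integration

def virtual_torque_omega(heading_err, k_f=2.0, d=0.3, b_rot=1.5):
    """Angular velocity from a virtual torque built from two opposing
    virtual forces proportional to the walker-path heading error
    (an assumed form, for illustration only)."""
    f = k_f * heading_err              # magnitude of each virtual force
    tau = 2 * f * d                    # torque of the force couple, arm d
    return tau / b_rot                 # steady-state omega of damped rotation
```

    In this sketch the linear channel shapes how hard the walker feels to push, while the angular channel steers it back toward the path, which together produce the haptic guidance the abstract describes.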

    Safe Local Navigation for Visually Impaired Users With a Time-of-Flight and Haptic Feedback Device

    This paper presents ALVU (Array of Lidars and Vibrotactile Units), a contactless, intuitive, hands-free, and discreet wearable device that allows visually impaired users to detect low- and high-hanging obstacles, as well as physical boundaries in their immediate environment. The solution allows for safe local navigation in both confined and open spaces by enabling the user to distinguish free space from obstacles. The device presented is composed of two parts: a sensor belt and a haptic strap. The sensor belt is an array of time-of-flight distance sensors worn around the front of the user's waist; its pulses of infrared light provide reliable and accurate measurements of the distances between the user and surrounding obstacles or surfaces. The haptic strap communicates the measured distances through an array of vibratory motors worn around the user's upper abdomen, providing haptic feedback. The linear vibration motors are combined with a point-loaded pretensioned applicator to transmit isolated vibrations to the user. We validated the device's capability in an extensive user study entailing 162 trials with 12 blind users. Users wearing the device successfully walked through hallways, avoided obstacles, and detected staircases. (Funded by the Andrea Bocelli Foundation and the National Science Foundation (U.S.), Grant NSF IIS-1226883.)
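    As an illustration of the sensing-to-feedback pipeline (the linear mapping, distance limits, and function name below are assumptions, not ALVU's published calibration), each time-of-flight reading can be converted into a vibration intensity for the motor at the corresponding position on the haptic strap:

```python
def distances_to_vibration(distances_m, d_min=0.3, d_max=2.0):
    """Hedged sketch of an ALVU-style feedback mapping (constants assumed).
    Each sensor's distance reading, in meters, drives the vibration level
    (0..1) of the motor at the matching position on the haptic strap.
    Nearer obstacles -> stronger vibration; beyond d_max -> off."""
    levels = []
    for d in distances_m:
        if d >= d_max:
            levels.append(0.0)          # free space: no vibration
        else:
            d = max(d, d_min)           # clamp very near readings
            levels.append((d_max - d) / (d_max - d_min))  # linear ramp
    return levels

# Example: front sensor sees an obstacle at 0.5 m, sides are clear.
print(distances_to_vibration([2.5, 0.5, 2.5]))  # -> [0.0, ~0.88, 0.0]
```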

    Comparative analysis of computer-vision and BLE technology based indoor navigation systems for people with visual impairments

    Background: A considerable number of indoor navigation systems have been proposed to inform people with visual impairments (VI) about their surroundings. These systems leverage several technologies, such as computer vision and Bluetooth Low Energy (BLE), to estimate the position of a user in indoor areas. Computer-vision based systems use techniques including matching pictures, classifying captured images, and recognizing visual objects or visual markers. BLE-based systems utilize BLE beacons installed in indoor areas as sources of radio-frequency signals to localize the user's position. Methods: In this paper, we examine the performance and usability of two computer-vision based systems and a BLE-based system. The first is a computer-vision based system, called CamNav, that uses a trained deep learning model to recognize locations; the second, called QRNav, utilizes visual markers (QR codes) to determine locations. A field test of the three navigation systems was conducted with 10 blindfolded users. Results: The results of the navigation experiment and the feedback from the blindfolded users show that QRNav and CamNav are more efficient than the BLE-based system in terms of accuracy and usability. The error of the BLE-based application is more than 30% greater than that of the computer-vision based systems, CamNav and QRNav. Conclusions: The developed navigation systems were able to provide reliable assistance to the participants during real-time experiments. Some participants required minimal external assistance while moving through junctions in the corridor areas. Computer-vision technology demonstrated its superiority over BLE technology in assistive systems for people with visual impairments.
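    For context on how BLE-based localization in such a comparison typically works (the model constants and function names here are assumptions, not taken from the paper), beacon RSSI can be converted to a distance estimate with a log-distance path-loss model, and the user's position approximated by the nearest beacon's known location:

```python
def rssi_to_distance(rssi_dbm, tx_power_dbm=-59, n=2.0):
    """Log-distance path-loss model commonly used with BLE beacons.
    `tx_power_dbm` (RSSI at 1 m) and path-loss exponent `n` are
    assumed calibration values."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * n))

def nearest_beacon(readings):
    """`readings` maps beacon id -> RSSI in dBm; the user's position is
    approximated by the closest beacon (a simple proximity scheme; the
    paper's exact method may differ)."""
    return min(readings, key=lambda b: rssi_to_distance(readings[b]))

# Example: beacon "B2" is strongest, so the user is placed at B2's location.
print(nearest_beacon({"B1": -78, "B2": -63, "B3": -81}))  # -> "B2"
```

    RSSI fluctuation indoors makes such distance estimates noisy, which is consistent with the higher error the paper reports for the BLE-based system relative to the vision-based ones.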

    A Systematic Review of Extended Reality (XR) for Understanding and Augmenting Vision Loss

    Over the past decade, extended reality (XR) has emerged as an assistive technology, not only to augment the residual vision of people losing their sight but also to study the rudimentary vision restored to blind people by visual neuroprostheses. To make the best use of these emerging technologies, it is valuable and timely to understand the state of this research and identify its shortcomings. Here we present a systematic literature review of 227 publications from 106 different venues assessing the potential of XR technology to further visual accessibility. In contrast to other reviews, we sample studies from multiple scientific disciplines, focus on the augmentation of a person's residual vision, and require studies to feature a quantitative evaluation with appropriate end users. We summarize prominent findings from different XR research areas, show how the landscape has changed over the last decade, and identify scientific gaps in the literature. Specifically, we highlight the need for real-world validation, the broadening of end-user participation, and a more nuanced understanding of the suitability and usability of different XR-based accessibility aids. By broadening end-user participation to early stages of the design process and shifting the focus from behavioral performance to qualitative assessments of usability, future research has the potential to develop XR technologies that not only allow for studying vision loss but also enable novel visual accessibility aids with the potential to impact the lives of millions of people living with vision loss.