
    A Wearable RFID-Based Navigation System for the Visually Impaired

    Recent studies have focused on developing advanced assistive devices to help blind or visually impaired people. Navigation is especially challenging for this community, and a simple yet reliable navigation system remains an unmet need. This study targets the navigation problem and proposes a wearable assistive system. We developed a smart glove and shoe set based on radio-frequency identification (RFID) technology to assist visually impaired people with navigation and orientation in indoor environments. The system enables the user to find directions through audio feedback. To evaluate the device's performance, we designed a simple experimental setup. The proposed system has a simple structure and can be personalized according to the user's requirements. The results show that the platform is reliable, power efficient, and accurate enough for indoor navigation. Comment: 6 pages, 6 figures, 3 tables
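The abstract does not give implementation details, but the core loop it describes (read an RFID tag, map it to a direction, speak it) can be sketched as follows. The tag IDs, the direction map, and the `speak()` stub are illustrative assumptions, not the authors' actual design.

```python
# Minimal sketch of the RFID-to-audio idea described above.
# Tag IDs and the direction map are hypothetical examples.

# Map from RFID tag IDs (as read by the glove/shoe reader) to
# spoken navigation cues for the wearer.
TAG_DIRECTIONS = {
    "E200341201": "turn left",
    "E200341202": "turn right",
    "E200341203": "continue straight",
    "E200341204": "destination reached",
}

def speak(message: str) -> str:
    """Stand-in for the audio-feedback module (e.g. a TTS engine)."""
    return f"[audio] {message}"

def handle_tag_read(tag_id: str) -> str:
    """Translate a detected floor or doorway tag into an audible cue."""
    cue = TAG_DIRECTIONS.get(tag_id, "unknown location")
    return speak(cue)
```

A lookup table like this is what makes such a system easy to personalize: re-mapping tag IDs to cues changes the route guidance without touching the hardware.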

    Vision-Based Tactile Paving Detection Method in Navigation Systems for Visually Impaired Persons

    In general, a visually impaired person relies on a guide cane to walk outside, in addition to depending on tactile pavement as a warning and directional tool to avoid obstructions or hazardous situations. However, considerable training is needed to recognize the tactile patterns, which is difficult for people who have only recently become visually impaired. This chapter describes the development and evaluation of a vision-based tactile paving detection method for visually impaired persons. Experiments are conducted on how the method detects tactile pavement and identifies the shape of the tactile pattern. In these experiments, a vision-based method is implemented in MATLAB, with an Arduino platform and a speaker as guidance tools. Based on the tactile detection result from MATLAB, the system produces auditory output and notifies the visually impaired user of the type of tactile pattern detected. The resulting tactile pavement detection system can thus be used by visually impaired persons for easy detection and navigation.
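The chapter's implementation is in MATLAB and is not reproduced in the abstract; the following Python sketch only illustrates the classification step it implies, distinguishing dot-type (warning) tiles from bar-type (directional) tiles by the shape of segmented blobs. The aspect-ratio thresholds are assumptions for illustration.

```python
# Illustrative sketch of tactile-pattern classification: dot patterns
# yield roughly square blobs, bar patterns yield elongated blobs.
# Thresholds are assumed values, not from the chapter.

def classify_tile(blob_aspect_ratios):
    """Classify a tactile tile from the width/height aspect ratios of
    blobs segmented out of its image."""
    if not blob_aspect_ratios:
        return "no tactile pattern"
    mean_ratio = sum(blob_aspect_ratios) / len(blob_aspect_ratios)
    if mean_ratio > 2.0:   # elongated blobs -> directional bars
        return "directional (bars)"
    if mean_ratio < 1.5:   # near-square blobs -> warning dots
        return "warning (dots)"
    return "uncertain"
```

The returned label is what would be passed to the speaker for the auditory notification the chapter describes.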

    State of the art review on walking support system for visually impaired people

    Technology for terrain detection and walking support for blind people has improved rapidly over the last couple of decades, although efforts to assist visually impaired people began long before that. Currently, a variety of portable or wearable navigation systems are available on the market to help blind users navigate in local or remote areas. The systems covered in this work can be subgrouped into electronic travel aids (ETAs), electronic orientation aids (EOAs) and position locator devices (PLDs); we focus mainly on electronic travel aids (ETAs). This paper presents a comparative survey of the various portable or wearable walking support systems (a subcategory of ETAs, or early stages of ETAs), with an informative description of each system's working principle, advantages and disadvantages, so that researchers can easily assess the current state of assistive technology for the blind along with the requirements for optimising the design of walking support systems for their users.

    Indoor navigation for the visually impaired: enhancements through utilisation of the Internet of Things and deep learning

    Wayfinding and navigation are essential aspects of independent living that heavily rely on the sense of vision. Walking in a complex building requires knowing one's exact location, finding a suitable path to the desired destination, avoiding obstacles and monitoring orientation and movement along the route. People who do not have access to sight-dependent information, such as that provided by signage, maps and environmental cues, can encounter challenges in achieving these tasks independently. They can rely on assistance from others or maintain their independence by using assistive technologies and the resources provided by smart environments. Several solutions have adapted technological innovations to address indoor navigation over the last few years. However, there remains a significant lack of a complete solution that meets the navigation requirements of visually impaired (VI) people, and no single technology can address all the navigation difficulties they face. A hybrid solution using Internet of Things (IoT) devices and deep learning techniques to discern the patterns of an indoor environment may help VI people gain the confidence to travel independently. This thesis aims to improve the independence and enhance the journeys of VI people in indoor settings with the proposed framework, using a smartphone. The thesis proposes a novel framework, Indoor-Nav, to provide a VI-friendly path that avoids obstacles and to predict the user's position. The components include Ortho-PATH, Blue Dot for VI People (BVIP), and a deep learning-based indoor positioning model. The work establishes a novel collision-free pathfinding algorithm, Orth-PATH, to generate a VI-friendly path by sensing a grid-based indoor space. Further, to ensure correct movement, BVIP uses beacons and a smartphone to monitor the movements and relative position of the moving user.
In dark areas without external devices, the research tests the feasibility of using sensory information from a smartphone with a pre-trained regression-based deep learning model to predict the user's absolute position. The work accomplishes a diverse range of simulations and experiments to confirm the performance and effectiveness of the proposed framework and its components. The results show that Indoor-Nav is the first pathfinding algorithm of its type to provide a novel path that reflects the needs of VI people. The approach designs a path alongside walls, avoiding obstacles, and this research benchmarks the approach against other popular pathfinding algorithms. Further, this research develops a smartphone-based application to test the trajectories of a moving user in an indoor environment.
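The thesis's Orth-PATH algorithm itself is not given in the abstract, but the idea of a collision-free, wall-following path on a sensed grid can be sketched with a standard shortest-path search whose step costs favour wall-adjacent cells. This Dijkstra variant is an assumed illustration of that idea, not the author's actual algorithm.

```python
# Sketch of a wall-hugging, collision-free grid search in the spirit of
# the approach described above: cells beside a wall or obstacle are
# cheaper, so the returned path tends to follow walls.
import heapq

def wall_preferring_path(grid, start, goal):
    """grid: list of strings, '#' = wall/obstacle, '.' = free cell.
    Returns a list of (row, col) cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])

    def free(r, c):
        return 0 <= r < rows and 0 <= c < cols and grid[r][c] == "."

    def step_cost(r, c):
        # Cost 1 for cells touching a wall/boundary, 3 for open floor,
        # steering the search toward wall-adjacent routes.
        beside_wall = any(
            not free(r + dr, c + dc)
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
        )
        return 1 if beside_wall else 3

    frontier = [(0, start, [start])]
    seen = set()
    while frontier:
        cost, (r, c), path = heapq.heappop(frontier)
        if (r, c) == goal:
            return path
        if (r, c) in seen:
            continue
        seen.add((r, c))
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if free(nr, nc) and (nr, nc) not in seen:
                heapq.heappush(
                    frontier,
                    (cost + step_cost(nr, nc), (nr, nc), path + [(nr, nc)]),
                )
    return None
```

Routing along walls matters for VI users because walls are trailable with a hand or cane, unlike an open-floor shortest path.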

    Comparative analysis of computer-vision and BLE technology based indoor navigation systems for people with visual impairments

    Background: A considerable number of indoor navigation systems have been proposed to inform people with visual impairments (VI) about their surroundings. These systems leverage several technologies, such as computer vision, Bluetooth Low Energy (BLE), and other techniques, to estimate the position of a user in indoor areas. Computer-vision based systems use several techniques, including matching pictures, classifying captured images, and recognizing visual objects or visual markers. BLE-based systems use BLE beacons attached in the indoor areas as the source of the radio-frequency signal used to localize the position of the user. Methods: In this paper, we examine the performance and usability of two computer-vision based systems and a BLE-based system. The first system, called CamNav, uses a trained deep learning model to recognize locations, and the second system, called QRNav, utilizes visual markers (QR codes) to determine locations. A field test with 10 blindfolded users was conducted while using the three navigation systems. Results: The results obtained from the navigation experiments and the feedback from the blindfolded users show that the QRNav and CamNav systems are more efficient than the BLE-based system in terms of accuracy and usability. The error observed in the BLE-based application was more than 30% compared to the computer-vision based systems, CamNav and QRNav. Conclusions: The developed navigation systems are able to provide reliable assistance for the participants during real-time experiments. Some of the participants required minimal external assistance while moving through the junctions in the corridor areas. Computer-vision technology demonstrated its superiority over BLE technology in assistive systems for people with visual impairments. © 2019 The Author(s).
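One reason BLE localization tends to be less accurate, as found above, is that it infers distance from signal strength, which fluctuates indoors. A minimal sketch of the standard log-distance path-loss model such systems typically use is shown below; the calibration constants (expected RSSI at 1 m, path-loss exponent) are illustrative values, not figures from the paper.

```python
# Sketch of BLE beacon ranging via the log-distance path-loss model:
#   d = 10 ** ((tx_power - rssi) / (10 * n))
# tx_power: expected RSSI (dBm) at 1 m from the beacon (calibrated).
# n: path-loss exponent (~2 in free space, higher indoors).
# Both constants here are assumed example values.

def rssi_to_distance(rssi_dbm, tx_power_dbm=-59.0, n=2.0):
    """Estimate the distance in metres to a beacon from one RSSI reading."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * n))
```

Because RSSI varies by several dBm between readings, the estimated distance can easily be off by metres, which is consistent with the error the paper reports for the BLE-based system.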

    Blind guide: anytime, anywhere

    Sight dominates our mental life more than any other sense. Even when we are just thinking about something in the world, we end up imagining what it looks like. This rich visual experience is part of our lives. People need vision for two complementary reasons. One is that vision gives us the knowledge to recognize objects in real time. The other is that vision provides the control we need to move around and interact with objects. Eyesight helps people avoid dangers and navigate the world. Blind people usually develop enhanced accuracy and sensitivity in their other natural senses to perceive their surroundings. But sometimes this is not enough, because the human senses can be affected by external sources of noise or by disease. Without any external aid or device, a sightless person cannot navigate the world. Many assistive tools have been developed to help blind people: white canes and guide dogs assist the blind in their navigation. Each device has its limitations. White canes cannot detect head-level obstacles, drop-offs, or obstructions more than a meter away. Training a guide dog takes a long time, almost five years in some cases; the sightless person also needs training, and a guide dog is not a solution for everybody. Taking care of a guide dog can be expensive and time consuming. Humans have developed technology to help us in every aspect of our lives. The primary goal of technology is to help people improve their quality of life, and technology can assist us with our limitations. Wireless sensor networks are one technology that has been used to help people with disabilities. In this dissertation, the author proposes a system based on this technology, called Blind Guide. Blind Guide is an artifact that helps blind people navigate in indoor or outdoor scenarios. The prototype is portable, ensuring that it can be used anytime and anywhere. The system is composed of wireless sensors that can be worn on different parts of the body.
The sensors detect an obstacle and inform the user with an audible warning, providing a safe walk for the user. A great feature of Blind Guide is its modularity: the system can adapt to the needs of the user and can be used in combination with other solutions. For example, Blind Guide can be used in conjunction with the white cane. The white cane detects obstacles below waist level, and a Blind Guide wireless sensor on the forehead can detect obstacles at head level. This feature is important because some sightless people feel uncomfortable without the white cane. The system is scalable, giving us the opportunity to create a network of interconnected Blind Guide users. This network can store the exact location and description of the obstacles found by its users, and this information is public for all users of the system. This feature reduces the time required for obstacle detection, with consequent energy savings, thus increasing the autonomy of the solution. One of the main requirements for the development of this prototype was to design a low-cost solution that can be accessible to anyone around the world; all the components of the solution are easily obtainable at low cost. Technology makes our lives easier, and it must be available to anyone. Modularity, portability, scalability, the possibility of working in conjunction with other solutions, detecting objects that other solutions cannot, obstacle labeling, a network of identified obstacles and audible warnings are the main aspects of the Blind Guide system. All these aspects make Blind Guide an anytime, anywhere solution for blind people. Blind Guide was tested with a group of volunteers. The volunteers were sightless and of different ages. The trials performed with the system show positive results: the system successfully detected incoming obstacles and informed its users in real time.
The volunteers gave positive feedback, saying that they felt comfortable using the prototype and that they believe the system can help them in their daily routine.
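The detect-and-warn loop the dissertation describes (a worn sensor reports a distance; readings below a threshold trigger an audible warning) can be sketched as follows. The threshold value, sensor names, and the `alert()` stub are illustrative assumptions, not details from the dissertation.

```python
# Minimal sketch of the Blind Guide detect-and-warn loop described above.
# The one-metre threshold and message format are assumed examples.

WARNING_THRESHOLD_CM = 100  # warn about obstacles within one metre

def alert(sensor: str, distance_cm: float) -> str:
    """Stand-in for the audible warning played to the user."""
    return f"[beep] {sensor}: obstacle at {distance_cm:.0f} cm"

def process_reading(sensor: str, distance_cm: float):
    """Return a warning string for an incoming obstacle, else None.

    Each worn sensor (e.g. a forehead unit complementing the white
    cane at head level) feeds its range readings through this check.
    """
    if distance_cm <= WARNING_THRESHOLD_CM:
        return alert(sensor, distance_cm)
    return None
```

Keeping the check per sensor is what makes the system modular: a forehead unit, a chest unit, or a cane-mounted unit can each run the same logic and only the mounting position changes.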