A Real Time Indoor Navigation and Monitoring System for Firefighters and Visually Impaired
ABSTRACT
MAY 2011
SIDDHESH RAJAN GANDHI
M.S. E.C.E, UNIVERSITY OF MASSACHUSETTS AMHERST
Directed by: Professor Aura Ganz
There has been widespread growth of technology in almost every facet of day-to-day life, but there are still important application areas in which technological advances have not been implemented in a cost-effective and user-friendly manner. The applications we address in this proposal are: 1) indoor localization and navigation of firefighters during rescue operations and 2) indoor localization and navigation for the blind and visually impaired population.
Firefighting is a dangerous job, as there can be several unexpected hazards while rescuing victims. Since firefighters have no knowledge of the internal structure of a fire-ridden building, they may be unable to find the location of the EXIT door, a fact that can prove fatal. We introduce FIREGUIDE, an indoor location tracking and navigation system using RFID technology integrated with augmented reality. FIREGUIDE helps firefighters find the nearest exit by providing navigation instructions to the exits as well as an augmented reality view of the exits' location and direction. The system also presents the Incident Commander with each firefighter's current location superimposed on a map of the building floor. We envision that the FIREGUIDE system will save a significant number of firefighters' and victims' lives.
Blind or visually impaired people find it difficult to navigate independently in both outdoor and indoor environments. The outdoor navigation problem can be solved using systems with GPS support, but indoor navigation for the blind or visually impaired remains a challenge, given the requirements of low cost and user-friendly operation. To enhance the perception of indoor and unfamiliar environments for the blind and visually impaired, as well as to aid their navigation through such environments, we propose a novel approach that provides context-aware navigation services. INSIGHT uses RFID (Radio Frequency Identification) and tagged spaces (audio landmarks), enabling a ubiquitous computing system with contextual awareness of its users while providing them persistent and context-aware information. We present the INSIGHT system, which supports a number of unique features: a) low deployment and maintenance cost; b) scalability, i.e., the system can be deployed in very large buildings; c) on-demand operation that does not overwhelm the user, as it offers small amounts of information on demand; and d) portability and ease of use, i.e., the custom handheld device carried by the user is compact and instructions are received audibly.
A multimodal smartphone interface for active perception by visually impaired
The widespread availability of mobile devices, such as smartphones and tablets, has the potential to bring substantial benefits to people with sensory impairments. The solution proposed in this paper is part of an ongoing effort to create an accurate obstacle and hazard detector for the visually impaired, embedded in a hand-held device. In particular, it presents a proof of concept for a multimodal interface to control the orientation of a smartphone's camera, while the device is held by a person, using a combination of vocal messages, 3D sounds, and vibrations. The solution, which is to be evaluated experimentally by users, will enable further research in the area of active vision with a human in the loop, with potential application to mobile assistive devices for indoor navigation of visually impaired people.
Indoor assistance for visually impaired people using a RGB-D camera
In this paper a navigational aid for visually impaired people is presented. The system uses an RGB-D camera to perceive the environment and implements self-localization, obstacle detection, and obstacle classification. The novelty of this work is threefold. First, self-localization is performed by means of a novel camera tracking approach that uses both depth and color information. Second, to provide the user with semantic information, obstacles are classified as walls, doors, steps, and a residual class that covers isolated objects and bumpy parts of the floor. Third, to guarantee real-time performance, the system is accelerated by offloading parallel operations to the GPU. Experiments demonstrate that the whole system runs at 9 Hz.
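The abstract does not detail how obstacles are distinguished, but one common geometric heuristic classifies a point cluster by the orientation and height of its best-fit plane. The sketch below is a minimal illustration of that idea, not the paper's actual pipeline; the axis convention (y up), thresholds, and function name are all assumptions for illustration.

```python
import numpy as np

def classify_surface(points, floor_height=0.0, step_tol=0.25):
    """Classify a 3D point cluster from an RGB-D camera as a wall,
    step, floor, or isolated obstacle, using the normal of the
    best-fit plane (illustrative heuristic, y-axis assumed up)."""
    centered = points - points.mean(axis=0)
    # Plane normal = right-singular vector of the smallest singular value
    _, _, vt = np.linalg.svd(centered)
    normal = vt[-1]
    vertical = abs(normal[1])
    if vertical < 0.3:
        return "wall"              # near-vertical surface
    if vertical > 0.7:
        height = points[:, 1].mean() - floor_height
        if 0.02 < height < step_tol:
            return "step"          # low horizontal surface above the floor
        return "floor"
    return "obstacle"              # slanted or irregular cluster
```

In practice such a rule would run per cluster after plane segmentation of the depth image; the thresholds here (0.3/0.7 on the normal, 2–25 cm for a step) are arbitrary placeholders.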
Indoor Localization of Mobile Robots with Wireless Sensor Network Based on Ultra Wideband using Experimental Measurements of Time Difference of Arrival
This paper presents investigations into wireless localization techniques for mobile robots operating in indoor environments. Localization systems can guide robots performing tasks such as monitoring children or elderly people, aiding the mobility of the visually impaired, and locating mobile objects or packages in warehouses. They are also essential for localizing robots operating in remote places that are inaccessible or hazardous to humans. Currently, ultra-wideband (UWB) positioning in indoor environments provides an accuracy of 24 mm under line-of-sight (LOS) or non-line-of-sight (NLOS) conditions over a working range of 160 m indoors. The work presented in this paper carries out experimental validation of localization algorithms using mobile robots and UWB signals, measured in both LOS and NLOS environments. The measurements are performed with the PulsON 410 (P410) UWB radio and AmigoBot mobile robots with a maximum travelling speed of 1 m/s, equipped with an on-board computer, sonar, odometer, camera, and inertial navigation system. Experimental results obtained for the system show positioning errors of less than 55 mm.
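The abstract names time difference of arrival (TDOA) as the measurement principle: each anchor reports the arrival-time difference relative to a reference anchor, and the position is recovered by nonlinear least squares over the implied range differences. The sketch below shows that standard formulation via a Gauss-Newton iteration; it is an illustrative solver, not the P410's firmware, and the function name and iteration count are assumptions.

```python
import numpy as np

def tdoa_locate(anchors, tdoa, c=299_792_458.0, iters=20):
    """Estimate a 2D position from TDOA measurements taken relative
    to anchor 0, via Gauss-Newton least squares (illustrative sketch).
    anchors: (N, 2) anchor coordinates; tdoa: (N-1,) time differences
    of arrival at anchors 1..N-1 versus anchor 0, in seconds."""
    anchors = np.asarray(anchors, float)
    d = np.asarray(tdoa, float) * c     # range differences vs anchor 0
    x = anchors.mean(axis=0)            # initial guess: anchor centroid
    for _ in range(iters):
        r = np.linalg.norm(anchors - x, axis=1)
        # Residual for anchor i >= 1: (r_i - r_0) - measured difference
        f = (r[1:] - r[0]) - d
        # Jacobian: unit vector toward x from anchor i, minus that of anchor 0
        u = (x - anchors) / r[:, None]
        J = u[1:] - u[0]
        x = x - np.linalg.lstsq(J, f, rcond=None)[0]
    return x
```

With four anchors and two unknowns the system is overdetermined, which is what lets the least-squares step average out measurement noise; the reported sub-55 mm errors would correspond to timing errors on the order of 0.1 ns at the speed of light.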