Empowering and assisting natural human mobility: The simbiosis walker
This paper presents the complete development of the Simbiosis Smart Walker. The device is equipped with a set of sensor subsystems to acquire user-machine interaction forces and the temporal evolution of the user's feet during gait. The authors present an adaptive filtering technique used to identify and separate the different components found in the human-machine interaction forces. This technique made it possible to isolate the components related to navigational commands and to develop a fuzzy logic controller to guide the device. The Smart Walker was clinically validated at the Spinal Cord Injury Hospital of Toledo, Spain, showing great acceptance by spinal cord injury patients and clinical staff
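The pipeline the abstract describes can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the adaptive filter is stood in for by a first-order low-pass filter, and the fuzzy rule base by a crisp dead-band rule; all names and thresholds are assumptions.

```python
# Hypothetical sketch (not the Simbiosis code): keep the slow "navigational"
# component of handlebar forces, discarding gait-induced oscillation, then
# map the left/right force difference to a steering command.

def low_pass(samples, alpha=0.1):
    """First-order IIR low-pass: retains the slow navigational component."""
    out, y = [], samples[0]
    for x in samples:
        y = alpha * x + (1 - alpha) * y
        out.append(y)
    return out

def steer_command(left_force, right_force, dead_band=0.5):
    """Crisp stand-in for a fuzzy rule base: sign of the force difference."""
    diff = right_force - left_force
    if abs(diff) < dead_band:
        return "straight"
    return "turn_right" if diff > 0 else "turn_left"
```

A real fuzzy controller would replace the dead-band rule with overlapping membership functions over the filtered force difference, but the data flow (forces in, filtered command out) is the same.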
Following a Robot using a Haptic Interface without Visual Feedback
Search and rescue operations are often undertaken in dark and noisy environments in which rescue teams must rely on haptic feedback for navigation and safe exit. In this paper, we discuss the design and evaluation of a haptic interface that enables a human being to follow a robot through an environment with no visibility. We first briefly analyse the task at hand and discuss the considerations that led to our current interface design. The second part of the paper describes our testing procedure and the results of our first informal tests. Based on these results, we discuss future improvements to our design
Sample-Efficient Training of Robotic Guide Using Human Path Prediction Network
Training a robot that engages with people is challenging, because it is
expensive to involve people in a robot training process requiring numerous data
samples. This paper proposes a human path prediction network (HPPN) and an
evolution strategy-based robot training method using virtual human movements
generated by the HPPN, which compensates for this sample inefficiency problem.
We applied the proposed method to the training of a robotic guide for visually
impaired people, which was designed to collect multimodal human response data
and reflect such data when selecting the robot's actions. We collected 1,507
real-world episodes for training the HPPN and then generated over 100,000
virtual episodes for training the robot policy. User test results indicate that
our trained robot accurately guides blindfolded participants along a goal path.
In addition, because the reward was designed to pursue both guidance accuracy
and human comfort during robot policy training, our robot improves the
smoothness of human motion while maintaining guidance accuracy. This
sample-efficient training method is expected to be widely applicable to all
robots and computing machinery that physically interact with humans.
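The general recipe the abstract describes (evaluate candidate policies on predicted human movements instead of real ones) can be sketched as below. Everything here is an assumption for illustration: the policy is a single scalar, and `virtual_rollout` stands in for an HPPN-generated episode; the authors' networks and reward are not reproduced.

```python
# Illustrative evolution-strategy loop over virtual rollouts (not the
# authors' code): sample perturbed candidates, score each on the virtual
# episodes, and keep the best (an elitist hill climb).
import random

def virtual_rollout(policy_param, episode):
    """Stand-in for an HPPN-generated episode: reward is higher the closer
    the scalar policy parameter is to the episode's 'ideal' action."""
    return -abs(policy_param - episode)

def evolve(episodes, generations=50, pop_size=20, sigma=0.3, seed=0):
    rng = random.Random(seed)
    theta = 0.0  # scalar policy parameter for illustration
    for _ in range(generations):
        candidates = [theta + rng.gauss(0, sigma) for _ in range(pop_size)]
        scored = [(sum(virtual_rollout(c, e) for e in episodes), c)
                  for c in candidates]
        theta = max(scored)[1]  # keep the best-scoring candidate
    return theta
```

The point of the virtual episodes is visible in the loop: the 1,000 rollouts per generation cost nothing, whereas collecting them with real participants would be prohibitive.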
Smart Cane: Assistive Cane for Visually-impaired People
This paper reports on a study that helps visually-impaired people to walk
more confidently. The study hypothesizes that a smart cane that alerts
visually-impaired people to obstacles in front of them could help them walk
with fewer accidents. The aim of the paper is to describe the development of
a cane that communicates with its users through voice alerts and vibration,
named Smart Cane. The development work involves coding and physical
installation. A series of tests was carried out on the smart cane, and the
results are discussed. This study found that the Smart Cane functions as
intended, alerting users to obstacles in front of them. Comment: 6 pages
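The two-channel alert the abstract mentions reduces to a distance-threshold rule. A minimal sketch, assuming a generic ultrasonic range reading in centimetres; the thresholds and function names are illustrative, not taken from the paper:

```python
# Hypothetical feedback selector for a smart cane: nearer obstacles
# escalate from vibration to a voice alert (thresholds are assumptions).

def cane_alert(distance_cm, vibrate_below=100, voice_below=50):
    """Return which feedback channel to drive for a given obstacle range."""
    if distance_cm < voice_below:
        return "voice"      # closest: spoken warning
    if distance_cm < vibrate_below:
        return "vibrate"    # mid-range: haptic warning
    return "none"           # clear path
```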
Overcoming barriers and increasing independence: service robots for elderly and disabled people
This paper discusses the potential for service robots to overcome barriers and increase independence of
elderly and disabled people. It includes a brief overview of the existing uses of service robots by disabled and elderly
people and advances in technology which will make new uses possible and provides suggestions for some of these new
applications. The paper also considers the design and other conditions to be met for user acceptance, discusses the
complementarity of assistive service robots and personal assistance, and considers the types of applications and
users for which service robots are and are not suitable.
State of the art review on walking support system for visually impaired people
Technology for terrain detection and walking support for blind people has
improved rapidly over the last couple of decades, although efforts to assist visually
impaired people started long ago. Currently, a variety of portable and wearable navigation
systems are available on the market to help blind people navigate their way in local or
remote areas. The category covered in this work can be subgrouped into electronic travel
aids (ETAs), electronic orientation aids (EOAs) and position locator devices (PLDs);
however, we focus mainly on electronic travel aids (ETAs). This paper presents a
comparative survey of various portable and wearable walking support systems (a
subcategory of ETAs, or early-stage ETAs), with informative descriptions of their working
principles, advantages and disadvantages, so that researchers can easily gauge the current
state of assistive technology for the blind, along with the requirements for optimising the
design of walking support systems for their users
Using remote vision: The effects of video image frame rate on visual object recognition performance
This is the author's accepted manuscript; the final published article is available from the link below. © 2010 IEEE. The process of using remote vision was simulated in order to determine the effect of video image frame rate on performance in the visual recognition of stationary environmental hazards in dynamic video footage of the pedestrian travel environment. Recognition performance was assessed at two frame rates, 25 and 2 fps, against a range of objective and subjective criteria. The results show that the effect of frame rate on performance is statistically insignificant. This paper is part of the development of a novel system for navigation of visually impaired pedestrians. The navigation system includes a remote vision facility, and visual recognition of environmental hazards by the sighted human guide is a basic activity in aiding the visually impaired user of the system in mobility
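The reported conclusion ("statistically insignificant") rests on a two-sample comparison of recognition scores at the two frame rates. A hedged sketch of that kind of analysis, computing Welch's t statistic by hand; the scores in the test are invented, not the study's data:

```python
# Welch's t statistic for two independent samples with unequal variances
# (illustrative; a full analysis would also compute degrees of freedom
# and a p-value, e.g. via scipy.stats.ttest_ind with equal_var=False).
import math

def welch_t(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))
```

A t value near zero (relative to the critical value for the degrees of freedom) is what "statistically insignificant" means here: the 25 fps and 2 fps score distributions cannot be distinguished.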
Assisting Blind People’s Independence in Public Spaces
Doctoral thesis, Doctor of Engineering, Waseda University (degree number: Shin 9136)
Mobile assistive technologies for the visually impaired
There are around 285 million visually impaired people worldwide, and around 370,000 people are registered as blind or partially sighted in the UK. Ongoing advances in information technology (IT) are increasing the scope for IT-based mobile assistive technologies to facilitate the independence, safety, and improved quality of life of the visually impaired. Research is being directed at making mobile phones and other handheld devices accessible via our haptic (touch) and audio sensory channels. We review research and innovation within the field of mobile assistive technology for the visually impaired and, in so doing, highlight the need for successful collaboration between clinical expertise, computer science, and domain users to realize fully the potential benefits of such technologies. We initially reflect on research that has been conducted to make mobile phones more accessible to people with vision loss. We then discuss innovative assistive applications designed for the visually impaired that are either delivered via mainstream devices and can be used while in motion (e.g., mobile phones) or are embedded within an environment that may be in motion (e.g., public transport) or within which the user may be in motion (e.g., smart homes)
A Navigation and Augmented Reality System for Visually Impaired People
In recent years, we have witnessed impressive advances in augmented reality systems and computer vision algorithms based on image processing and artificial intelligence. Thanks to these technologies, mainstream smartphones are able to estimate their own motion in 3D space with high accuracy. In this paper, we exploit such technologies to support the autonomous mobility of people with visual disabilities, identifying pre-defined virtual paths and providing context information, reducing the distance between the digital and real worlds. In particular, we present ARIANNA+, an extension of ARIANNA, a system explicitly designed for visually impaired people for indoor and outdoor localization and navigation. While ARIANNA assumes that landmarks, such as QR codes, and physical paths (composed of colored tapes, painted lines, or tactile pavings) are deployed in the environment and recognized by the camera of a common smartphone, ARIANNA+ eliminates the need for any physical support thanks to the ARKit library, which we exploit to build a completely virtual path. Moreover, ARIANNA+ lets users interact more richly with the surrounding environment through convolutional neural networks (CNNs) trained to recognize objects or buildings, enabling access to content associated with them. By using a common smartphone as a mediation instrument with the environment, ARIANNA+ leverages augmented reality and machine learning to enhance physical accessibility. The proposed system allows visually impaired people to navigate easily in indoor and outdoor scenarios simply by loading a previously recorded virtual path, with automatic guidance provided along the route through haptic, speech, and sound feedback
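The core guidance loop of such a system (not the ARIANNA+ code; the waypoint representation and threshold below are assumptions) boils down to comparing the tracker's position estimate against a recorded virtual path and choosing a feedback cue:

```python
# Illustrative sketch: a virtual path is a list of recorded 2-D waypoints;
# the phone's tracked position is checked against the nearest waypoint to
# decide whether to cue the user back toward the path.
import math

def nearest_path_distance(pos, path):
    """Distance from pos to the closest recorded waypoint on the path."""
    return min(math.dist(pos, p) for p in path)

def feedback(pos, path, on_path=0.3):
    """Return the cue to render (haptic/speech/sound in a real system)."""
    if nearest_path_distance(pos, path) <= on_path:
        return "on_path"
    return "off_path"
```

In the real system the position would come from ARKit's world tracking and the cue would be rendered as vibration or audio, but the decision logic is this nearest-point test.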