6,026 research outputs found

    A Low-Cost Tele-Presence Wheelchair System

    This paper presents the architecture and implementation of a tele-presence wheelchair system that combines tele-presence robot, intelligent wheelchair, and touch-screen technologies. The system consists of a commercial electric wheelchair, an add-on tele-presence interaction module, and a touchable live-video-image user interface (called the TIUI). The tele-presence interaction module provides video chatting between an elderly or disabled person and their family members or caregivers, and also captures live video of the environment for tele-operation and semi-autonomous navigation. The user interface, developed in our lab, allows an operator to access the system from anywhere and to directly touch the live video image of the wheelchair to push it, as if he/she were doing so in person. This paper also discusses the evaluation of the user experience.
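    The abstract does not describe how a touch on the live video image is turned into wheelchair motion; the snippet below is only a minimal sketch of one plausible mapping from a touch point to linear and angular velocity commands. The function name, gains, and sign conventions are assumptions for illustration, not the TIUI's actual control law.

    ```python
    # Hypothetical sketch: map a touch point on the live video image to
    # wheelchair velocity commands. Gains and conventions are assumptions.

    def touch_to_velocity(touch_x, touch_y, img_width, img_height,
                          max_linear=0.5, max_angular=1.0):
        """Convert a touch point (pixels) into (linear, angular) velocities.

        Touching above the image centre drives forward, below drives backward;
        horizontal offset from the centre steers left or right.
        """
        # Normalise to [-1, 1] with the image centre as the origin.
        nx = (touch_x - img_width / 2.0) / (img_width / 2.0)
        ny = (img_height / 2.0 - touch_y) / (img_height / 2.0)

        linear = max_linear * ny       # forward/backward speed
        angular = -max_angular * nx    # positive = turn left (assumed convention)
        return linear, angular


    # Example: a touch in the upper-right quadrant of a 640x480 frame
    lin, ang = touch_to_velocity(480, 120, 640, 480)
    print(f"linear={lin:.2f} m/s, angular={ang:.2f} rad/s")
    ```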

    Overcoming barriers and increasing independence: service robots for elderly and disabled people

    This paper discusses the potential for service robots to overcome barriers and increase the independence of elderly and disabled people. It gives a brief overview of existing uses of service robots by disabled and elderly people, outlines technological advances that will make new uses possible, and suggests some of these new applications. The paper also considers the design and other conditions to be met for user acceptance, discusses the complementarity of assistive service robots and personal assistance, and considers the types of applications and users for which service robots are and are not suitable.

    Collaborative Control for a Robotic Wheelchair: Evaluation of Performance, Attention, and Workload

    Powered wheelchair users often struggle to drive safely and effectively, and in more critical cases can only get around when accompanied by an assistant. To address these issues, we propose a collaborative control mechanism that assists the user as and when they require help. The system uses a multiple-hypothesis method to predict the driver's intentions and, if necessary, adjusts the control signals to achieve the desired goal safely. The main emphasis of this paper is on a comprehensive evaluation, where we not only look at the system performance but, perhaps more importantly, characterise the user performance in an experiment that combines eye-tracking with a secondary task. Without assistance, participants experienced multiple collisions whilst driving around the predefined route. Conversely, when they were assisted by the collaborative controller, not only did they drive more safely, but they were also able to pay less attention to their driving, resulting in a reduced cognitive workload. We discuss the importance of these results and their implications for other applications of shared control, such as brain-machine interfaces, where it could be used to compensate for both the low frequency and the low resolution of the user input.
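    The abstract does not give the details of the intention prediction or the arbitration between user and controller; the sketch below illustrates a generic shared-control scheme of this kind, with a Bayesian update over candidate goals and a linear blend between the user's joystick command and an autonomous command toward the most likely goal. All names, priors, and gains are hypothetical and not taken from the paper.

    ```python
    import numpy as np

    # Generic shared-control sketch (not the paper's controller): infer the most
    # likely goal from the joystick direction, then blend user and autonomous
    # commands in proportion to the prediction confidence.

    def update_goal_beliefs(beliefs, goals, position, user_dir, kappa=4.0):
        """Bayesian update: goals aligned with the user's input gain probability."""
        beliefs = beliefs.copy()
        for i, goal in enumerate(goals):
            to_goal = goal - position
            to_goal = to_goal / (np.linalg.norm(to_goal) + 1e-9)
            alignment = float(np.dot(to_goal, user_dir))   # cosine of the angle
            beliefs[i] *= np.exp(kappa * alignment)        # likelihood weighting
        return beliefs / beliefs.sum()

    def blend_commands(user_cmd, auto_cmd, confidence):
        """Linear arbitration: the more confident the prediction, the more help."""
        return (1.0 - confidence) * user_cmd + confidence * auto_cmd

    # Example: two candidate goals, user pushes the joystick roughly toward goal 0
    goals = [np.array([5.0, 0.0]), np.array([0.0, 5.0])]
    beliefs = np.array([0.5, 0.5])
    position = np.array([0.0, 0.0])
    user_dir = np.array([0.9, 0.1]) / np.linalg.norm([0.9, 0.1])

    beliefs = update_goal_beliefs(beliefs, goals, position, user_dir)
    best = int(np.argmax(beliefs))
    auto_cmd = (goals[best] - position) / np.linalg.norm(goals[best] - position)
    cmd = blend_commands(user_dir, auto_cmd, confidence=beliefs[best])
    print("beliefs:", beliefs, "blended command:", cmd)
    ```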

    Brain-Switches for Asynchronous Brain-Computer Interfaces: A Systematic Review

    A brain-computer interface (BCI) has been extensively studied to develop a novel communication system for disabled people using their brain activities. An asynchronous BCI system is more realistic and practical than a synchronous one, in that BCI commands can be generated whenever the user wants. However, the relatively low performance of asynchronous BCI systems is problematic because additional BCI commands are required to correct false-positive operations. To significantly reduce the number of false-positive operations of an asynchronous BCI system, a two-step approach has been proposed using a brain-switch that first determines whether the user intends to use the asynchronous BCI system before the system itself is operated. This study presents a systematic review of state-of-the-art brain-switch techniques and future research directions. To this end, we reviewed brain-switch research articles published from 2000 to 2019 in terms of their (a) neuroimaging modality, (b) paradigm, (c) operation algorithm, and (d) performance.
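    The two-step idea described in the abstract can be illustrated by gating an asynchronous command decoder behind a brain-switch; the sketch below is a generic illustration with stand-in stub classifiers, not any specific system from the review, and the threshold value is an assumption.

    ```python
    # Hypothetical two-step brain-switch gating for an asynchronous BCI.
    # Step 1: a brain-switch decides whether the user intends to issue a command.
    # Step 2: only then is the command decoder's output forwarded to the device.
    # The classifier objects below are stand-in stubs, not real decoders.

    class DummySwitch:
        def predict_proba(self, epoch):
            # Stand-in: pretend strong activity means "user wants to control".
            return 0.9 if max(epoch) > 1.0 else 0.1

    class DummyDecoder:
        def predict(self, epoch):
            # Stand-in: the sign of the summed features selects the command.
            return "turn_left" if sum(epoch) > 0 else "turn_right"

    def process_epoch(epoch, brain_switch, decoder, switch_threshold=0.8):
        """Return a wheelchair command, or None while the user is idle."""
        if brain_switch.predict_proba(epoch) < switch_threshold:
            return None                   # idle state: suppress false positives
        return decoder.predict(epoch)     # asynchronous command decoding

    # Example usage on two fake EEG feature epochs
    print(process_epoch([0.1, -0.2, 0.3], DummySwitch(), DummyDecoder()))  # None
    print(process_epoch([1.5, 0.4, 0.2], DummySwitch(), DummyDecoder()))   # turn_left
    ```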

    BCI-Based Navigation in Virtual and Real Environments

    A brain-computer interface (BCI) is a system that enables people to control an external device with their brain activity, without the need for any muscular activity. Researchers in the BCI field aim to develop applications to improve the quality of life of severely disabled patients, for whom a BCI can be a useful channel for interaction with their environment. Some of these systems are intended to control a mobile device (e.g. a wheelchair). Virtual reality is a powerful tool that can provide subjects with an opportunity to train and to test different applications in a safe environment. This technical review focuses on systems aimed at navigation, both in virtual and real environments. This work was partially supported by the Innovation, Science and Enterprise Council of the Junta de Andalucía (Spain), project P07-TIC-03310, the Spanish Ministry of Science and Innovation, project TEC 2011-26395, and by the European fund ERDF.

    Vision based interface system for hands free control of an intelligent wheelchair

    Background: Due to the shift in the age structure of today's populations, the need to develop devices and technologies to support elderly and disabled people has been increasing. Traditionally, the wheelchair, whether powered or manual, is the most popular and important rehabilitation/assistive device for the disabled and the elderly. However, it remains highly restricted, especially for the severely disabled. As a solution, intelligent wheelchairs (IWs) have received considerable attention as mobility aids. The purpose of this work is to develop an IW interface that provides a more convenient and efficient interface for people with disabilities in their limbs. Methods: This paper proposes an intelligent wheelchair (IW) control system for people with various disabilities. To accommodate a wide range of user abilities, the proposed system uses face-inclination and mouth-shape information: the direction of the IW is determined by the inclination of the user's face, while proceeding and stopping are determined by the shape of the user's mouth. The system is composed of an electric powered wheelchair, a data acquisition board, ultrasonic/infrared sensors, a PC camera, and a vision system. The vision system analyses the user's gestures in three stages: detector, recognizer, and converter. In the detector, the facial region of the intended user is first obtained using AdaBoost, and the mouth region is then detected based on edge information. The extracted features are sent to the recognizer, which recognizes the face inclination and mouth shape using statistical analysis and K-means clustering, respectively. These recognition results are then delivered to the converter to control the wheelchair. Results and conclusion: The advantages of the proposed system include 1) accurate recognition of the user's intention with minimal user motion and 2) robustness to cluttered backgrounds and time-varying illumination. To demonstrate these advantages, the proposed system was tested with 34 users in indoor and outdoor environments and compared with other systems; the results showed that the proposed system outperforms the others in terms of speed and accuracy, and thus provides a friendly and convenient interface for severely disabled people.
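    As a rough illustration of the face-inclination cue described above, the sketch below estimates head inclination from the line joining the two eye centres, using stock OpenCV Haar cascades rather than the paper's AdaBoost/K-means pipeline. The angle threshold and the left/right mapping are assumptions, and the mouth-shape go/stop step is omitted.

    ```python
    import cv2
    import numpy as np

    # Sketch of a face-inclination steering cue with stock OpenCV cascades.
    # Thresholds and the tilt-to-steering mapping are illustrative assumptions.

    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    eye_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_eye.xml")

    def steering_from_frame(frame, angle_threshold_deg=15.0):
        """Return 'left', 'right', or 'straight' from the head inclination."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        if len(faces) == 0:
            return "straight"                      # no face found: do not steer

        x, y, w, h = faces[0]
        eyes = eye_cascade.detectMultiScale(gray[y:y + h, x:x + w])
        if len(eyes) < 2:
            return "straight"

        # Head inclination = angle of the line joining the two eye centres.
        left, right = sorted(eyes[:2], key=lambda e: e[0])  # order by x position
        c1 = (left[0] + left[2] / 2.0, left[1] + left[3] / 2.0)
        c2 = (right[0] + right[2] / 2.0, right[1] + right[3] / 2.0)
        angle = np.degrees(np.arctan2(c2[1] - c1[1], c2[0] - c1[0]))

        if angle > angle_threshold_deg:
            return "right"
        if angle < -angle_threshold_deg:
            return "left"
        return "straight"
    ```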

    Implementation of target tracking in Smart Wheelchair Component System

    Independent mobility is critical to individuals of any age. While the needs of many individuals with disabilities can be satisfied with power wheelchairs, some members of the disabled community find it difficult or impossible to operate a standard power wheelchair. This population includes, but is not limited to, individuals with low vision, visual field neglect, spasticity, tremors, or cognitive deficits. To meet the needs of this population, our group is developing cost-effective, modularly designed Smart Wheelchairs. Our objective is to develop an assistive navigation system that seamlessly integrates into the lifestyle of individuals with disabilities and provides safe and independent mobility and navigation without imposing an excessive physical or cognitive load. The Smart Wheelchair Component System (SWCS) can be added to a variety of commercial power wheelchairs with minimal modification to provide navigation assistance. Previous versions of the SWCS used acoustic and infrared rangefinders to identify and avoid obstacles, but these sensors do not lend themselves to many desirable higher-level behaviors. To achieve these higher-level behaviors, we integrated a Continuously Adaptive Mean Shift (CAMSHIFT) target tracking algorithm into the SWCS, along with the Minimal Vector Field Histogram (MVFH) obstacle avoidance algorithm. The target tracking algorithm provides the basis for two distinct operating modes: (1) a "follow-the-leader" mode and (2) a "move to stationary target" mode. The ability to track a stationary or moving target makes smart wheelchairs more useful as a mobility aid, and is also expected to be useful for wheeled mobility training and evaluation. In addition to wheelchair users themselves, the caregivers, clinicians, and transporters who assist them will also benefit, because safe and independent mobility reduces the level of assistance that wheelchair users need.
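    CAMSHIFT tracking is available in standard OpenCV; the sketch below shows the usual histogram back-projection plus cv2.CamShift loop on which a "follow-the-leader" mode could be built. The camera index and the initial target window are placeholders, and the SWCS-specific wheelchair control and MVFH obstacle-avoidance steps are omitted.

    ```python
    import cv2
    import numpy as np

    # CAMSHIFT tracking sketch with stock OpenCV, as one plausible basis for a
    # "follow-the-leader" mode. The initial target window is a placeholder; a
    # real system would obtain it from a leader/caregiver detection step.

    cap = cv2.VideoCapture(0)                  # assumed camera index
    ok, frame = cap.read()
    x, y, w, h = 300, 200, 80, 120             # placeholder initial target window
    track_window = (x, y, w, h)

    # Hue histogram of the target region, used for back-projection.
    roi = frame[y:y + h, x:x + w]
    hsv_roi = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv_roi, np.array((0., 60., 32.)), np.array((180., 255., 255.)))
    roi_hist = cv2.calcHist([hsv_roi], [0], mask, [180], [0, 180])
    cv2.normalize(roi_hist, roi_hist, 0, 255, cv2.NORM_MINMAX)

    term_crit = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        back_proj = cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1)

        # CAMSHIFT adapts the window size and orientation to the target each frame.
        rot_rect, track_window = cv2.CamShift(back_proj, track_window, term_crit)

        # The tracked centre would feed the steering and obstacle-avoidance layer.
        (cx, cy), (rw, rh), angle = rot_rect
        print(f"target centre: ({cx:.0f}, {cy:.0f})")

    cap.release()
    ```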