
    Head movements based control of an intelligent wheelchair in an indoor environment

    This paper presents a user-friendly human-machine interface (HMI) for hands-free control of an electric powered wheelchair (EPW). Its two operation modes are based on head movements: Mode 1 uses only one head movement to issue commands, while Mode 2 employs four head movements. An EEG device, the Emotiv EPOC, is deployed in this HMI to capture the users' head movements. The proposed HMI is compared with joystick control of an EPW in an indoor environment. The experimental results show that Control Mode 2 can be operated quickly and reliably, achieving a mean completion time of 67.90 seconds across the two subjects. Control Mode 1, although it requires only one head movement, performs worse, with a mean time of 153.20 seconds. These results indicate that the proposed HMI can effectively replace traditional joystick control for disabled and elderly people.
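The contrast between the two modes can be illustrated with a small sketch. This is not the authors' implementation; the movement names, the command set, and the assumption that Mode 1 uses a single movement to step through a scanning menu (which would explain its slower times) are all hypothetical.

```python
# Mode 2 (assumed): each of four head movements maps directly to one command.
MODE2_MAP = {
    "tilt_forward": "go_forward",
    "tilt_back": "stop",
    "tilt_left": "turn_left",
    "tilt_right": "turn_right",
}

def mode2_command(movement):
    """Direct mapping: one movement, one command; unknown input stops."""
    return MODE2_MAP.get(movement, "stop")

# Mode 1 (assumed): a single head movement advances through a scanning menu,
# so reaching a given command can take several movements.
MENU = ["go_forward", "turn_left", "turn_right", "stop"]

class Mode1Scanner:
    def __init__(self):
        self.index = 0

    def on_movement(self):
        """Each detected movement selects the current menu item and advances."""
        cmd = MENU[self.index]
        self.index = (self.index + 1) % len(MENU)
        return cmd
```

Under this assumption, a Mode 2 user issues any command with one movement, while a Mode 1 user may need up to four, consistent with the reported time difference.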

    Overcoming barriers and increasing independence: service robots for elderly and disabled people

    This paper discusses the potential for service robots to overcome barriers and increase the independence of elderly and disabled people. It includes a brief overview of the existing uses of service robots by disabled and elderly people, surveys advances in technology that will make new uses possible, and provides suggestions for some of these new applications. The paper also considers the design and other conditions to be met for user acceptance, discusses the complementarity of assistive service robots and personal assistance, and considers the types of applications and users for which service robots are and are not suitable.

    Collaborative Control for a Robotic Wheelchair: Evaluation of Performance, Attention, and Workload

    Powered wheelchair users often struggle to drive safely and effectively and in more critical cases can only get around when accompanied by an assistant. To address these issues, we propose a collaborative control mechanism that assists the user as and when they require help. The system uses a multiple-hypotheses method to predict the driver's intentions and, if necessary, adjusts the control signals to achieve the desired goal safely. The main emphasis of this paper is on a comprehensive evaluation, where we not only look at the system performance but, perhaps more importantly, characterise the user performance, in an experiment that combines eye-tracking with a secondary task. Without assistance, participants experienced multiple collisions whilst driving around the predefined route. Conversely, when they were assisted by the collaborative controller, not only did they drive more safely, but they were able to pay less attention to their driving, resulting in a reduced cognitive workload. We discuss the importance of these results and their implications for other applications of shared control, such as brain-machine interfaces, where it could be used to compensate for both the low frequency and the low resolution of the user input.
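The core of such a collaborative controller can be sketched as two steps: pick the most probable goal from the multiple hypotheses, then blend the user's command with a safe assistive command. This minimal sketch assumes a simple linear blend and illustrative goal probabilities; it is not the authors' actual controller.

```python
def most_likely_goal(hypotheses):
    """Pick the goal with the highest probability from a dict of
    {goal_name: probability} hypotheses."""
    return max(hypotheses, key=hypotheses.get)

def blend(user_cmd, safe_cmd, assistance_level):
    """Blend user and assistive (linear, angular) velocity commands.

    assistance_level: 0.0 = pure user control, 1.0 = full assistance.
    """
    a = max(0.0, min(1.0, assistance_level))
    return tuple((1 - a) * u + a * s for u, s in zip(user_cmd, safe_cmd))
```

For example, with hypotheses `{"doorway": 0.7, "desk": 0.2, "corridor": 0.1}` the controller would assist toward the doorway, and `blend((0.8, 0.3), (0.4, 0.0), 0.5)` would halve both the forward speed and the turn rate toward the safe trajectory.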

    Vision-based interface applied to assistive robots

    This paper presents two vision-based interfaces for disabled people to command a mobile robot for personal assistance. The developed interfaces can be subdivided according to the algorithm of image processing implemented for the detection and tracking of two different body regions. The first interface detects and tracks movements of the user's head, and these movements are transformed into linear and angular velocities in order to command a mobile robot. The second interface detects and tracks movements of the user's hand, and these movements are similarly transformed. In addition, this paper also presents the control laws for the robot. The experimental results demonstrate good performance and balance between complexity and feasibility for real-time applications.
    Author affiliations:
    Pérez Berenguer, María Elisa: Universidad Nacional de San Juan, Facultad de Ingeniería, Departamento de Electrónica y Automática, Gabinete de Tecnología Médica, Argentina; Consejo Nacional de Investigaciones Científicas y Técnicas, Argentina
    Soria, Carlos Miguel: Consejo Nacional de Investigaciones Científicas y Técnicas, Centro Científico Tecnológico Conicet - San Juan, Instituto de Automática; Universidad Nacional de San Juan, Facultad de Ingeniería, Instituto de Automática, Argentina
    López Celani, Natalia Martina: Universidad Nacional de San Juan, Facultad de Ingeniería, Departamento de Electrónica y Automática, Gabinete de Tecnología Médica, Argentina; Consejo Nacional de Investigaciones Científicas y Técnicas, Argentina
    Nasisi, Oscar Herminio: Universidad Nacional de San Juan, Facultad de Ingeniería, Instituto de Automática, Argentina
    Mut, Vicente Antonio: Consejo Nacional de Investigaciones Científicas y Técnicas, Centro Científico Tecnológico Conicet - San Juan, Instituto de Automática; Universidad Nacional de San Juan, Facultad de Ingeniería, Instituto de Automática, Argentina
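The mapping from tracked head (or hand) movement to robot velocities can be sketched as a proportional controller on the tracked region's pixel displacement. The gains, the dead zone, and the sign conventions below are assumptions for illustration, not the paper's control laws.

```python
def head_to_velocity(dx, dy, k_lin=0.01, k_ang=0.02, dead_zone=5):
    """Convert tracked head displacement in pixels (dx to the right,
    dy upward, relative to a neutral pose) into (linear, angular)
    velocity commands, ignoring small jitter inside the dead zone.

    Gains k_lin/k_ang and the 5-pixel dead zone are illustrative values.
    """
    linear = k_lin * dy if abs(dy) > dead_zone else 0.0
    angular = -k_ang * dx if abs(dx) > dead_zone else 0.0
    return linear, angular
```

A head held still near the neutral pose yields zero velocity, while tilting forward (larger dy) increases linear speed proportionally.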

    Vision based interface system for hands free control of an intelligent wheelchair

    Background: Due to the shift in the age structure of today's populations, the need for devices and technologies to support elderly and disabled people has been increasing. Traditionally, the wheelchair, whether powered or manual, is the most popular and important rehabilitation/assistive device for the disabled and the elderly. However, it remains highly restrictive, especially for the severely disabled. As a solution, intelligent wheelchairs (IWs) have received considerable attention as mobility aids. The purpose of this work is to develop an IW interface that provides a more convenient and efficient interface for people with disabilities in their limbs.
    Methods: This paper proposes an intelligent wheelchair (IW) control system for people with various disabilities. To accommodate a wide variety of user abilities, the proposed system uses face-inclination and mouth-shape information: the direction of the IW is determined by the inclination of the user's face, while proceeding and stopping are determined by the shape of the user's mouth. The system is composed of an electric powered wheelchair, a data-acquisition board, ultrasonic/infrared sensors, a PC camera, and a vision system. The vision system analyzes the user's gestures in three stages: detector, recognizer, and converter. The detector first locates the facial region of the intended user using AdaBoost and then detects the mouth region based on edge information. The extracted features are sent to the recognizer, which recognizes the face inclination and mouth shape using statistical analysis and K-means clustering, respectively. These recognition results are then delivered to the converter to control the wheelchair.
    Results and conclusion: The advantages of the proposed system include 1) accurate recognition of the user's intention with minimal user motion and 2) robustness to cluttered backgrounds and time-varying illumination. To demonstrate these advantages, the proposed system was tested with 34 users in indoor and outdoor environments and compared with other systems; the results showed that the proposed system outperforms them in both speed and accuracy. Thus, the proposed system provides a friendly and convenient interface for severely disabled people.
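The detector-recognizer-converter pipeline can be sketched schematically. Real face and mouth detection (the AdaBoost and edge-based stages) is stubbed out here, and the thresholds, the open-mouth-means-go rule, and the command names are all assumptions, not the paper's actual values.

```python
def recognize_face_inclination(angle_deg, threshold=15):
    """Recognizer stage 1: classify face inclination (degrees from
    vertical, assumed sign convention) into a steering direction."""
    if angle_deg > threshold:
        return "right"
    if angle_deg < -threshold:
        return "left"
    return "straight"

def recognize_mouth_shape(aspect_ratio, open_threshold=0.5):
    """Recognizer stage 2: an open mouth (height/width ratio above the
    assumed threshold) means 'go', a closed mouth means 'stop'."""
    return "go" if aspect_ratio > open_threshold else "stop"

def convert(direction, motion):
    """Converter stage: combine both recognition results into one
    wheelchair command; stopping always wins."""
    return "stop" if motion == "stop" else f"go_{direction}"
```

For example, a face tilted 20 degrees with an open mouth would yield `go_right`, while a closed mouth stops the chair regardless of inclination.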

    Knowing when to assist: Developmental issues in lifelong assistive robotics

    Children and adults with sensorimotor disabilities can significantly increase their autonomy through the use of assistive robots. As the field progresses from short-term, task-specific solutions to long-term, adaptive ones, new challenges are emerging. In this paper, a lifelong methodological approach is presented that attempts to balance the immediate, context-specific needs of the user with the long-term effects that the robot's assistance can potentially have on the user's developmental trajectory.

    Wheelchair control by head motion

    Electric wheelchairs are designed to aid paraplegics. Unfortunately, they cannot be used by persons with a higher degree of impairment, such as quadriplegics, i.e. persons who, due to age or illness, cannot move any body part except the head. Medical devices designed to help them are very complicated, rare, and expensive. In this paper, a microcontroller system that enables control of a standard electric wheelchair by head motion is presented. The system comprises electronic and mechanical components. A novel head-motion recognition technique based on accelerometer data processing is designed. The wheelchair joystick is controlled by the system's mechanical actuator. The system can be used with several different types of standard electric wheelchairs. It is tested and verified through an experiment described in this paper.
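Accelerometer-based head-motion recognition of this kind typically classifies the gravity vector into tilt directions. This is a minimal sketch in that spirit; the axis conventions, units (g), and the 20-degree threshold are assumptions, not the paper's design.

```python
import math

def classify_tilt(ax, ay, az, threshold_deg=20):
    """Classify head pose from accelerometer gravity components (in g)
    into a joystick direction: pitch gives forward/back, roll gives
    left/right; small tilts are treated as neutral."""
    pitch = math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, math.sqrt(ax * ax + az * az)))
    if pitch > threshold_deg:
        return "forward"
    if pitch < -threshold_deg:
        return "back"
    if roll > threshold_deg:
        return "right"
    if roll < -threshold_deg:
        return "left"
    return "neutral"
```

The returned direction would then drive the mechanical actuator that deflects the wheelchair's joystick; in practice a debounce or dwell time would be added so that brief, unintentional head motions are ignored.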

    IoT-based smart wheelchair system for physically impaired person / Muhammad Afiq Mohd Aizam... [et al.]

    Disabled persons usually require an assistant to help them in their daily routines, especially for their mobility. Physical impairment reduces the quality of life in executing daily routines, especially for wheelchair users. Pushing a wheelchair has its own side effects for the user, particularly for persons with hand and arm impairments. This paper aims to develop a smart wheelchair system integrated with home automation. With the advent of the Internet of Things (IoT), the smart wheelchair can be operated by voice command through the Google Assistant Software Development Kit (SDK). The smart wheelchair system and the home automation of this study were powered by a Raspberry Pi 3 B+ and a NodeMCU, respectively. Voice input commands were processed by the Google Assistant Artificial Intelligence Yourself (AIY) to steer the movement of the wheelchair. Users were also able to ask Google Assistant for information from the web. For the safety of the user, a streaming camera was mounted on the wheelchair. A further improvement is the integration with home automation, which helps the impaired person control home appliances through the Blynk application. Observations on three voice tones (low, medium, and high) of voice command show that the minimum voice intensity for this smart wheelchair system is 68.2 dB. In addition, the user is required to produce a clear voice command to increase the system's accuracy.
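The command-routing logic such a system needs can be sketched as a simple dispatcher that rejects speech below the observed 68.2 dB minimum intensity and routes recognized phrases to either the wheelchair or the home-automation side. The phrase lists and return format are hypothetical; only the 68.2 dB figure comes from the abstract.

```python
# Assumed command vocabularies (not taken from the paper).
WHEELCHAIR_CMDS = {"move forward", "move backward", "turn left", "turn right", "stop"}
HOME_CMDS = {"light on", "light off", "fan on", "fan off"}

def dispatch(transcript, intensity_db, min_db=68.2):
    """Route a transcribed voice command, rejecting speech below the
    minimum intensity the study observed for reliable recognition.

    Returns a (target, command) pair, where target is one of
    'wheelchair', 'home', 'unknown', or 'rejected'."""
    if intensity_db < min_db:
        return ("rejected", None)
    phrase = transcript.lower().strip()
    if phrase in WHEELCHAIR_CMDS:
        return ("wheelchair", phrase)
    if phrase in HOME_CMDS:
        return ("home", phrase)
    return ("unknown", phrase)
```

In the described system, the 'wheelchair' branch would drive the Raspberry Pi motor control and the 'home' branch would forward the command to the NodeMCU via Blynk.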