128 research outputs found

    Tongue Control of Upper-Limb Exoskeletons For Individuals With Tetraplegia


    User Based Development and Test of the EXOTIC Exoskeleton: Empowering Individuals with Tetraplegia Using a Compact, Versatile, 5-DoF Upper Limb Exoskeleton Controlled through Intelligent Semi-Automated Shared Tongue Control

    This paper presents the EXOTIC, a novel assistive upper-limb exoskeleton for individuals with complete functional tetraplegia that provides an unprecedented level of versatility and control. The current literature on exoskeletons mainly focuses on the basic technical aspects of exoskeleton design and control, while the context in which these exoskeletons should function is given little or no priority, even though it poses important technical requirements. We considered all sources of design requirements, from the basic technical functions to the real-world practical application. The EXOTIC features: (1) a compact, safe, wheelchair-mountable, easy-to-don-and-doff exoskeleton capable of facilitating multiple highly desired activities of daily living for individuals with tetraplegia; (2) a semi-automated computer-vision guidance system that can be enabled by the user when relevant; (3) a tongue control interface allowing full, volitional, and continuous control over all possible motions of the exoskeleton. The EXOTIC was tested on ten able-bodied individuals and three users with tetraplegia caused by spinal cord injury. During the tests, the EXOTIC succeeded in fully assisting tasks such as drinking and picking up snacks, even for users with complete functional tetraplegia who needed a ventilator. The users confirmed the usability of the EXOTIC.
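The semi-automated shared control described in the abstract (user-enabled computer-vision guidance blended with volitional tongue control) can be sketched as follows. This is a minimal illustration under assumed names and an assumed convex-blending rule; it is not the EXOTIC's actual implementation.

```python
# Hypothetical sketch of semi-automated shared control. The function name,
# the per-DoF command representation, and the blending weight `alpha` are
# assumptions for illustration only.

def shared_control(user_cmd, vision_cmd, guidance_enabled, alpha=0.7):
    """Blend a continuous user (tongue) command with a computer-vision
    guidance command, one value per degree of freedom.

    When guidance is disabled (or unavailable), the user keeps full,
    volitional control; when enabled, the guidance contributes a
    weighted share of each DoF's motion.
    """
    if not guidance_enabled or vision_cmd is None:
        return list(user_cmd)
    # Convex combination per degree of freedom: alpha weights the guidance.
    return [alpha * v + (1 - alpha) * u for u, v in zip(user_cmd, vision_cmd)]
```

One design point such systems must respect is that the user stays in charge: guidance only contributes when explicitly enabled.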

    Eyes-free tongue gesture and tongue joystick control of a five DOF upper-limb exoskeleton for severely disabled individuals

    Spinal cord injury can leave the affected individual severely disabled with a low level of independence and quality of life. Assistive upper-limb exoskeletons are one of the solutions that can enable an individual with tetraplegia (paralysis in both arms and legs) to perform simple activities of daily living by mobilizing the arm. Providing an efficient user interface that can provide full continuous control of such a device—safely and intuitively—with multiple degrees of freedom (DOFs) still remains a challenge. In this study, a control interface for an assistive upper-limb exoskeleton with five DOFs based on an intraoral tongue-computer interface (ITCI) for individuals with tetraplegia was proposed. Furthermore, we evaluated eyes-free use of the ITCI for the first time and compared two tongue-operated control methods, one based on tongue gestures and the other based on dynamic virtual buttons and joystick-like control. Ten able-bodied participants tongue-controlled the exoskeleton for a drinking task with and without visual feedback on a screen in three experimental sessions. As a baseline, the participants performed the drinking task with a standard gamepad. The results showed that it was possible to control the exoskeleton with the tongue even without visual feedback and to perform the drinking task at 65.1% of the speed of the gamepad. In a clinical case study, an individual with tetraplegia further succeeded in fully controlling the exoskeleton and performing the drinking task only 5.6% slower than the able-bodied group. This study demonstrated the first single-modal control interface that can enable individuals with complete tetraplegia to fully and continuously control a five-DOF upper-limb exoskeleton and perform a drinking task after only 2 h of training. The interface was used both with and without visual feedback.
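The joystick-like control method mentioned above can be illustrated with a short sketch: a tongue contact position on a sensor pad is mapped to a 2-D velocity command, with displacement from the pad centre setting direction and speed. The pad geometry, deadzone, and gain here are assumptions, not the ITCI's actual layout.

```python
# Hypothetical sketch of joystick-like tongue control. Normalized pad
# coordinates, the deadzone radius, and the gain are illustrative
# assumptions only.

def tongue_joystick(contact_x, contact_y, center=(0.5, 0.5),
                    deadzone=0.1, gain=1.0):
    """Map a tongue contact position on a sensor pad (normalized 0..1)
    to a 2-D velocity command, joystick style.

    Displacement from the pad centre sets direction and speed; a small
    deadzone around the rest position prevents unintended motion.
    """
    dx = contact_x - center[0]
    dy = contact_y - center[1]
    if (dx * dx + dy * dy) ** 0.5 < deadzone:
        return (0.0, 0.0)  # contact near the centre: no motion
    return (gain * dx, gain * dy)
```

A deadzone of this kind is a common choice in any joystick-style mapping, since it lets the user rest the tongue without issuing commands.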

    Design and evaluation of a noninvasive tongue-computer interface for individuals with severe disabilities

    Tongue-computer interfaces have shown the potential to control assistive devices developed for individuals with severe disabilities. However, current efficient tongue-computer interfaces require invasive methods for attaching the sensor activation unit to the tongue, such as piercing. In this study, we propose a noninvasive tongue-computer interface that avoids such invasive attachment methods. We developed it by integrating an activation unit on a frame and mounting the frame on an inductive tongue-computer interface (ITCI). The users can thus activate the inductive sensors on the interface by positioning the activation unit with their tongue, and they do not need to remount the activation unit before each use. We performed pointing tests for controlling a computer cursor and number-typing tests with two able-bodied participants, one experienced with invasive tongue-computer interfaces and the other with no experience. To evaluate the feasibility and performance of the developed noninvasive system, we measured throughput and movement error for the pointing tasks, and speed and accuracy for the number-typing tasks. Results show that the inexperienced participant achieved results with the developed noninvasive tongue-computer interface similar to those with the current invasive version of the ITCI, while the experienced participant performed better with the invasive tongue-computer interface.

    Semi-Autonomous Control of an Exoskeleton using Computer Vision


    Multimodal interface for an intelligent wheelchair

    Integrated master's thesis. Informatics and Computing Engineering. Universidade do Porto, Faculdade de Engenharia. 201

    Semi-autonomous robotic wheelchair controlled with low-throughput human-machine interfaces

    For a wide range of people with limited upper- and lower-body mobility, interaction with robots remains a challenging problem. Due to various health conditions, they are often unable to use the standard joystick interface with which most wheelchairs are equipped. To accommodate this audience, a number of alternative human-machine interfaces have been designed, such as single-switch, sip-and-puff, and brain-computer interfaces. These are known as low-throughput interfaces, referring to the amount of information that an operator can pass into the machine. Using them to control a wheelchair poses a number of challenges. This thesis makes several contributions towards the design of robotic wheelchairs controlled via low-throughput human-machine interfaces: (1) To improve wheelchair motion control, an adaptive controller with online parameter estimation is developed for a differentially driven wheelchair. (2) A steering control scheme is designed that provides a unified framework integrating different types of low-throughput human-machine interfaces with an obstacle avoidance mechanism. (3) A novel approach to the design of control systems with low-throughput human-machine interfaces is proposed. Based on this approach, a position control scheme for a holonomic robot that aims to probabilistically minimize time to destination is developed and tested in simulation, and the scheme is then adapted for a real differentially driven wheelchair. In contrast to other methods, the proposed scheme allows the use of prior information about the user's habits without restricting navigation to a set of pre-defined points, and it parallelizes inference and motion, reducing navigation time. (4) To enable real-time operation of the position control, a high-performance algorithm for single-source any-angle path planning on a grid is developed. By abandoning the graph model and introducing discrete geometric primitives to represent the propagating wave front, we were able to design a planning algorithm that uses only integer addition and bit shifting. Experiments revealed a significant performance advantage. Several modifications, including optimal and multithreaded implementations, are also presented.
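The probabilistic position-control idea in contribution (3) — inferring the intended destination from sparse, low-throughput inputs while already moving — can be sketched as a Bayesian belief update over candidate destinations. The likelihood model and the prior here are illustrative assumptions, not the thesis's actual scheme.

```python
# Hypothetical sketch of destination inference for a low-throughput
# interface. The prior (e.g. from user habits) and the likelihood of each
# input under each destination hypothesis are assumed inputs.

def update_destinations(prior, likelihoods):
    """Bayesian update of a belief over candidate destinations.

    `prior` is the current probability of each destination; `likelihoods`
    gives the probability of the latest low-throughput input under each
    hypothesis. The wheelchair can begin moving toward the most probable
    destination while further inputs refine the belief, which is how
    inference and motion can proceed in parallel.
    """
    posterior = [p * l for p, l in zip(prior, likelihoods)]
    total = sum(posterior)
    if total == 0:
        return list(prior)  # uninformative input: keep the prior belief
    return [p / total for p in posterior]
```

For example, a habit-based prior of [0.5, 0.3, 0.2] combined with an input that strongly favours the first destination sharpens the belief toward it after a single update.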