
    Biosignal‐based human–machine interfaces for assistance and rehabilitation: a survey

    By definition, a Human–Machine Interface (HMI) enables a person to interact with a device. Starting from elementary equipment, the recent development of novel techniques and unobtrusive devices for biosignal monitoring has paved the way for a new class of HMIs, which take such biosignals as inputs to control various applications. This survey reviews the large literature of the last two decades on biosignal-based HMIs for assistance and rehabilitation, outlining the state of the art and identifying emerging technologies and potential future research trends. PubMed and other databases were searched using specific keywords. The retrieved studies were screened at three levels (title, abstract, full text), and 144 journal papers and 37 conference papers were ultimately included. Four macrocategories were used to classify the biosignals employed for HMI control: biopotentials, muscle mechanical motion, body motion, and their combinations (hybrid systems). The HMIs were also classified by target application into six categories: prosthetic control, robotic control, virtual reality control, gesture recognition, communication, and smart environment control. An ever-growing number of publications has been observed in recent years. Most of the studies (about 67%) pertain to the assistive field, while 20% relate to rehabilitation and 13% to both assistance and rehabilitation. A moderate increase can be observed in studies focusing on robotic control, prosthetic control, and gesture recognition over the last decade, whereas studies on the other targets experienced only a small increase. Biopotentials are no longer the leading control signals, and the use of muscle mechanical motion signals has risen considerably, especially in prosthetic control. Hybrid technologies are promising, as they could lead to higher performance. However, they also increase HMIs' complexity, so their usefulness should be carefully evaluated for the specific application.

    Principles of human movement augmentation and the challenges in making it a reality

    Augmenting the body with artificial limbs controlled concurrently with one's natural limbs has long appeared in science fiction, but recent technological and neuroscientific advances have begun to make this possible. By allowing individuals to achieve otherwise impossible actions, movement augmentation could revolutionize medical and industrial applications and profoundly change the way humans interact with the environment. Here, we construct a taxonomy of movement augmentation based on what is augmented and how it is achieved. With this framework, we analyze augmentation that extends the number of degrees of freedom; discuss critical features of effective augmentation, such as physiological control signals, sensory feedback, and learning, as well as application scenarios; and propose a vision for the field.

    User Intent Detection and Control of a Soft Poly-Limb

    This work presents the integration of user intent detection and control in the development of the fluid-driven, wearable, continuum Soft Poly-Limb (SPL). The SPL exploits the traits of soft robotics to provide safe and compliant mobile manipulation assistance to healthy and impaired users. This wearable system equips the user with an additional limb made of soft materials that can be controlled to produce complex three-dimensional motion in space, like its biological counterparts with hydrostatic muscles. Similar to the elephant trunk, the SPL can manipulate objects using various end effectors, such as suction adhesion or a soft grasper, and can also wrap its entire length around objects for manipulation. User control of the limb is demonstrated using multiple user intent detection modalities. Further, the performance of the SPL is studied by testing its capability to interact safely and closely with a user through a spatial mobility test. Finally, the limb's ability to assist the user is explored through multitasking scenarios and pick-and-place tests with varying mounting locations of the arm around the user's body. The results of these assessments demonstrate the SPL's ability to safely interact with the user while exhibiting promising performance in assisting with a wide variety of tasks, in both work and general living scenarios.
    Masters Thesis, Biomedical Engineering, 201

    Enabling Human-Robot Collaboration via Holistic Human Perception and Partner-Aware Control

    As robotic technology advances, the barriers to the coexistence of humans and robots are slowly coming down. Application domains such as elderly care, collaborative manufacturing, and collaborative manipulation are considered the need of the hour, and progress in robotics holds the potential to address many societal challenges. Future socio-technical systems will consist of a blended workforce with a symbiotic relationship between human and robot partners working collaboratively. This thesis addresses some of the research challenges in enabling human-robot collaboration. In particular, holistic perception of a human partner, continuously communicating their intentions and needs to a robot partner in real time, is crucial for the successful realization of a collaborative task. Towards that end, we present a holistic human perception framework for real-time monitoring of whole-body human motion and dynamics. On the other hand, leveraging assistance from a human partner can lead to improved human-robot collaboration. In this direction, we methodically define what constitutes assistance from a human partner and propose partner-aware robot control strategies to endow robots with the capacity to meaningfully engage in a collaborative task.

    Supernumerary Robotic Fingers as a Therapeutic Device for Hemiparetic Patients

    Patients with hemiparesis often have limited functionality in the left or right hand. The standard therapeutic approach requires the patient to attempt to use the weak hand even though it is not functionally capable, which can result in feelings of frustration. Furthermore, hemiparetic patients also face challenges in completing many bimanual tasks, for example, walker manipulation, that are critical to patients' independence and quality of life. A prototype therapeutic device with two supernumerary robotic fingers was used to determine whether robotic fingers could functionally assist a human in performing bimanual tasks by observing the pose of the healthy hand. Specific focus was placed on identifying a straightforward control routine that would allow a patient to carry out simple manipulation tasks with only intermittent input from a therapist. Part of this routine involved allowing the patient to switch between active and inactive monitoring of hand position, resulting in additional manipulation capabilities. The prototype successfully enabled a test subject to complete various bimanual tasks using the robotic fingers in place of normal hand motions. From these results, it is clear that the device could allow a hemiparetic patient to complete tasks that would previously have been impossible to perform.
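    The active/inactive monitoring routine described above can be illustrated with a minimal sketch. The class, its identity hand-to-finger mapping, and the single flexion value are all hypothetical; the abstract does not specify the device's actual interface.

```python
class FingerController:
    """Hypothetical sketch of the mode-switching idea: while monitoring
    is active, the robotic fingers track a command derived from the
    healthy hand's pose; while inactive, they hold the last command."""

    def __init__(self):
        self.active = True    # actively monitoring the healthy hand
        self.command = 0.0    # last commanded finger flexion (rad)

    def toggle(self):
        """Switch between active and inactive monitoring (patient- or
        therapist-triggered in the described routine)."""
        self.active = not self.active

    def update(self, hand_flexion):
        """Map observed healthy-hand flexion to a finger command.
        The identity mapping is an assumption made for illustration."""
        if self.active:
            self.command = hand_flexion
        return self.command
```

    Freezing the command while inactive is what enables the extra manipulation capability mentioned in the abstract: the fingers can hold a grasp while the healthy hand moves away to do something else.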

    Wearable sensors for human–robot walking together

    Thanks to recent technological improvements that enable novel applications beyond the industrial context, there is growing interest in the use of robots in everyday life. To improve the acceptability of personal service robots, they should seamlessly interact with users, understand their social signals and cues, and respond appropriately. In this context, a few proposals have been presented to make robots and humans navigate together naturally without explicit user control, but no final solution has been achieved yet. To advance toward this end, this paper proposes the use of wearable Inertial Measurement Units (IMUs) to improve human-robot interaction while walking together without physical links and with no restriction on the relative position between the human and the robot. We built a prototype system that provides real-time evaluation of gait parameters for a mobile robot moving together with a human, tested it with 19 human participants in two different tasks, and studied its feasibility and perceived usability. The results show the feasibility of the system, which obtained positive feedback from the users, giving valuable information for the development of a natural interaction system in which the robot perceives human movements by means of wearable sensors.
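    The abstract does not specify how the gait parameters are extracted from the IMUs. A minimal sketch of one common approach, peak detection on the sagittal angular velocity of a shank-mounted gyroscope to obtain stride times, might look as follows; the sensor placement and threshold are assumptions, not the paper's method.

```python
def stride_times(gyro_y, fs, threshold=2.0):
    """Estimate stride times (s) from sagittal-plane angular velocity
    (rad/s) of a shank-mounted gyroscope sampled at fs Hz: mid-swing
    produces one large positive peak per stride. The threshold is
    illustrative; real signals need filtering and tuning."""
    peaks = []
    for i in range(1, len(gyro_y) - 1):
        if (gyro_y[i] > threshold
                and gyro_y[i] >= gyro_y[i - 1]
                and gyro_y[i] > gyro_y[i + 1]):
            peaks.append(i)
    # stride time = interval between consecutive mid-swing peaks
    return [(b - a) / fs for a, b in zip(peaks, peaks[1:])]
```

    From the stride times, cadence and an estimate of walking speed can be derived and streamed to the robot's velocity controller so it keeps pace with the user.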

    A novel approach to user controlled ambulation of lower extremity exoskeletons using admittance control paradigm

    Robotic lower extremity exoskeletons address the ambulatory problems confronting individuals with paraplegia. Paraplegia due to spinal cord injury (SCI) can cause motor deficits in the lower extremities, leading to an inability to walk. Although wheelchairs provide mobility, they do not support all activities of everyday living for individuals with paraplegia. Current research addresses ambulation through the use of wearable exoskeletons that are pre-programmed. There are currently four exoskeletons on the U.S. market: Ekso, ReWalk, REX, and Indego. All of them have 2 active degrees of freedom (DOF), except REX, which has 5 active DOF. All have pre-programmed gait, giving the user the ability to initiate a gait but not to control stride amplitude (height), stride frequency, or stride length, which restricts the user's ability to navigate the different surfaces and obstacles commonly encountered in the community. Most current exoskeletons also lack motors for abduction or adduction to allow movement in the coronal plane, further limiting effective use. This work seeks to overcome these limitations of pre-programmed exoskeletons with an intuitive, real-time, user-controlled mechanism employing admittance control, using hand trajectory as a surrogate for foot trajectory. A preliminary study had subjects control the trajectory of the foot in a virtual environment using their contralateral hand; it showed that hands can produce trajectories similar to human foot trajectories when provided with haptic and visual feedback. A 10-DOF, half-scale biped robot was built to test the control paradigm.
    The robot has 5 DOF on each leg: 2 DOF at the hip for flexion/extension and abduction/adduction, 1 DOF at the knee for flexion, and 2 DOF at the ankle for flexion/extension and inversion/eversion. The control mechanism translates the trajectory of each hand into the trajectory of the ipsilateral foot in real time, giving the user the ability to control each leg in both the sagittal and coronal planes using the admittance control paradigm. The efficiency of the control mechanism was evaluated in a study in which healthy subjects controlled the robot on a treadmill. A trekking pole was attached to each foot of the biped. The subjects controlled the trajectory of the biped's foot by applying small forces, through a force sensor, in the direction of the required movement. The algorithm converted the forces into the Cartesian position of the foot in real time using admittance control; the Cartesian position was then converted into hip and knee joint angles using inverse kinematics. The kinematics, synchrony, and smoothness of the trajectories produced by the biped robot were evaluated at different speeds, with and without obstacles, and compared with typical walking by human subjects on the treadmill. Further, the cognitive load required to control the biped on the treadmill was evaluated, and the effects of speed and obstacles, together with cognitive load, on kinematics, synchrony, and smoothness were analyzed.
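    The force-to-trajectory pipeline described above, admittance control followed by inverse kinematics, can be sketched for a single sagittal-plane axis. This is a minimal illustration, not the thesis's implementation: the mass/damping/stiffness gains, time step, and link lengths are all assumptions.

```python
import math

def admittance_step(force, pos, vel, m=2.0, b=8.0, k=0.0, dt=0.01):
    """One axis of an admittance filter, F = m*a + b*v + k*x: a measured
    hand force is integrated into a foot position command. The gains and
    time step are illustrative, not the thesis's values."""
    acc = (force - b * vel - k * pos) / m
    vel = vel + acc * dt
    pos = pos + vel * dt
    return pos, vel

def leg_ik(x, z, l_thigh=0.4, l_shank=0.4):
    """Planar two-link inverse kinematics for a foot target in the hip
    frame (x forward, z down). Returns hip flexion (from vertical) and
    knee flexion (0 = straight leg). Link lengths are assumptions."""
    d2 = x * x + z * z
    d = math.sqrt(d2)
    if d > l_thigh + l_shank:
        raise ValueError("target out of reach")
    # interior knee angle from the law of cosines, clamped for safety
    cos_int = (l_thigh ** 2 + l_shank ** 2 - d2) / (2 * l_thigh * l_shank)
    knee = math.pi - math.acos(max(-1.0, min(1.0, cos_int)))
    # hip angle: direction to the foot plus the thigh's triangle angle
    alpha = math.atan2(x, z)
    cos_beta = (l_thigh ** 2 + d2 - l_shank ** 2) / (2 * l_thigh * d)
    hip = alpha + math.acos(max(-1.0, min(1.0, cos_beta)))
    return hip, knee
```

    In the described setup this loop would run per leg and also in the coronal plane; only one sagittal-plane axis is shown here.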

    Body-Borne Computers as Extensions of Self

    The opportunities for wearable technologies go well beyond always-available information displays or health-sensing devices. The concept of the cyborg introduced by Clynes and Kline, along with work in various fields of research and the arts, offers a vision of what technology integrated with the body can offer. This paper identifies different categories of research aimed at augmenting humans. It focuses specifically on three areas of augmentation of the human body and its sensorimotor capabilities: physical morphology, skin display, and somatosensory extension. We discuss how such digital extensions relate to the malleable nature of our self-image. We argue that body-borne devices are no longer simply functional apparatus but offer a direct interplay with the mind. Finally, we showcase some of our own projects in this area and shed light on future challenges.