    Preliminary results toward a naturally controlled multi-synergistic prosthetic hand

    Robotic hands that embed human motor-control principles in their mechanical design are attracting increasing interest thanks to their simplicity and robustness, combined with good performance. Another key aspect of these hands is that humans can use them very effectively because their behavior closely resembles that of real hands. Nevertheless, controlling more than one degree of actuation remains a challenging task. In this paper, we take advantage of these characteristics in a multi-synergistic prosthesis. We propose an integrated setup composed of the Pisa/IIT SoftHand 2 and a control strategy that simultaneously and proportionally maps human hand movements to the robotic hand. The control technique is based on a combination of non-negative matrix factorization and linear regression algorithms. It also features real-time, continuous posture compensation of the electromyographic signals based on an IMU. The algorithm is tested on five healthy subjects in a virtual-environment experiment. In a separate experiment, the efficacy of the posture compensation strategy is evaluated on five healthy subjects and, finally, the whole setup is successfully tested in realistic daily-life activities.
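
    The abstract names the two building blocks of the decoder: non-negative matrix factorization (NMF) to extract muscle synergies from EMG, and linear regression to map synergy activations to proportional hand commands. A minimal sketch of such a pipeline with scikit-learn follows; the channel counts, preprocessing, and calibration data are illustrative assumptions, not details from the paper.

```python
# Hedged sketch of an NMF + linear-regression EMG decoder.
# Dimensions and data are placeholders, not values from the paper.
import numpy as np
from sklearn.decomposition import NMF
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
# Assumed calibration data: non-negative EMG envelopes (n_samples x n_channels)
# paired with recorded hand commands (n_samples x n_outputs).
emg = np.abs(rng.normal(size=(1000, 8)))
hand_cmd = rng.normal(size=(1000, 2))

# 1) Factorize EMG into a small set of non-negative muscle synergies.
nmf = NMF(n_components=4, init="nndsvda", max_iter=500)
activations = nmf.fit_transform(emg)          # (n_samples, 4)

# 2) Regress proportional hand commands from synergy activations.
reg = LinearRegression().fit(activations, hand_cmd)

# Online use: new EMG window -> synergy activations -> simultaneous,
# proportional command for the prosthetic hand.
new_emg = np.abs(rng.normal(size=(1, 8)))
command = reg.predict(nmf.transform(new_emg))
```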

    Human-Robot Team Interaction Through Wearable Haptics for Cooperative Manipulation

    The interaction between a robot team and a single human in teleoperation scenarios is beneficial for cooperative tasks, for example the manipulation of heavy and large objects in remote or dangerous environments. The main control challenge of this interaction is its asymmetry, which arises because robot teams have a high number of controllable degrees of freedom compared to the human operator. We therefore propose a control scheme that establishes the interaction on spaces of reduced dimensionality, taking into account the small number of human command and feedback signals imposed by haptic devices. We evaluate the suitability of wearable haptic fingertip devices for multi-contact teleoperation in a user study. The results show that the proposed control approach is appropriate for human-robot team interaction and that wearable haptic fingertip devices provide suitable assistance in cooperative manipulation tasks.
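
    One way to realize interaction on a reduced-dimensional space is to fix a linear basis that lifts the operator's few command signals into the team's joint space and projects team forces back for haptic feedback. The sketch below illustrates that idea; the basis, dimensions, and function names are assumptions for illustration, not the paper's actual scheme.

```python
# Hedged sketch: coupling a low-DoF human command to a high-DoF robot team
# through a fixed linear subspace. All dimensions are assumed for illustration.
import numpy as np

n_team_dof = 18   # e.g., 3 robots x 6 DoF (assumption)
n_human_cmd = 3   # e.g., translation of a haptic handle (assumption)

# Orthonormal basis spanning the coordinated team motions the human drives.
rng = np.random.default_rng(1)
B, _ = np.linalg.qr(rng.normal(size=(n_team_dof, n_human_cmd)))

def team_velocity(human_cmd: np.ndarray) -> np.ndarray:
    """Lift the low-dimensional human command into team joint space."""
    return B @ human_cmd

def haptic_feedback(team_forces: np.ndarray) -> np.ndarray:
    """Project team interaction forces back onto the human command space."""
    return B.T @ team_forces
```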

    A COVID-19 Emergency Response for Remote Control of a Dialysis Machine with Mobile HRI

    Healthcare workers face a high risk of contagion during a pandemic due to their close proximity to patients. The situation is further exacerbated by shortages of personal protective equipment, which increase the risk of exposure for healthcare workers and even for non-pandemic patients, such as those on dialysis. In this study, we propose an emergency, non-invasive remote monitoring and control response system that retrofits dialysis machines with robotic manipulators to safely support the treatment of patients with acute kidney disease. Specifically, as a proof of concept, we mock up the touchscreen instrument control panel of a dialysis machine and live-stream it to a remote user's tablet device. The user then performs touch-based interactions on the tablet to send commands to the robot, which manipulates the instrument controls on the touchscreen of the dialysis machine. To evaluate the performance of the proposed system, we conduct an accuracy test. Moreover, we perform qualitative user studies with two modes of interaction to measure user task load and system usability and to obtain user feedback: a touch-based interaction using a tablet device and a click-based interaction using a computer. The results indicate no statistically significant difference in the relatively low task load experienced by users across the two modes. The system usability survey likewise reveals no statistically significant difference in user experience, except that users experienced more consistent performance with the click-based interaction than with the touch-based interaction. Based on the user feedback, we suggest an improvement to the proposed system and illustrate an implementation that corrects the distorted perception of the instrument control panel live stream for a better, more consistent user experience.
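
    The abstract mentions correcting the distorted perception of the panel live stream. One plausible implementation, sketched below with OpenCV, rectifies the streamed panel with a homography so that a touch on the tablet maps to metric panel coordinates for the robot. The corner coordinates, frames, and function name are hypothetical placeholders, not the paper's implementation.

```python
# Hedged sketch: mapping a tablet touch back to the dialysis machine's panel
# via a homography. All coordinates below are hypothetical placeholders.
import numpy as np
import cv2

# Panel corners as seen in the streamed image (pixels)...
panel_in_stream = np.float32([[112, 80], [1180, 95], [1165, 700], [98, 690]])
# ...and the matching corners in the robot's workspace frame (millimetres).
panel_in_robot = np.float32([[0, 0], [300, 0], [300, 180], [0, 180]])

H = cv2.getPerspectiveTransform(panel_in_stream, panel_in_robot)

def touch_to_robot(u, v):
    """Convert a tablet touch (u, v) in stream pixels to a panel point in mm."""
    pt = cv2.perspectiveTransform(np.float32([[[u, v]]]), H)
    return float(pt[0, 0, 0]), float(pt[0, 0, 1])
```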

    Body swarm interface (BOSI) : controlling robotic swarms using human bio-signals

    Traditionally, robots are controlled using joysticks, keyboards, mice, and other similar human-computer interface (HCI) devices. Although this approach is effective and practical in some cases, it is accessible only to healthy individuals without disabilities, and it requires the user to master the device before use. It also becomes complicated and non-intuitive when multiple robots must be controlled simultaneously with these traditional devices, as in the case of human-swarm interfaces (HSI). This work presents a novel concept: using human bio-signals to control swarms of robots. This concept has two major advantages. First, it gives amputees and people with certain disabilities the ability to control robotic swarms, which has previously not been possible. Second, it gives the user a more intuitive interface for controlling swarms of robots through gestures, thoughts, and eye movement. We measure different bio-signals from the human body, including electroencephalography (EEG), electromyography (EMG), and electrooculography (EOG), using off-the-shelf products. After minimal signal processing, we decode the intended control action using machine-learning techniques such as hidden Markov models (HMM) and k-nearest neighbors (k-NN). We employ formation controllers based on distance and displacement to control the shape and motion of the robotic swarm. Classifications of thoughts and gestures are compared against ground truth, and the resulting pipelines are evaluated in both simulations and hardware experiments with swarms of ground robots and aerial vehicles.
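
    The displacement-based formation controllers the abstract mentions admit a compact consensus-style form: each robot steers to reduce the error between actual and desired relative displacements to its neighbors. Below is a hedged sketch of that idea; the gain, graph topology, and the link from decoded bio-signals to a chosen formation are illustrative assumptions, not the paper's controller.

```python
# Hedged sketch of a displacement-based formation controller.
# Gain and topology are illustrative, not taken from the paper.
import numpy as np

def formation_step(positions, offsets, adjacency, k=1.0):
    """Velocity command per robot: drive inter-robot displacements toward
    the desired offsets of the formation selected by the decoded bio-signal.

    positions: (n, 2) current robot positions
    offsets:   (n, 2) desired positions in the formation frame
    adjacency: (n, n) 0/1 neighbor matrix
    """
    vel = np.zeros_like(positions, dtype=float)
    n = len(positions)
    for i in range(n):
        for j in range(n):
            if adjacency[i, j]:
                desired_ij = offsets[j] - offsets[i]
                actual_ij = positions[j] - positions[i]
                vel[i] += k * (actual_ij - desired_ij)
    return vel
```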

    Gesture-Based Robot Control with Variable Autonomy from the JPL BioSleeve

    This paper presents a new gesture-based human interface for natural robot control. Detailed activity of the user's hand and arm is acquired via a novel device, called the BioSleeve, which packages dry-contact surface electromyography (EMG) sensors and an inertial measurement unit (IMU) into a sleeve worn on the forearm. The BioSleeve's accompanying algorithms can reliably decode as many as sixteen discrete hand gestures and estimate the continuous orientation of the forearm. These gestures and orientations are mapped to robot commands that, to varying degrees, integrate with the robot's perception of its environment and its ability to complete tasks autonomously. This flexible approach enables, for example, supervisory point-to-goal commands, a virtual joystick for guarded teleoperation, and high-degree-of-freedom mimicked manipulation, all from a single device. The BioSleeve is intended for portable field use; unlike other gesture-recognition systems, BioSleeve-based robot control is invariant to lighting conditions, occlusions, and the human-robot spatial relationship, and it does not encumber the user's hands. The BioSleeve control approach has been implemented on three robot types, and we present proof-of-principle demonstrations with mobile ground robots, manipulation robots, and prosthetic hands.
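
    The variable-autonomy idea, in which a decoded gesture selects among command modes of increasing autonomy, can be pictured as a small dispatch layer between the decoder and the robot. The sketch below is illustrative only; the gesture names, modes, and function signatures are hypothetical, not the BioSleeve's actual interface.

```python
# Hedged sketch of a variable-autonomy dispatch layer: a decoded gesture
# selects the command mode. Gesture names and modes are hypothetical.
from enum import Enum, auto

class Mode(Enum):
    POINT_TO_GOAL = auto()          # supervisory: robot plans its own path
    VIRTUAL_JOYSTICK = auto()       # guarded teleoperation from forearm pose
    MIMICKED_MANIPULATION = auto()  # high-DoF direct mapping of hand pose

def dispatch(gesture, forearm_orientation, hand_pose):
    """Map a decoded gesture plus continuous IMU/EMG state to a command."""
    if gesture == "point":
        return Mode.POINT_TO_GOAL, forearm_orientation     # goal direction
    if gesture == "fist":
        return Mode.VIRTUAL_JOYSTICK, forearm_orientation  # velocity command
    return Mode.MIMICKED_MANIPULATION, hand_pose           # joint-level mapping
```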