DEVELOPMENT AND ASSESSMENT OF ADVANCED ASSISTIVE ROBOTIC MANIPULATORS USER INTERFACES
Assistive Robotic Manipulators (ARMs) have been shown to improve self-care and increase independence among people with severe upper extremity disabilities. Mounted on the side of an electric powered wheelchair, an ARM can provide manipulation assistance with activities such as picking up objects, eating, drinking, dressing, reaching out, or opening doors. However, existing assessment tools are inconsistent between studies, time consuming, and of unclear clinical effectiveness. Therefore, in this research, we developed an ADL task board evaluation tool that provides a standardized, efficient, and reliable assessment of ARM performance. Testing powered wheelchair users and able-bodied controls on two commercial ARM user interfaces – joystick and keypad – we found statistically significant differences in performance between the two interfaces, but no statistically significant difference in cognitive load. Performance on the ADL task board correlated highly with an existing functional assessment tool, the Wolf Motor Function Test. Through this study, we also identified barriers and limitations in current commercial user interfaces and developed smartphone and assistive sliding-autonomy user interfaces that yield improved performance. Testing of our smartphone manual interface revealed statistically faster performance, and the assistive sliding-autonomy interface helped seamlessly correct errors seen with autonomous functions.
The ADL task performance evaluation tool may help clinicians and researchers better assess ARM user interfaces and evaluate the efficacy of customized user interfaces intended to improve performance. The smartphone manual interface demonstrated improved performance, and the sliding-autonomy framework showed enhanced task success without recalculating path planning and recognition.
A virtual headstick for people with spinal cord injuries
This paper presents a virtual headstick system as an alternative to the conventional passive headstick for persons with limited upper extremity function. The system is composed of a pair of kinematically dissimilar master-slave robots, with the master robot operated by the user's head. At the remote site, the end-effector of the slave robot moves as if it were at the tip of an imaginary headstick attached to the user's head. A unique feature of this system is that, through force reflection, the virtual headstick provides the user with proprioceptive information as in a conventional headstick, but with an augmented workspace volume and additional mechanical power. This paper describes the test-bed development, system identification, bilateral control implementation, and system performance evaluation.