
Real-time Hybrid Locomotion Mode Recognition for Lower-limb Wearable Robots

Abstract

Real-time recognition of locomotion-related activities is a fundamental capability that the controller of lower-limb wearable robots should possess. Subject-specific training and reliance on electromyographic interfaces are the main limitations of existing approaches. This study presents a novel methodology for real-time locomotion mode recognition in lower-limb wearable robotics. A hybrid classifier distinguishes among seven locomotion-related activities. First, a time-based approach classifies between static and dynamic states based on gait kinematics data. Second, an event-based fuzzy logic method, triggered by foot pressure sensors, operates in a subject-independent fashion on a minimal set of relevant biomechanical features to classify among dynamic modes. The locomotion mode recognition algorithm is implemented on the controller of a portable powered orthosis for hip assistance. An experimental protocol is designed to evaluate the controller performance in an out-of-lab scenario without the need for subject-specific training. Experiments are conducted on six healthy volunteers performing locomotion-related activities at slow, normal, and fast speeds under both the zero-torque and assistive modes of the orthosis. The overall accuracy rate of the controller is 99.4% over more than 10,000 steps, including seamless transitions between different modes. The experimental results show a successful subject-independent performance of the controller for wearable robots assisting locomotion-related activities.
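
The sketch below illustrates the two-stage structure described in the abstract: a time-based check that separates static from dynamic states, followed by an event-based fuzzy-logic decision among dynamic modes at foot-contact events. It is a minimal illustration only; the sensor inputs, features, thresholds, and membership parameters shown here are assumptions and do not reproduce the classifier, features, or activity set used in the paper.

```python
# Minimal sketch of a two-stage (hybrid) locomotion mode recognizer.
# All feature names, thresholds, and membership parameters are hypothetical.
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class GaitSample:
    hip_angle_rate: float   # rad/s, from joint encoders (assumed input)
    heel_pressure: float    # normalized [0, 1] foot pressure (assumed input)
    toe_pressure: float     # normalized [0, 1] foot pressure (assumed input)


def is_dynamic(window: List[GaitSample], rate_threshold: float = 0.2) -> bool:
    """Stage 1 (time-based): label the state as dynamic if the mean joint
    velocity over a sliding window exceeds a threshold; otherwise static.
    The threshold value is illustrative only."""
    mean_rate = sum(abs(s.hip_angle_rate) for s in window) / len(window)
    return mean_rate > rate_threshold


def triangular(x: float, a: float, b: float, c: float) -> float:
    """Triangular fuzzy membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)


def classify_dynamic_mode(step_features: Dict[str, float]) -> str:
    """Stage 2 (event-based fuzzy logic): at each foot-contact event, fuzzify a
    small set of biomechanical features and return the mode with the highest
    activation. The feature and the membership parameters are placeholders."""
    rom = step_features["hip_range_of_motion"]  # assumed feature, in degrees
    scores = {
        "level_walking": triangular(rom, 20.0, 35.0, 50.0),
        "stair_ascent":  triangular(rom, 40.0, 60.0, 80.0),
        "stair_descent": triangular(rom, 10.0, 25.0, 40.0),
    }
    return max(scores, key=scores.get)
```

In a controller loop, stage 1 would run continuously on the kinematic stream, while stage 2 would fire only when the foot pressure sensors signal a gait event, which keeps the mode decision synchronized with the gait cycle rather than with an arbitrary sampling window.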
