Low-Cost Objective Measurement of Prehension Skills

Abstract

This thesis explores the feasibility of using low-cost, portable motion capture tools for the quantitative assessment of sequential 'reach-to-grasp' and repetitive 'finger-tapping' movements in neurologically intact and impaired populations, in both clinical and non-clinical settings. The research extends an existing optoelectronic postural sway assessment tool (PSAT) into a more general Boxed Infrared Gross Kinematic Assessment Tool (BIGKAT) for evaluating prehensile control of hand movements outside the laboratory environment. BIGKAT was first validated against a high-end motion capture system (Optotrak) for accuracy and precision in tracking kinematic data. It was then applied to kinematically resolve prehensile movements, where concurrent recordings with Optotrak produced similar statistically significant results for five kinematic measures: two spatial measures (Maximum Grip Aperture, MGA; Peak Velocity, PV) and three temporal measures (Movement Time, MT; Time to MGA, TMGA; Time to PV, TPV). Regression analysis further established a strong relationship between BIGKAT and Optotrak, with a slope near unity and a low y-intercept, confirming BIGKAT's reliable performance and its ability to reproduce the statistically significant results obtained with Optotrak. BIGKAT was also applied to quantitatively assess bradykinesia during finger-tapping movements in Parkinson's disease (PD), demonstrating significant differences between PD patients and healthy controls on key kinematic measures and paving the way for potential clinical applications.
The thesis further characterized kinematic differences in prehensile control under different sensory environments using a virtual reality head-mounted display and a finger tracking system (the Leap Motion), emphasizing the importance of sensory information during hand movements and highlighting the role of hand vision and haptic feedback during the initial and final phases of the prehensile movement trajectory. Marker-less pose estimation for reach-to-grasp tracking was also explored using deep learning tools, specifically DeepLabCut (DLC). Despite the limitations COVID-19 placed on data collection, the approach showed promise in tracking reaching and grasping components but highlighted the need for more diverse datasets to resolve kinematic differences accurately. Finally, to facilitate the assessment of prehension activities, an Event Detection Tool (EDT) was developed, providing temporal measures of reaction time, reaching time, transport time, and movement time during object grasping and manipulation. Although initial pilot data were limited, the EDT holds potential for insights into disease progression and movement disorder severity. Overall, this work advances low-cost, portable solutions for quantitatively assessing upper-limb movements, demonstrating their potential for wider clinical use and guiding future research in human movement analysis.