Objective Assessment of the Finger Tapping Task in Parkinson's Disease and Control Subjects using Azure Kinect and Machine Learning

Abstract

Parkinson's disease (PD) is characterised by a progressive deterioration of motor function. In particular, limited hand dexterity strongly correlates with PD diagnosis and staging. Objective detection of alterations in hand motor skills would allow, for example, prompt identification of the disease and its symptoms, and the definition of adequate medical treatments. Among the clinical assessment tasks used to diagnose and stage PD from hand impairment, the Finger Tapping (FT) task is a well-established tool. This preliminary study exploits a single RGB-Depth camera (Azure Kinect) and Google MediaPipe Hands to track and assess the Finger Tapping task. The system comprises several stages. First, hand movements are tracked from FT video recordings and used to extract a series of clinically relevant features. The most significant features are then selected and used to train and test several Machine Learning (ML) models that distinguish subjects with PD from healthy controls. To evaluate the proposed system, 35 PD subjects and 60 healthy volunteers were recruited. The best-performing ML model achieved 94.4% accuracy and a 98.4% F1 score in Leave-One-Subject-Out validation. Moreover, distinct clusters with respect to spatial and temporal variability in the FT trials were identified among PD subjects. This result suggests the possibility of exploiting the proposed system to perform an even finer identification of subgroups within the PD population.
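The Leave-One-Subject-Out protocol mentioned above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the classifier choice (a random forest), the synthetic per-trial features, and the subject/label layout are all assumptions introduced for the example. The key point is that folds are grouped by subject ID, so no subject's trials appear in both the training and test sets.

```python
# Hypothetical sketch of Leave-One-Subject-Out (LOSO) validation for a
# PD-vs-control classifier trained on per-trial Finger Tapping features.
# All data below is synthetic; the real study used features extracted from
# Azure Kinect / MediaPipe Hands recordings.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)
n_subjects, trials_per_subject, n_features = 20, 5, 8
n = n_subjects * trials_per_subject

X = rng.normal(size=(n, n_features))                # tapping features (stand-ins)
groups = np.repeat(np.arange(n_subjects), trials_per_subject)  # subject IDs
y = (groups % 2).astype(int)                        # synthetic PD / control labels
X[y == 1] += 1.0                                    # shift one class so the task is learnable

logo = LeaveOneGroupOut()                           # one fold per held-out subject
clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=logo, groups=groups)
print(f"LOSO mean accuracy: {scores.mean():.3f}")
```

Grouping by subject rather than by trial is what makes the reported accuracy an estimate of generalisation to unseen individuals, which is the clinically relevant setting.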
