
    A Comprehensive Comparison of Human Activity Recognition using Inertial Sensors

Wearables are becoming increasingly popular. Their built-in sensors (e.g., GPS, accelerometer, gyroscope, light) can provide useful data for Human Activity Recognition (HAR). Over the past few years, many HAR models have been introduced with varying accuracy and performance, and they have been applied in areas such as health, fitness tracking, entertainment, and advertising. Because these HAR models run on wearables, which are resource-constrained, factors such as inadequate preprocessing can degrade overall HAR performance. While high accuracy is essential in some applications, the device's battery life is critical to the end-user. Prior work offers a plethora of activity recognition models and pre-processing techniques that report very high recognition performance. However, these results are mostly obtained under study setups that differ from one another, making a fair comparison among them nearly impossible, and to date very few studies have conducted a side-by-side performance analysis in HAR. Therefore, in this dissertation, we investigate some of the most widely used HAR techniques to understand their impact when developing an end-to-end HAR model that recognizes gym exercises (e.g., "treadmill," "bicep-curl," "Russian-twist"). This study allows us to examine the classification accuracy obtained from five state-of-the-art feature sets in HAR models. Additionally, we focus on feature selection methods and experiment with data reduction to understand the trade-offs between accuracy and data size. We find that histogram bins are a valid alternative feature set in HAR, with a significant positive impact on classification performance and classifier learning rate.
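As a rough illustration of the histogram-bin idea, the sketch below converts one window of tri-axial accelerometer samples into a flat vector of normalized per-axis bin counts. The bin count, value range, and window size are assumptions for illustration, not the dissertation's actual configuration.

```python
import numpy as np

def histogram_features(window, n_bins=10, value_range=(-2.0, 2.0)):
    """Turn one window of accelerometer samples, shaped (n_samples, 3 axes),
    into a flat feature vector of per-axis histogram bin counts, with each
    axis's bins normalised to sum to 1."""
    feats = []
    for axis in range(window.shape[1]):
        counts, _ = np.histogram(window[:, axis], bins=n_bins, range=value_range)
        feats.append(counts / max(counts.sum(), 1))  # normalise per axis
    return np.concatenate(feats)  # shape: (n_axes * n_bins,)

# Example: a hypothetical 2-second window at 50 Hz (100 samples, 3 axes)
rng = np.random.default_rng(0)
window = rng.normal(0.0, 0.5, size=(100, 3))
fv = histogram_features(window)
print(fv.shape)  # (30,)
```

Compared with heavier spectral features, these bin counts are cheap to compute on-device, which is one reason they are attractive on resource-constrained wearables.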
Moreover, our findings show that data reduction techniques in the feature selection phase can decrease the data size by 93% (from 119 features to 8 features) with minimal impact on model performance, resulting in large computational savings for the model.
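The 119-to-8 reduction can be mimicked with a simple filter-style selector. The dissertation does not specify its selection method here, so the Fisher-like score below (between-class mean gap over pooled standard deviation) is purely an illustrative stand-in, shown for a two-class case.

```python
import numpy as np

def select_top_k(X, y, k=8):
    """Filter-style feature selection: score each feature column by the
    absolute difference of per-class means divided by the pooled standard
    deviation (a two-class Fisher-like score), then keep the k highest."""
    classes = np.unique(y)
    assert len(classes) == 2, "illustrative two-class version"
    a, b = X[y == classes[0]], X[y == classes[1]]
    pooled_std = np.sqrt((a.var(axis=0) + b.var(axis=0)) / 2) + 1e-12
    scores = np.abs(a.mean(axis=0) - b.mean(axis=0)) / pooled_std
    keep = np.sort(np.argsort(scores)[::-1][:k])
    return keep, X[:, keep]

# Synthetic example mirroring the reported reduction: 119 features -> 8
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 119))
y = rng.integers(0, 2, size=200)
X[y == 1, :4] += 2.0            # make the first 4 features informative
idx, X_small = select_top_k(X, y, k=8)
print(X_small.shape)  # (200, 8)
```

Dropping columns this way shrinks every downstream window's feature vector, which is where the computational savings for on-device inference come from.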