A Lightweight Model LGCSPNet for Sitting Posture Risk Management Applications

Abstract

Current methods for sitting posture recognition typically follow a pipeline of keypoint extraction and skeleton graph construction, followed by pose classification with Convolutional Neural Networks (CNNs) or Vision Transformers (ViTs). However, CNNs struggle to model long-range dependencies among keypoints, while ViTs incur high computational costs, and both tend to introduce redundancy during feature modeling. To improve efficiency, some studies classify keypoint coordinates directly, but these methods often fail to balance high accuracy with computational efficiency. To this end, this paper proposes LGCSPNet, a new model built on lightweight graph convolution (LGC) modules and a contrastive learning module. First, the LGC module enables efficient communication among all keypoints by shifting features across keypoint channels, giving each keypoint access to global context at minimal computational cost. Building on this, it computes 3D attention weights via a parameter-free energy function with a closed-form solution, strengthening feature learning for posturally significant keypoints. The contrastive learning module improves the separation of visually similar postures from different categories by strategically selecting feature samples. Experiments on public human posture datasets and our custom sitting posture dataset show that LGCSPNet achieves a 99% recognition rate with only 0.097M parameters, surpassing existing models in both parameter count and accuracy. Guided by ergonomic metrics, the model supports posture correction and helps mitigate injuries caused by prolonged sitting.
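The abstract names two parameter-free mechanisms: feature shifting across keypoint channels, and 3D attention from an energy function with a closed-form solution. The paper's exact formulations are not given here, so the following is only an illustrative NumPy sketch under stated assumptions: the shift follows the Shift-GCN-style convention of rolling each channel by a channel-dependent keypoint offset, and the attention follows the common SimAM-style closed form (inverse energy gated through a sigmoid); the function names, the `(C, K)` feature layout, and the regularizer `lam` are hypothetical.

```python
import numpy as np

def keypoint_channel_shift(x):
    """Shift each channel's features across keypoints (Shift-GCN-style assumption),
    so every keypoint receives context from others at zero parameter cost.
    x: (C, K) array -- C feature channels, K keypoints (illustrative layout)."""
    C, K = x.shape
    out = np.empty_like(x)
    for c in range(C):
        # Channel c is rolled by (c mod K) keypoint positions.
        out[c] = np.roll(x[c], shift=c % K)
    return out

def energy_attention(x, lam=1e-4):
    """Parameter-free attention from a closed-form energy function
    (SimAM-style assumption; not necessarily the paper's exact energy).
    Distinctive keypoint features get higher weights.
    x: (C, K) array; returns x reweighted elementwise."""
    C, K = x.shape
    mu = x.mean(axis=1, keepdims=True)          # per-channel mean over keypoints
    d = (x - mu) ** 2                           # squared deviation
    v = d.sum(axis=1, keepdims=True) / (K - 1)  # per-channel variance estimate
    e_inv = d / (4.0 * (v + lam)) + 0.5         # closed-form inverse energy
    return x * (1.0 / (1.0 + np.exp(-e_inv)))   # sigmoid gating
```

In this sketch both operations add no learnable parameters, which is consistent with the abstract's claim that global keypoint context and attention come at minimal computational cost.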

Last updated on 30/06/2025.

This paper was published in Heriot Watt Pure.


Licence: http://creativecommons.org/licenses/by-nc-nd/4.0/