Identifying Unsafe Behavior of Construction Workers: A Dynamic Approach Combining Skeleton Information and Spatiotemporal Features

Abstract

Data Availability Statement: Some or all data, models, or code that support the findings of this study are available from the corresponding author upon reasonable request.

Vision-based methods for action recognition are valuable for supervising construction workers’ unsafe behaviors. However, current monitoring methods have limitations in extracting dynamic information about workers, and identifying hazardous actions based on the spatiotemporal relationships between workers’ skeletal points remains a significant challenge on construction sites. This paper proposes an automated method for recognizing dynamic hazardous actions. The method uses the OpenPose network to extract workers’ skeleton information from video and applies a spatiotemporal graph convolutional network (ST-GCN) to analyze the dynamic spatiotemporal relationships between workers’ body skeletons, enabling automatic recognition of hazardous actions. A novel human partitioning strategy and a nonlocal attention mechanism are designed to assign appropriate weights to the different joints involved in an action, thereby improving the recognition accuracy of complex construction actions. The enhanced model is called the attention module spatiotemporal graph convolutional network (AM-STGCN). The method achieved test accuracies of 90.50% and 87.08% in typical work scenarios, namely high-altitude scaffolding scenes captured in close-up and far views, surpassing the performance of the original ST-GCN model. These results demonstrate that the model can accurately identify workers’ hazardous actions, and the proposed model shows promise for on-site monitoring of hazardous actions in broader construction scenarios.

Funding: National Natural Science Foundation of China (Grant No. 72071097); MOE (Ministry of Education in China) Project of Humanities and Social Sciences (Grant No. 20YJAZH034); Foundation of Jiangsu University (Grant No. SZCY-014)
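The pipeline summarized above (OpenPose skeleton sequences, spatiotemporal graph convolution over the joint graph, and a nonlocal attention block that reweights joint-frame positions) can be sketched as follows. This is a minimal, illustrative PyTorch sketch, not the authors’ released code: the layer sizes, the two-block depth, the binary safe/hazardous class count, and the adjacency construction over the 18-joint OpenPose COCO skeleton are all assumptions made for the example.

```python
# Illustrative sketch of an ST-GCN pipeline with a nonlocal attention block.
# All names, sizes, and the adjacency construction are assumptions, not the
# authors' implementation of AM-STGCN.

import torch
import torch.nn as nn

NUM_JOINTS = 18   # OpenPose COCO skeleton
NUM_CLASSES = 2   # e.g., safe vs. hazardous action (assumption)

def normalized_adjacency(edges, n=NUM_JOINTS):
    """Symmetrically normalized adjacency D^-1/2 (A + I) D^-1/2 over joints."""
    A = torch.eye(n)
    for i, j in edges:
        A[i, j] = A[j, i] = 1.0
    d_inv_sqrt = A.sum(dim=1).pow(-0.5)
    return d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

class STGCNBlock(nn.Module):
    """Spatial graph convolution over joints, then temporal conv over frames."""
    def __init__(self, c_in, c_out, A):
        super().__init__()
        self.register_buffer("A", A)
        self.spatial = nn.Conv2d(c_in, c_out, kernel_size=1)
        self.temporal = nn.Conv2d(c_out, c_out, kernel_size=(9, 1), padding=(4, 0))
        self.relu = nn.ReLU()

    def forward(self, x):                                  # x: (N, C, T, V)
        x = self.spatial(x)                                # mix channels per joint
        x = torch.einsum("nctv,vw->nctw", x, self.A)       # aggregate neighbors
        return self.relu(self.temporal(x))                 # convolve over time

class NonLocalAttention(nn.Module):
    """Nonlocal (self-attention) block over all joint-frame positions."""
    def __init__(self, c):
        super().__init__()
        self.theta = nn.Conv2d(c, c // 2, 1)
        self.phi = nn.Conv2d(c, c // 2, 1)
        self.g = nn.Conv2d(c, c // 2, 1)
        self.out = nn.Conv2d(c // 2, c, 1)

    def forward(self, x):                                  # x: (N, C, T, V)
        n, c, t, v = x.shape
        q = self.theta(x).flatten(2)                       # (N, C/2, T*V)
        k = self.phi(x).flatten(2)
        val = self.g(x).flatten(2)
        attn = torch.softmax(q.transpose(1, 2) @ k, dim=-1)      # (N, TV, TV)
        y = (val @ attn.transpose(1, 2)).reshape(n, c // 2, t, v)
        return x + self.out(y)                             # residual connection

class AMSTGCNSketch(nn.Module):
    def __init__(self, A):
        super().__init__()
        self.block1 = STGCNBlock(3, 64, A)                 # input: (x, y, confidence)
        self.attn = NonLocalAttention(64)
        self.block2 = STGCNBlock(64, 128, A)
        self.head = nn.Linear(128, NUM_CLASSES)

    def forward(self, x):                                  # x: (N, 3, T, 18)
        x = self.block2(self.attn(self.block1(x)))
        return self.head(x.mean(dim=(2, 3)))               # global average pool

# Usage: a batch of 4 clips, 30 frames each, over the OpenPose COCO-18 skeleton.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (1, 5), (5, 6), (6, 7),
         (1, 8), (8, 9), (9, 10), (1, 11), (11, 12), (12, 13),
         (0, 14), (14, 16), (0, 15), (15, 17)]
model = AMSTGCNSketch(normalized_adjacency(edges))
logits = model(torch.randn(4, 3, 30, NUM_JOINTS))
print(logits.shape)  # torch.Size([4, 2])
```

The nonlocal block computes attention across every joint-frame position, which is one way a model can assign larger weights to the joints that drive a given action; the paper’s actual partitioning strategy and attention design may differ from this sketch.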