KD-LSRED: knowledge distillation for lightweight symbol recognition in engineering diagrams.

Abstract

Engineering diagrams (EDs) provide a rich source of information and play a critical role across various industries. However, the inherent complexity of EDs complicates automatic analysis and processing. These diagrams often contain 100 to 200 visually similar symbols, leading to challenges such as inter-class similarity, overlapping symbols, and substantial background noise. Although recent deep learning-based architectures have shown promising performance in recognising these symbols, they are heavy and computationally expensive, restricting their deployment on resource-constrained devices. Thus, we propose a lightweight knowledge distillation framework for EDs. The framework integrates feature-based and output-level distillation, enabling a lightweight student model to learn from a more complex teacher model. Feature-based distillation is enhanced through a Channel-Wise Distillation (CWD) loss, improving spatial and contextual representation while reducing computational complexity. The output-level distillation employs Kullback-Leibler (KL) divergence to closely align the student model's predictions with the teacher's probability distributions. Extensive experiments on private and public datasets demonstrate that our approach achieves a 6.9% improvement in mean average precision (mAP), reduces model size by 74.5%, and decreases computational cost by 82.7%. The proposed method offers significant potential for real-time industrial applications on edge devices and sets the foundation for further advancements in ED analysis.
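To make the two distillation terms concrete, the following is a minimal PyTorch sketch of how a channel-wise feature distillation loss and a KL-divergence output distillation loss can be combined; the function names, temperatures, and weighting factors (alpha, beta) are illustrative assumptions, not the paper's exact implementation.

```python
import torch
import torch.nn.functional as F

def cwd_loss(student_feat, teacher_feat, tau=4.0):
    """Channel-Wise Distillation: each channel's activation map is softened
    into a spatial probability distribution with temperature tau, and the
    student's distribution is aligned to the teacher's via KL divergence."""
    n, c, h, w = student_feat.shape
    s = student_feat.view(n, c, -1)            # (N, C, H*W)
    t = teacher_feat.view(n, c, -1)
    log_p_s = F.log_softmax(s / tau, dim=-1)   # student: log-probabilities over pixels
    p_t = F.softmax(t / tau, dim=-1)           # teacher: probabilities over pixels
    # KL(teacher || student), summed over channels and pixels,
    # averaged over the batch, scaled by tau^2
    return F.kl_div(log_p_s, p_t, reduction="batchmean") * (tau ** 2)

def output_kd_loss(student_logits, teacher_logits, tau=2.0):
    """Output-level distillation: align class-probability distributions with KL."""
    log_p_s = F.log_softmax(student_logits / tau, dim=-1)
    p_t = F.softmax(teacher_logits / tau, dim=-1)
    return F.kl_div(log_p_s, p_t, reduction="batchmean") * (tau ** 2)

def total_kd_loss(student_feat, teacher_feat, student_logits, teacher_logits,
                  task_loss, alpha=1.0, beta=1.0):
    """Combine the student's detection task loss with feature-level (CWD)
    and output-level (KL) distillation terms."""
    return (task_loss
            + alpha * cwd_loss(student_feat, teacher_feat)
            + beta * output_kd_loss(student_logits, teacher_logits))
```

In a training loop, the teacher's feature maps and logits would be computed under `torch.no_grad()` and only the student's parameters updated, so the extra cost at inference time is zero.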
