The extent of tool wear significantly affects blanking processes and has a
decisive impact on product quality and productivity. For this reason, numerous
researchers have focused their work on wear monitoring systems in order to
identify or even predict critical wear at an early stage. Existing approaches
are mainly based on indirect monitoring using time series, which are used to
detect critical wear states via thresholds or machine learning models.
Nevertheless, differentiating between the types of wear phenomena affecting the
tool during blanking, as well as quantifying worn surfaces, remains limited in
practice. While time series data provides partial insights into wear
occurrence and evolution, direct monitoring techniques utilizing image data
offer a more comprehensive perspective and increased robustness when dealing
with varying process parameters. However, acquiring and processing this data in
real time is challenging. In particular, high dynamics combined with increasing
stroke rates, as well as the high dimensionality of image data, have so far
prevented the development of direct image-based monitoring systems. For this
reason, this paper demonstrates how high-resolution images of tools at 600
strokes per minute (spm) can be captured and subsequently processed using
semantic segmentation deep learning algorithms, more precisely Fully
Convolutional Networks (FCNs). A total of 125,000
images of the tool are taken from successive strokes, and microscope images are
captured to investigate the worn surfaces. Based on findings from the
microscope images, selected images are labeled pixel by pixel according to
their wear condition and used to train an FCN (U-Net).
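
To make the described training step concrete, the following minimal sketch (not the authors' implementation) illustrates how pixel-wise wear labels can be used to train a small U-Net-style FCN with PyTorch; the number of wear classes, the single-channel input, and the image size are illustrative assumptions.

import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # Two 3x3 convolutions with ReLU, as in the original U-Net building block
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
    )

class SmallUNet(nn.Module):
    # Reduced U-Net: one encoder/decoder level plus a bottleneck
    def __init__(self, num_classes=3):  # e.g. background plus two wear types (assumed)
        super().__init__()
        self.enc1 = conv_block(1, 32)            # grayscale tool images assumed
        self.pool = nn.MaxPool2d(2)
        self.bottleneck = conv_block(32, 64)
        self.up = nn.ConvTranspose2d(64, 32, kernel_size=2, stride=2)
        self.dec1 = conv_block(64, 32)           # 64 = upsampled 32 + skip 32
        self.head = nn.Conv2d(32, num_classes, kernel_size=1)

    def forward(self, x):
        e1 = self.enc1(x)
        b = self.bottleneck(self.pool(e1))
        d1 = self.dec1(torch.cat([self.up(b), e1], dim=1))  # skip connection
        return self.head(d1)                     # per-pixel class logits

# One illustrative training step on a dummy batch of pixel-labeled images
model = SmallUNet(num_classes=3)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()                # per-pixel classification loss

images = torch.randn(4, 1, 256, 256)             # stand-in for cropped tool images
masks = torch.randint(0, 3, (4, 256, 256))       # stand-in for pixel-wise wear labels

optimizer.zero_grad()
logits = model(images)
loss = criterion(logits, masks)
loss.backward()
optimizer.step()

In practice, the labeled microscope-informed images would replace the dummy tensors, and the trained network's per-pixel predictions could then be used to quantify the worn area per wear class.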