The increasing integration of automation and artificial intelligence (AI) in non-destructive testing (NDT) is changing not only the inspection processes themselves but also the way decisions are made. While technical systems can reduce the likelihood of errors and support data processing, ultimate responsibility remains with the human. This paper examines the role of intuition in decision-making and analyses typical errors of judgment using prospect theory and insights from cognitive psychology. It also shows how well-informed decisions can be supported in AI-supported NDT processes: through training, explainable systems, user-centred design, suitable metrics, and a targeted distribution of tasks between humans and technology. Rather than replacing human intuition, AI systems should be designed to complement it. To engage effectively with such systems, inspectors require not only technical expertise but also competencies in risk assessment, probabilistic reasoning, and critical reflection on both their own judgments and the outputs provided by AI.
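As an illustrative sketch only (not taken from the paper), the judgment distortions that prospect theory describes can be made concrete with the parameter estimates reported by Tversky and Kahneman (1992); the inspection outcomes and probabilities used below are hypothetical values chosen for illustration.

    # Illustrative sketch only: prospect theory's value and probability-weighting
    # functions with the parameter estimates reported by Tversky & Kahneman (1992).
    # The inspection scenario numbers are hypothetical, chosen for illustration.

    def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
        """Subjective value of an outcome x relative to a reference point.
        Losses are amplified by the loss-aversion factor lam."""
        if x >= 0:
            return x ** alpha
        return -lam * ((-x) ** beta)

    def probability_weight(p, gamma=0.61):
        """Decision weight for probability p: small probabilities are
        overweighted, moderate and large ones underweighted."""
        return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

    if __name__ == "__main__":
        # Loss aversion: an equal-sized loss is felt roughly 2.25 times
        # as strongly as the corresponding gain.
        print(prospect_value(1.0))    # ~1.00
        print(prospect_value(-1.0))   # ~-2.25

        # Probability weighting: a rare defect probability of 1 % receives a
        # decision weight of roughly 5.5 %, i.e. rare events are overweighted.
        print(probability_weight(0.01))  # ~0.055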