Dermatologist-like explainable AI enhances trust and confidence in diagnosing melanoma
Although artificial intelligence (AI) systems have been shown to improve the
accuracy of initial melanoma diagnosis, the lack of transparency in how these
systems identify melanoma poses severe obstacles to user acceptance.
Explainable artificial intelligence (XAI) methods can help to increase
transparency, but most XAI methods are unable to produce precisely located
domain-specific explanations, making the explanations difficult to interpret.
Moreover, the impact of XAI methods on dermatologists has not yet been
evaluated. Building on two existing classifiers, we developed an XAI system
that produces text- and region-based explanations, easily interpretable by
dermatologists, alongside its differential diagnoses of melanomas and nevi.
To evaluate this system, we conducted a three-part reader study to assess its
impact on clinicians' diagnostic accuracy, confidence, and trust in the
XAI support. We showed that our XAI's explanations were highly aligned with
clinicians' explanations and that both the clinicians' trust in the support
system and their confidence in their diagnoses were significantly increased
when using our XAI compared to using a conventional AI system. The clinicians'
diagnostic accuracy was numerically, albeit not significantly, increased. This
work demonstrates that clinicians are willing to adopt such an XAI system,
motivating its future use in the clinic.
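The abstract describes a system that pairs region-level localisation with dermatologist-style text labels. A minimal sketch of that idea in NumPy, assuming a precomputed saliency map per detected characteristic; the criteria names and threshold are illustrative assumptions, not the paper's actual method:

```python
import numpy as np

# Hypothetical dermoscopic criteria for illustration only; the paper's
# characteristic set is not reproduced here.
CRITERIA = {0: "atypical pigment network", 1: "irregular streaks"}

def explain(saliency: np.ndarray, criterion_id: int, threshold: float = 0.5):
    """Turn a saliency map into a (text, region-mask) explanation pair.

    The mask keeps pixels whose saliency is at least `threshold` times
    the map's maximum; the text reports which criterion was localised
    and how much of the lesion area it covers.
    """
    mask = saliency >= threshold * saliency.max()
    coverage = mask.mean()
    text = f"{CRITERIA[criterion_id]} in {coverage:.0%} of the lesion area"
    return text, mask

# Toy saliency map concentrated in the top-left quadrant of a 4x4 lesion crop
sal = np.zeros((4, 4))
sal[:2, :2] = 1.0
text, mask = explain(sal, 0)
print(text)  # -> "atypical pigment network in 25% of the lesion area"
```

The point of pairing the two outputs is that the region mask shows *where* the model looked while the text says *what* it claims to see there, which is what makes the explanation checkable against a dermatologist's own reasoning.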
tchanda90/Derma-XAI: Release
Full Changelog: https://github.com/tchanda90/Derma-XAI/commits/releas
Dermatologist-like explainable AI enhances trust and confidence in diagnosing melanoma
Abstract Artificial intelligence (AI) systems have been shown to help dermatologists diagnose melanoma more accurately; however, they lack transparency, hindering user acceptance. Explainable AI (XAI) methods can help to increase transparency, yet often lack precise, domain-specific explanations. Moreover, the impact of XAI methods on dermatologists’ decisions has not yet been evaluated. Building upon previous research, we introduce an XAI system that provides precise and domain-specific explanations alongside its differential diagnoses of melanomas and nevi. Through a three-phase study, we assess its impact on dermatologists’ diagnostic accuracy, diagnostic confidence, and trust in the XAI support. Our results show strong alignment between XAI and dermatologist explanations. We also show that dermatologists’ confidence in their diagnoses and their trust in the support system significantly increase with XAI compared to conventional AI. This study highlights dermatologists’ willingness to adopt such XAI systems, promoting their future use in the clinic.