2020
Journal article · Open Access

Evaluating local explanation methods on ground truth

Guidotti R.

Open the black box; Explainable AI; Interpretable models; Local explanation; Artificial Intelligence; Evaluating explanations; Linguistics and Language; Language and Linguistics

Evaluating local explanation methods is a difficult task due to the lack of a shared and universally accepted definition of explanation. In the literature, one of the most common ways to assess the performance of an explanation method is to measure the fidelity of the explanation with respect to the classification of a black box model adopted by an Artificial Intelligence system for making a decision. However, this kind of evaluation only measures the degree to which the local explainer reproduces the behavior of the black box classifier with respect to the final decision. Therefore, the explanation provided by the local explainer may differ in content even though it leads to the same decision as the AI system. In this paper, we propose an approach that makes it possible to measure the extent to which the explanations returned by local explanation methods are correct with respect to a synthetic ground truth explanation. Indeed, the proposed methodology enables the generation of synthetic transparent classifiers for which the reason for the decision taken, i.e., a synthetic ground truth explanation, is available by design. Experimental results show how the proposed approach makes it easy to evaluate local explanations against the ground truth and to characterize the quality of local explanation methods. (c) 2020 The Author. Published by Elsevier B.V. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
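The core idea can be illustrated with a minimal sketch (not the paper's actual generators or metrics): build a transparent classifier whose decision depends only on a known subset of features, so the ground truth explanation is available by design, then score a local explainer by comparing the feature set it returns against that ground truth. The `toy_local_explainer` here is a hypothetical, idealized explainer used purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic transparent classifier: a sparse linear model whose decision
# depends only on a known subset of features.  By construction, the
# ground truth explanation is exactly that set of relevant features.
n_features = 10
true_relevant = {0, 3, 7}                      # ground truth explanation
weights = np.zeros(n_features)
for i in true_relevant:
    weights[i] = rng.uniform(1.0, 2.0)         # only these features matter

def transparent_classifier(x):
    """Decision depends only on the known relevant features."""
    return int(weights @ x > 0)

def toy_local_explainer(x, n_top=3):
    """Hypothetical explainer: ranks features by |weight_i * x_i|.
    A real local explainer (e.g., one treating the model as a black
    box) would estimate these importances instead."""
    scores = np.abs(weights * x)
    return set(int(i) for i in np.argsort(scores)[-n_top:])

# Evaluate one local explanation against the synthetic ground truth:
# precision/recall of the returned feature set w.r.t. the true one.
x = rng.normal(size=n_features)
explained = toy_local_explainer(x)
precision = len(explained & true_relevant) / len(explained)
recall = len(explained & true_relevant) / len(true_relevant)
print(precision, recall)
```

Averaging such precision/recall scores over many instances (and many generated transparent classifiers) gives a quality profile of an explanation method that fidelity to the black box decision alone cannot provide.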

Source: ARTIFICIAL INTELLIGENCE, vol. 291


BibTeX entry
@article{oai:it.cnr:prodotti:445656,
	title = {Evaluating local explanation methods on ground truth},
	author = {Guidotti R.},
	journal = {Artificial Intelligence},
	volume = {291},
	doi = {10.1016/j.artint.2020.103428},
	year = {2020}
}

AI4EU
A European AI On Demand Platform and Ecosystem

TAILOR
Foundations of Trustworthy AI - Integrating Reasoning, Learning and Optimization

HumanE-AI-Net
HumanE AI Network

XAI
Science and technology for the explanation of AI decision making

SoBigData-PlusPlus
SoBigData++: European Integrated Infrastructure for Social Mining and Big Data Analytics

