2018
Report (Open Access)

Local rule-based explanations of black box decision systems

Guidotti R., Monreale A., Ruggieri S., Pedreschi D., Turini F., Giannotti F.

Keywords: Explanation, Decision Systems, Rules

The recent years have witnessed the rise of accurate but obscure decision systems which hide the logic of their internal decision processes from their users. The lack of explanations for the decisions of black box systems is a key ethical issue, and a limitation on the adoption of machine learning components in socially sensitive and safety-critical contexts. Therefore, we need explanations that reveal the reasons why a predictor takes a certain decision. In this paper we focus on the problem of black box outcome explanation, i.e., explaining the reasons for the decision taken on a specific instance. We propose LORE, an agnostic method able to provide interpretable and faithful explanations. LORE first learns a local interpretable predictor on a synthetic neighborhood generated by a genetic algorithm. Then it derives from the logic of the local interpretable predictor a meaningful explanation consisting of: a decision rule, which explains the reasons for the decision; and a set of counterfactual rules, suggesting the changes to the instance's features that would lead to a different outcome. Extensive experiments show that LORE outperforms existing methods and baselines both in the quality of the explanations and in the accuracy of mimicking the black box.
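As a rough illustration of the outcome-explanation pipeline summarized above, the sketch below approximates it with scikit-learn: it labels a synthetic neighborhood of the instance with the black box, fits a shallow decision tree as the local interpretable predictor, and reads a decision rule and some counterfactual rules off the tree. The function names (perturb_neighborhood, decision_rule, explain_instance) are illustrative rather than the authors' API, and the Gaussian perturbation merely stands in for the genetic neighborhood generation used by LORE.

# Minimal sketch of a LORE-style local rule explanation (not the authors' code).
# The genetic neighborhood generator is replaced by Gaussian perturbation,
# and counterfactual rules are approximated by tree paths of nearby synthetic
# points that the local tree classifies differently.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier  # stands in for the black box
from sklearn.tree import DecisionTreeClassifier, _tree


def perturb_neighborhood(x, n_samples=1000, scale=0.3, rng=None):
    """Generate synthetic points around instance x (stand-in for the genetic step)."""
    rng = np.random.default_rng(rng)
    return x + rng.normal(0.0, scale, size=(n_samples, x.shape[0]))


def decision_rule(tree, x, feature_names):
    """Read off the root-to-leaf path the local tree follows for x."""
    t = tree.tree_
    node, premises = 0, []
    while t.feature[node] != _tree.TREE_UNDEFINED:
        f, thr = t.feature[node], t.threshold[node]
        if x[f] <= thr:
            premises.append(f"{feature_names[f]} <= {thr:.3f}")
            node = t.children_left[node]
        else:
            premises.append(f"{feature_names[f]} > {thr:.3f}")
            node = t.children_right[node]
    outcome = tree.classes_[np.argmax(t.value[node])]
    return premises, outcome


def explain_instance(black_box, x, feature_names, rng=0):
    """Fit a local interpretable tree on black-box labels and extract rules."""
    Z = perturb_neighborhood(x, rng=rng)
    y_z = black_box.predict(Z)                       # label the neighborhood with the black box
    local_tree = DecisionTreeClassifier(max_depth=4, random_state=rng).fit(Z, y_z)
    premises, outcome = decision_rule(local_tree, x, feature_names)
    # Crude counterfactuals: paths of neighborhood points the local tree labels differently.
    mask = local_tree.predict(Z) != outcome
    counterfactuals = [decision_rule(local_tree, z, feature_names)[0] for z in Z[mask][:3]]
    return {"rule": premises, "outcome": outcome, "counterfactual_rules": counterfactuals}


if __name__ == "__main__":
    X, y = make_classification(n_samples=500, n_features=4, random_state=0)
    black_box = RandomForestClassifier(random_state=0).fit(X, y)
    names = [f"f{i}" for i in range(X.shape[1])]
    explanation = explain_instance(black_box, X[0], names)
    print("decision rule:", explanation["rule"], "->", explanation["outcome"])
    print("counterfactual rules:", explanation["counterfactual_rules"])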

Source: ISTI Technical reports, 2018



BibTeX entry
@techreport{oai:it.cnr:prodotti:397169,
	title = {Local rule-based explanations of black box decision systems},
	author = {Guidotti R. and Monreale A. and Ruggieri S. and Pedreschi D. and Turini F. and Giannotti F.},
	institution = {ISTI Technical reports},
	year = {2018}
}
Also available from:
ISTI Repository (deposited version, Open Access)
arxiv.org (Open Access)
