SAIV 2024

Presentation
Chair: Yuval Shapira (time slot changed)

Generating Explanations for Decisions of Classifiers using Instance Abstractions

Chunxi Ji, Adnan Darwiche

on Tue, 17:30 in Main Room for 30 min

Abstract

The complete reason for a decision plays an important role in explaining decisions made by classifiers, as its prime implicants and prime implicates were shown to be two types of explanations that have been studied extensively in the literature. The first type explains why a decision was made and is known as a sufficient reason for the decision, also called an abductive explanation or a PI-explanation. The second type explains why some other decision was not made and is known as a necessary reason for the decision, also called a contrastive or counterfactual explanation. Recently, a generalization of the complete reason was discovered, resulting in a generalized notion called the general reason. Furthermore, some prime implicants and implicates of the general reason were shown to improve on the sufficient and necessary reasons, carrying more information about decisions and the underlying classifiers. In this paper, we summarize and present the main findings about the complete reason and the general reason, indicating why they can be considered abstractions of the instance and how they can be used to generate explanations for decisions.
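To make the two explanation types concrete, the following is a minimal brute-force sketch in Python on a hypothetical toy classifier f(a, b, c) = (a AND b) OR c and the instance a=1, b=1, c=0. The classifier, instance, and helper names are illustrative assumptions, not taken from the paper, and the enumeration below is not the paper's method (which works with the complete reason and the general reason rather than by exhaustive search).

# Brute-force illustration (assumed toy example, not the paper's algorithm).
from itertools import combinations, product

FEATURES = ["a", "b", "c"]

def f(v):
    # Toy classifier: positive decision iff (a AND b) OR c.
    return (v["a"] and v["b"]) or v["c"]

instance = {"a": 1, "b": 1, "c": 0}
decision = f(instance)

def entails(fixed):
    # A partial assignment is sufficient if every completion yields the same decision.
    free = [x for x in FEATURES if x not in fixed]
    return all(f(dict(fixed, **dict(zip(free, bits)))) == decision
               for bits in product([0, 1], repeat=len(free)))

def can_flip(subset):
    # A feature subset is contrastive if changing only those features can flip the decision.
    return any(f(dict(instance, **dict(zip(subset, bits)))) != decision
               for bits in product([0, 1], repeat=len(subset)))

def minimal(sets):
    # Keep only subset-minimal feature sets.
    return [s for s in sets if not any(set(t) < set(s) for t in sets)]

# Sufficient reasons: minimal subsets of the instance that force the decision.
sufficient = minimal([c for r in range(len(FEATURES) + 1)
                      for c in combinations(FEATURES, r)
                      if entails({x: instance[x] for x in c})])

# Necessary (contrastive) reasons: minimal subsets whose change can flip the decision.
necessary = minimal([c for r in range(1, len(FEATURES) + 1)
                     for c in combinations(FEATURES, r)
                     if can_flip(c)])

print("sufficient reasons:", sufficient)
print("necessary reasons:", necessary)

Running this prints [('a', 'b')] as the only sufficient reason (fixing a=1 and b=1 already guarantees the positive decision) and [('a',), ('b',)] as the necessary reasons (flipping either a or b, with the rest of the instance unchanged, can change the decision).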
