Maximum Entropy Baseline for Integrated Gradients
Integrated Gradients (IG), one of the most popular explainability methods available, remains ambiguous in its selection of baseline, which may seriously impair the credibility of the explanations. This study proposes a new uniform baseline, the Maximum Entropy Baseline, which is consistent with the “uninformative” property of baselines defined in IG. In addition, we propose an improved ablating evaluation approach incorporating the new baseline, in which information conservativeness is maintained. We explain the linear transformation invariance of IG baselines from an information perspective. Finally, we assess the reliability of the explanations generated by different explainability methods and different IG baselines through extensive evaluation experiments.
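The core idea can be illustrated with a minimal sketch: IG attributes a prediction by integrating the model's gradient along a straight path from a baseline to the input, and for inputs bounded on an interval the maximum-entropy choice of baseline is a uniform draw over that range. The toy sigmoid model, the input range [0, 1], and the Riemann-sum step count below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

# Toy differentiable model: f(x) = sigmoid(w . x), with an analytic gradient
# so the sketch stays dependency-free. Any differentiable model would do.
rng = np.random.default_rng(0)
w = rng.normal(size=4)

def f(x):
    return 1.0 / (1.0 + np.exp(-w @ x))

def grad_f(x):
    s = f(x)
    return s * (1.0 - s) * w  # gradient of sigmoid(w . x) w.r.t. x

def integrated_gradients(x, baseline, steps=200):
    # Midpoint Riemann-sum approximation of the IG path integral
    # from the baseline to the input.
    alphas = (np.arange(steps) + 0.5) / steps
    grads = np.stack([grad_f(baseline + a * (x - baseline)) for a in alphas])
    return (x - baseline) * grads.mean(axis=0)

x = rng.uniform(0.0, 1.0, size=4)          # input, assumed bounded in [0, 1]
baseline = rng.uniform(0.0, 1.0, size=4)   # uniform draw = max-entropy baseline
                                           # for a bounded input domain

attr = integrated_gradients(x, baseline)
# Completeness axiom: attributions should sum to f(x) - f(baseline).
gap = abs(attr.sum() - (f(x) - f(baseline)))
print(f"completeness gap: {gap:.2e}")  # small gap -> good approximation
```

With enough integration steps, the completeness check above holds to high precision, which is a convenient sanity test for any IG implementation regardless of which baseline is used.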
- Published in: 2023 International Joint Conference on Neural Networks (IJCNN)
- Type: Inproceedings
- Authors: Tan, Hanxiao
- Year: 2023
- Source: https://ieeexplore.ieee.org/document/10191554
Citation information
Tan, Hanxiao: Maximum Entropy Baseline for Integrated Gradients. 2023 International Joint Conference on Neural Networks (IJCNN), 2023. https://ieeexplore.ieee.org/document/10191554
@Inproceedings{Tan.2023a,
author={Tan, Hanxiao},
title={Maximum Entropy Baseline for Integrated Gradients},
booktitle={2023 International Joint Conference on Neural Networks (IJCNN)},
url={https://ieeexplore.ieee.org/document/10191554},
year={2023}}