Discussion about this post

nicola leonardi

I think this sentence is not entirely correct: "For example, SHAP produces information at the level of individual instances". Indeed, the beeswarm plot https://shap.readthedocs.io/en/latest/example_notebooks/api_examples/plots/beeswarm.html can be used for a global analysis of the model outputs. Do you agree?
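For context, the beeswarm view is built by stacking every instance's local SHAP attributions per feature. A minimal sketch, assuming the shap and xgboost packages, with the California housing data and the XGBoost model used purely as illustrative stand-ins:

```python
import shap
import xgboost
from sklearn.datasets import fetch_california_housing

# Fit any model SHAP can explain; XGBoost is just an illustrative choice.
X, y = fetch_california_housing(return_X_y=True, as_frame=True)
model = xgboost.XGBRegressor(n_estimators=100).fit(X, y)

# SHAP values are computed per instance (local explanations)...
explainer = shap.Explainer(model, X)
shap_values = explainer(X)

# ...but the beeswarm plot overlays all instances per feature,
# turning those local attributions into a global summary of the model.
shap.plots.beeswarm(shap_values)
```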

Gennady Andrienko

Excellent writing, Enrico, thank you!

One comment: trees and rules look to be interpretable by design, but they are NOT understandable in many real-world applications. The reasons are the depth of the trees, the number of distinct variables involved in them, and the mismatch between the user's high-level concepts and the low-level features used in explanations. We published a couple of papers on this matter recently:

- Re-interpreting Rules Interpretability, https://doi.org/10.1007/s41060-023-00398-5

- Visual Analytics for Human-Centered Machine Learning, https://doi.org/10.1109/MCG.2021.3130314

Let's discuss at IEEE VIS this year!

Best wishes,

Gennady
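On the point about tree depth and variable count, a minimal sketch (assuming scikit-learn, with synthetic data standing in for a real-world task) of how an unconstrained tree, while interpretable by design, quickly outgrows what a person can actually read:

```python
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

# Hypothetical synthetic data standing in for a real-world tabular task.
X, y = make_classification(n_samples=50_000, n_features=40,
                           n_informative=20, random_state=0)

# No depth or leaf limits: the model stays "interpretable by design".
tree = DecisionTreeClassifier(random_state=0).fit(X, y)

# In practice such trees commonly reach dozens of levels and
# thousands of leaves, far beyond what a reader can follow.
print("depth:", tree.get_depth())
print("leaves:", tree.get_n_leaves())
```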

