ACM FAT* 2020 Tutorial
AI Explainability 360
Date: Monday, January 27, 2020
Time:
13:00-14:30: Tutorial Session 1
14:30-15:00: Coffee break
15:00-16:30: Tutorial Session 2
Location: Barcelona, Spain at the Barceló Sants hotel, Room MR7.
Presenters: Vijay Arya, Amit Dhurandhar, Dennis Wei
ACM FAT* Tutorials [link].
ACM FAT* Program Schedule [link].
This tutorial will teach participants to use and contribute to a new open-source Python package named AI Explainability 360 (AIX360), a comprehensive and extensible toolkit that supports interpretability and explainability of data and machine learning models.
Our aim is to help educate multiple audiences, including data scientists working in different application domains, social scientists, domain experts, and machine learning researchers. To this end, we will present an overview of AI explainability, common terminology, an interactive web demo, and detailed Jupyter notebooks covering use cases in different domains.
A major motivation for creating AIX360 is that there are many ways to explain: data vs. model, direct vs. post-hoc, local vs. global. We will present a taxonomy to help practitioners navigate the explainability space. The toolkit itself includes eight state-of-the-art algorithms covering different modes of explanation along with proxy explainability metrics, and the interactive web demo will feature three of these algorithms.
- Introduction to AI explainability and AIX360
- Background and glossary
- Interactive web demo
- Taxonomy for choosing explanation algorithms
- Jupyter notebook examples
- Consumer lending
- Health and nutrition
- Medical images (dermoscopy)
- Future directions
The tutorial will be aimed at an audience with different backgrounds and computer science expertise levels. For all audience members and especially those unfamiliar with Python programming, the interactive web demo will serve as a grounded introduction to concepts and capabilities. Through the explainability taxonomy, we will teach all participants which type of explanation method is most appropriate for a given use case, which is beneficial regardless of technical background. The three Jupyter notebook examples in different application domains will allow data scientists and developers to gain hands-on experience with the toolkit, while others will be able to follow along by viewing rendered versions of the notebooks.
---
Please join the Slack channel dedicated to this tutorial. It contains important information to aid you with this hands-on tutorial, including the installation guide we will be using.
- Join the AIX360 Slack channel: Instructions
- Subscribe to the #fat-tutorial-2020 channel. You can do this by clicking on "Channels" in Slack and searching for "fat-tutorial-2020".
---
Please bring your laptop as this is a hands-on tutorial.
---
Please install the Anaconda Python distribution and the AIX360 library ahead of time by following these instructions:
- Install the Anaconda Python distribution (Python 3.x version) by following the instructions here:
  https://www.anaconda.com/distribution/#download-section
- Create a Python virtual environment, clone the AIX360 GitHub repository, and install it by following the instructions here:
  https://github.com/IBM/AIX360#setup
- Run a Jupyter notebook server locally on your machine by following the instructions here:
  https://jupyter-notebook.readthedocs.io/en/stable/notebook.html#starting-the-notebook-server
- Register and download the FICO and ISIC datasets:
  - FICO Dataset
  - ISIC Dataset (-> "Participate in this phase" -> log in (e.g. using your Google/GitHub account) -> download the training and ground truth data zip files)
- Try running the following Jupyter notebooks locally on your machine:
- Consumer lending: https://github.com/IBM/AIX360/blob/master/examples/tutorials/HELOC.ipynb
- Health and nutrition: https://github.com/IBM/AIX360/blob/master/examples/tutorials/CDC.ipynb
- Medical images: https://github.com/IBM/AIX360/blob/master/examples/tutorials/dermoscopy.ipynb
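Assuming `conda` and `git` are already available on your PATH, the setup steps above can be sketched as the following shell commands (the environment name `aix360` and the Python version are illustrative; the linked AIX360 setup instructions remain the authoritative reference):

```shell
# Create and activate a dedicated conda environment (name is illustrative)
conda create --name aix360 python=3.6 -y
conda activate aix360

# Clone the AIX360 repository and install it into the environment
git clone https://github.com/IBM/AIX360.git
cd AIX360
pip install -e .

# Start a local Jupyter notebook server in the tutorials directory
jupyter notebook examples/tutorials/
```

After the server starts, open the HELOC, CDC, and dermoscopy notebooks from the browser tab that Jupyter launches. Note that the FICO and ISIC datasets must still be downloaded manually from the links above, since both require registration.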
Thanks!