ELI5


ELI5 is a Python package that helps you debug machine learning classifiers and explain their predictions.

Example outputs: explain_prediction for text data, explain_prediction for image data, explain_weights for text data.

It provides support for the following machine learning frameworks and packages:

  • scikit-learn. ELI5 can explain weights and predictions of scikit-learn linear classifiers and regressors, print decision trees as text or as SVG, show feature importances, and explain predictions of decision trees and tree-based ensembles (see the sketch after this list). ELI5 understands text processing utilities from scikit-learn and can highlight text data accordingly. Pipeline and FeatureUnion are supported. It can also debug scikit-learn pipelines which contain HashingVectorizer, by undoing hashing.
  • Keras - explain predictions of image classifiers via Grad-CAM visualizations.
  • xgboost - show feature importances and explain predictions of XGBClassifier, XGBRegressor and xgboost.Booster.
  • LightGBM - show feature importances and explain predictions of LGBMClassifier, LGBMRegressor and lightgbm.Booster.
  • CatBoost - show feature importances of CatBoostClassifier, CatBoostRegressor and catboost.CatBoost.
  • lightning - explain weights and predictions of lightning classifiers and regressors.
  • sklearn-crfsuite. ELI5 can inspect weights of sklearn_crfsuite.CRF models.
  • OpenAI Python client. ELI5 can explain LLM predictions with token probabilities.
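
For scikit-learn estimators, a typical session looks roughly like the sketch below; the 20 newsgroups data, the TfidfVectorizer and the LogisticRegression classifier are illustrative choices, not part of ELI5:

```python
# A minimal sketch of ELI5 with a scikit-learn text classifier.
# The dataset, vectorizer and classifier are illustrative choices.
from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

import eli5

train = fetch_20newsgroups(subset='train',
                           categories=['alt.atheism', 'sci.space'])
vec = TfidfVectorizer()
clf = LogisticRegression(max_iter=1000)
clf.fit(vec.fit_transform(train.data), train.target)

# Global explanation: classifier weights mapped back to words via the vectorizer.
eli5.show_weights(clf, vec=vec, top=10, target_names=train.target_names)

# Local explanation: per-word contributions to a single prediction,
# rendered as highlighted text in a Jupyter notebook.
eli5.show_prediction(clf, train.data[0], vec=vec,
                     target_names=train.target_names)
```

show_weights and show_prediction render HTML for notebooks; in a plain console you can call explain_weights / explain_prediction and format the result yourself, as described further below.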

ELI5 also implements several algorithms for inspecting black-box models (see Inspecting Black-Box Estimators):

  • TextExplainer can explain predictions of any text classifier using the LIME algorithm (Ribeiro et al., 2016). There are utilities for using LIME with non-text data and arbitrary black-box classifiers as well, but this feature is currently experimental.
  • The permutation importance method can be used to compute feature importances for black-box estimators (see the sketch below).
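
A minimal sketch of permutation importance, assuming a scikit-learn-compatible estimator and held-out data (the RandomForest and the synthetic dataset are illustrative):

```python
# Permutation importance: shuffle one feature at a time on held-out data
# and measure how much the score drops.  Model and data are illustrative.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

import eli5
from eli5.sklearn import PermutationImportance

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

perm = PermutationImportance(model, random_state=0).fit(X_test, y_test)
eli5.show_weights(perm)  # feature importances with their standard deviations
```

TextExplainer follows a similar pattern: fit it on a single document together with the classifier's predict_proba callable, then call show_prediction() on it.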

Explanation and formatting are separated: you can get a text-based explanation to display in the console, an HTML version to embed in an IPython notebook or a web dashboard, a pandas.DataFrame object if you want to process results further, or a JSON version that lets you implement custom rendering and formatting on the client.
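
For example (a sketch; the iris data and LogisticRegression are illustrative, and the format_as_dataframe import path is an assumption):

```python
# Sketch of the explanation/formatting split.  The model is illustrative;
# format_as_dataframe is assumed to live in eli5.formatters.as_dataframe.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

import eli5
from eli5.formatters.as_dataframe import format_as_dataframe

X, y = load_iris(return_X_y=True)
clf = LogisticRegression(max_iter=1000).fit(X, y)

expl = eli5.explain_weights(clf)    # framework-neutral Explanation object
print(eli5.format_as_text(expl))    # plain text for the console
html = eli5.format_as_html(expl)    # HTML for notebooks / web dashboards
df = format_as_dataframe(expl)      # pandas.DataFrame for further processing
```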

License is MIT.

Check the docs for more.

Note

This project was previously developed at https://github.com/TeamHG-Memex/eli5/ with support from Hyperion Gray.