{"id":7310,"library":"interpret","title":"InterpretML","description":"InterpretML is an open-source Python library for training inherently interpretable (glassbox) models and explaining black-box machine learning systems. It provides a unified API for interpretability techniques, including Explainable Boosting Machines (EBMs), LIME, and SHAP, along with interactive visualizations that help users understand model behavior both globally and locally. The library is actively maintained, with frequent minor releases (e.g., several in early 2026), and is currently at version 0.7.8.","status":"active","version":"0.7.8","language":"en","source_language":"en","source_url":"https://github.com/interpretml/interpret","tags":["machine-learning","xai","interpretability","ebm","shap","lime","model-explanation","glassbox","blackbox"],"install":[{"cmd":"pip install interpret","lang":"bash","label":"Full installation"},{"cmd":"pip install interpret-core","lang":"bash","label":"Minimal EBM-only installation"},{"cmd":"pip install interpret[shap]","lang":"bash","label":"Install with SHAP support"},{"cmd":"pip install interpret[lime]","lang":"bash","label":"Install with LIME support"}],"dependencies":[{"reason":"Commonly used for data handling with InterpretML models.","package":"pandas","optional":true},{"reason":"Underlying numerical operations; often used with pandas.","package":"numpy","optional":true},{"reason":"Provides a familiar API for glassbox models and utilities.","package":"scikit-learn","optional":true},{"reason":"For SHapley Additive exPlanations; requires the `interpret[shap]` install.","package":"shap","optional":true},{"reason":"For Local Interpretable Model-agnostic Explanations; requires the `interpret[lime]` install.","package":"lime","optional":true}],"imports":[{"symbol":"ExplainableBoostingClassifier","correct":"from interpret.glassbox import ExplainableBoostingClassifier"},{"symbol":"ExplainableBoostingRegressor","correct":"from interpret.glassbox import ExplainableBoostingRegressor"},{"note":"`show` is a top-level function for displaying visualizations, imported directly.","wrong":"import interpret.show","symbol":"show","correct":"from interpret import show"},{"symbol":"ShapKernel","correct":"from interpret.blackbox import ShapKernel"},{"symbol":"PartialDependence","correct":"from interpret.blackbox import PartialDependence"}],"quickstart":{"code":"import numpy as np\nimport pandas as pd\nfrom sklearn.model_selection import train_test_split\nfrom interpret.glassbox import ExplainableBoostingClassifier\nfrom interpret import show\nfrom sklearn.datasets import load_iris\n\n# Load data\ndata = load_iris()\nX = pd.DataFrame(data.data, columns=data.feature_names)\ny = pd.Series(data.target)\n\n# Split data\nX_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.20, random_state=42)\n\n# Train an Explainable Boosting Machine (EBM) classifier\nebm = ExplainableBoostingClassifier(random_state=42)\nebm.fit(X_train, y_train)\n\n# Get a global explanation for the model\nebm_global = ebm.explain_global()\n\n# Display the global explanation (typically in a Jupyter environment)\n# show(ebm_global)\n\nprint(\"Model training complete. To view explanations, uncomment 'show(ebm_global)' in a Jupyter environment.\")","lang":"python","description":"This quickstart demonstrates how to train a glassbox model, specifically an Explainable Boosting Machine (EBM) classifier, and generate a global explanation using the `interpret` library. It loads the Iris dataset, performs a train-test split, trains the EBM, and then calls `explain_global()` to get model insights. The `show()` function provides interactive visualization, typically within a Jupyter notebook."},"warnings":[{"fix":"Update your code to pass `bags` with the shape `(n_samples, n_outer_bags)`. Version 0.7.0 issues a warning and accepts the old format, but future versions may remove this compatibility.","message":"The shape of the `bags` parameter in EBM fitting functions changed from `(n_outer_bags, n_samples)` to `(n_samples, n_outer_bags)`.","severity":"breaking","affected_versions":">=0.7.0"},{"fix":"Replace calls to `EBMUtils.merge_models` with `merge_ebms` (`from interpret.glassbox import merge_ebms`).","message":"The `EBMUtils.merge_models` function was renamed to `merge_ebms`.","severity":"breaking","affected_versions":"<0.5.0 (likely 0.3.0)"},{"fix":"Upgrade `interpret` to version 0.7.4 or later, which includes a fix for this incompatibility.","message":"Incompatibility with scikit-learn 1.8+, whose `is_classifier` and `is_regressor` only accept valid estimators.","severity":"gotcha","affected_versions":"0.7.0 - 0.7.3"},{"fix":"If you were explicitly using `ComputeProvider`, adjust your code: this abstraction was removed to simplify the API.","message":"The `ComputeProvider` abstraction was removed, simplifying the interface.","severity":"gotcha","affected_versions":"0.7.3 and later"},{"fix":"Explicitly set the `feature_names` property when initializing the explainer, or set it manually before calling an explain function. This is handled automatically when using pandas DataFrames.","message":"When passing NumPy arrays to explainers, feature names are not automatically inferred, so visualizations may lack descriptive labels.","severity":"gotcha","affected_versions":"All versions"}],"env_vars":null,"last_verified":"2026-04-16T00:00:00.000Z","next_check":"2026-07-15T00:00:00.000Z","problems":[{"fix":"Upgrade `interpret` to version 0.7.3 or later. If the issue persists, ensure your data types are compatible with the NumPy operations expected by the explainer.","cause":"This error often occurs when older versions of `interpret` (e.g., <0.7.3) are used with specific data types, leading to a UFuncTypeError in `ShapKernel.explain_local`.","error":"TypeError: 'numpy.ufunc' object is not callable"},{"fix":"Update `interpret` to version 0.7.4 or newer, which resolved the issue by adapting to the scikit-learn changes.","cause":"This error indicates an incompatibility with scikit-learn versions 1.8 and above, where estimator validation was tightened.","error":"TypeError: is_classifier or is_regressor only accepting valid estimators"},{"fix":"Replace `EBMUtils.merge_models` with `merge_ebms` (`from interpret.glassbox import merge_ebms`) in your code.","cause":"The `merge_models` function in `EBMUtils` was renamed to `merge_ebms` in a past breaking change.","error":"AttributeError: module 'interpret.ebm.ebm_utils' has no attribute 'merge_models'"},{"fix":"Remove any explicit references to `ComputeProvider`; it is no longer part of the public interface, and its functionality is now handled internally.","cause":"The `ComputeProvider` abstraction was removed in version 0.7.3 to simplify the API.","error":"NameError: name 'ComputeProvider' is not defined"}]}