Data requirements: no special requirements; these techniques work with standard inputs.
43 techniques
Technique | Type | Models | Data Types | Description
---|---|---|---|---
SHapley Additive exPlanations | Algorithmic | Model Agnostic | Any | SHAP explains model predictions by quantifying how much each input feature contributes to the outcome. It assigns each feature a Shapley value, a concept from cooperative game theory representing that feature's fair share of the prediction.
Permutation Importance | Algorithmic | Model Agnostic | Any | Permutation Importance quantifies a feature's contribution to a model's performance by randomly shuffling its values and measuring the resulting drop in a chosen performance metric.
Mean Decrease Impurity | Algorithmic | Tree Based | Tabular | Mean Decrease Impurity (MDI) quantifies a feature's importance in tree-based models (e.g., Random Forests, Gradient Boosted Trees) by summing the impurity reduction achieved at each split on that feature, averaged across all trees.
Coefficient Magnitudes (in Linear Models) | Metric | Linear Model | Tabular | Coefficient Magnitudes assess feature influence in linear models by examining the absolute values of their coefficients, ideally after standardising features so that magnitudes are comparable.
Contextual Decomposition | Algorithmic | Recurrent Neural Network | Text | Contextual Decomposition explains LSTM and RNN predictions by decomposing the final hidden state into contributions from individual words or phrases and their interactions.
Sobol Indices | Algorithmic | Model Agnostic | Any | Sobol Indices quantify how much each input feature contributes to the total variance in a model's predictions through variance-based global sensitivity analysis.
Local Interpretable Model-Agnostic Explanations | Algorithmic | Model Agnostic | Any | LIME (Local Interpretable Model-agnostic Explanations) explains individual predictions by approximating the complex model locally with a simple interpretable surrogate (e.g., a sparse linear model) fitted to perturbed samples around the instance.
Ridge Regression Surrogates | Algorithmic | Model Agnostic | Any | This technique approximates a complex model by training a ridge regression (a linear model with L2 regularization) on the complex model's inputs and predictions, producing interpretable global coefficients.
Partial Dependence Plots | Algorithmic | Model Agnostic | Any | Partial Dependence Plots show how changing one or two features affects a model's predictions on average. The technique marginalises over the remaining features to isolate the average effect.
Individual Conditional Expectation Plots | Visualization | Model Agnostic | Any | ICE plots display the predicted output for individual instances as a function of a feature, with all other features held fixed, revealing heterogeneous effects that averaged views can hide.
Occlusion Sensitivity | Algorithmic | Model Agnostic | Image | Occlusion sensitivity tests which parts of the input are important by occluding (masking or removing) them and seeing how the model's prediction changes.
Factor Analysis | Algorithmic | Model Agnostic | Tabular | Factor analysis is a statistical technique that identifies latent variables (hidden factors) underlying observed, correlated variables.
Principal Component Analysis | Algorithmic | Model Agnostic | Any | Principal Component Analysis transforms high-dimensional data into a lower-dimensional representation by finding the orthogonal directions of maximum variance.
t-SNE | Visualization | Model Agnostic | Any | t-SNE (t-Distributed Stochastic Neighbour Embedding) is a non-linear dimensionality reduction technique that creates 2D or 3D embeddings which preserve local neighbourhood structure.
UMAP | Visualization | Model Agnostic | Any | UMAP (Uniform Manifold Approximation and Projection) is a non-linear dimensionality reduction technique that creates 2D or 3D embeddings which preserve local structure while retaining more global structure than t-SNE.
Prototype and Criticism Models | Algorithmic | Model Agnostic | Any | Prototype and Criticism Models provide data understanding by identifying two complementary sets of examples: prototypes that are representative of the data, and criticisms that are poorly captured by those prototypes.
Contrastive Explanation Method | Algorithmic | Model Agnostic | Any | The Contrastive Explanation Method (CEM) explains model decisions by generating contrastive examples that reveal what must be present (pertinent positives) and what must be absent (pertinent negatives) for a prediction to hold.
ANCHOR | Algorithmic | Model Agnostic | Any | ANCHOR generates high-precision if-then rules that explain individual predictions by identifying the minimal set of feature conditions which, when satisfied, almost always produce the same prediction.
RuleFit | Algorithmic | Model Agnostic | Any | RuleFit is a method that creates an interpretable model by combining linear terms with decision rules. It first extracts rules from an ensemble of decision trees, then fits a sparse linear model over those rules and the original features.
Monte Carlo Dropout | Algorithmic | Neural Network | Any | Monte Carlo Dropout estimates prediction uncertainty by applying dropout (randomly setting neural network units to zero) at inference time and aggregating the predictions from many stochastic forward passes.
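To make the Permutation Importance entry concrete, here is a minimal sketch of the shuffle-and-rescore loop. The data, the `predict` function, and the MSE metric are all illustrative assumptions standing in for any black-box model and performance metric; the catalog entry does not prescribe them.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y depends strongly on x0, weakly on x1, and not at all on x2.
X = rng.normal(size=(500, 3))
y = 3.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=500)

# Hypothetical "model": the true linear function, standing in for any predictor.
def predict(X):
    return 3.0 * X[:, 0] + 0.5 * X[:, 1]

def mse(y_true, y_pred):
    return float(np.mean((y_true - y_pred) ** 2))

def permutation_importance(predict, X, y, n_repeats=10, rng=rng):
    """Importance of each column = average MSE increase when it is shuffled."""
    baseline = mse(y, predict(X))
    importances = np.zeros(X.shape[1])
    for j in range(X.shape[1]):
        increases = []
        for _ in range(n_repeats):
            Xp = X.copy()
            Xp[:, j] = rng.permutation(Xp[:, j])  # break the feature-target link
            increases.append(mse(y, predict(Xp)) - baseline)
        importances[j] = np.mean(increases)
    return importances

imp = permutation_importance(predict, X, y)
# x0 dominates, x1 matters a little, and x2 (unused by the model) scores ~0.
```

Because shuffling an unused feature leaves predictions unchanged, its importance is exactly zero here, which is the property that makes the method a useful sanity check.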
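The Partial Dependence Plots entry can likewise be sketched in a few lines: force one feature to each value on a grid, average the predictions, and plot the curve. The quadratic `predict` function and the grid are illustrative assumptions, not part of the catalog entry.

```python
import numpy as np

# Hypothetical black-box model: f(x0, x1) = x0^2 + x1.
def predict(X):
    return X[:, 0] ** 2 + X[:, 1]

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 2))

def partial_dependence(predict, X, feature, grid):
    """Average prediction with `feature` forced to each grid value,
    marginalising over the other features via the empirical data."""
    values = []
    for v in grid:
        Xv = X.copy()
        Xv[:, feature] = v
        values.append(predict(Xv).mean())
    return np.array(values)

grid = np.linspace(-1.0, 1.0, 5)
pd0 = partial_dependence(predict, X, feature=0, grid=grid)
# pd0 traces v**2 shifted by the mean of x1: symmetric, with a minimum at v = 0.
```

Evaluating the curve at a handful of grid points is usually enough to reveal the shape of the average effect; the same routine with a 2D grid gives two-feature partial dependence.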