FeatureImportancePlotter

class openstef_models.explainability.FeatureImportancePlotter(**data: Any) → None

Bases: BaseConfig

Creates treemap visualizations of feature importance scores.

Parameters:

data (Any)

static plot(scores: DataFrame, quantile: Quantile = Q(0.5)) → Figure

Generate interactive treemap showing feature importance.

Creates a color-coded treemap in which each box's size and color intensity represent the relative importance of a feature. Useful for quickly identifying which features contribute most to model predictions.

Parameters:
  • scores (DataFrame) – Feature importance scores with feature names as index and quantiles as columns (e.g., ‘q0.5’, ‘q0.95’). Values should be normalized to sum to 1.0.

  • quantile (Quantile) – Which quantile column to visualize. Defaults to median (0.5).

Returns:

Plotly Figure containing interactive treemap with hover information. Larger boxes and darker green colors indicate higher importance.

Return type:

Figure
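
A minimal usage sketch: the feature names and importance values below are hypothetical, and the commented-out call assumes `FeatureImportancePlotter` and plotly are installed. Note that each quantile column is normalized to sum to 1.0, as the `scores` parameter requires.

```python
import pandas as pd

# Hypothetical importance scores for three features, with feature names as
# the index and quantile labels ('q0.5', 'q0.95') as columns.
scores = pd.DataFrame(
    {"q0.5": [0.6, 0.3, 0.1], "q0.95": [0.5, 0.35, 0.15]},
    index=["temperature", "wind_speed", "humidity"],
)

# Normalize each quantile column so its values sum to 1.0.
scores = scores / scores.sum(axis=0)

# Plotting (requires openstef_models and plotly):
# from openstef_models.explainability import FeatureImportancePlotter
# fig = FeatureImportancePlotter.plot(scores, quantile=Q(0.5))
# fig.show()
```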

model_config: ClassVar[ConfigDict] = {'arbitrary_types_allowed': False, 'extra': 'ignore', 'protected_namespaces': ()}

Configuration for the model; should be a dictionary conforming to pydantic.config.ConfigDict.