precision_recall
- openstef_beam.metrics.precision_recall(cm: ConfusionMatrix, *, effective: bool = False) → PrecisionRecall
Calculate precision and recall metrics from a confusion matrix.
These metrics evaluate the quality of peak detection by measuring how many predicted peaks were correct (precision) and how many actual peaks were detected (recall).
- Parameters:
cm (ConfusionMatrix) – Confusion matrix from the confusion_matrix function containing all classification outcomes.
effective (bool) – If True, uses effective true positives which account for prediction direction and magnitude. If False, uses standard true positives for calculation.
- Returns:
PrecisionRecall containing precision and recall values, each in range [0, 1].
- Return type:
PrecisionRecall
Example
Calculate standard precision and recall:
>>> import numpy as np
>>> y_true = np.array([100, 150, 80, 200, 90])
>>> y_pred = np.array([105, 145, 85, 195, 95])
>>> cm = confusion_matrix(y_true, y_pred, limit_pos=120, limit_neg=85)
>>> pr = precision_recall(cm)
>>> float(pr.precision)
1.0
>>> float(pr.recall)
1.0
Note
When effective=True, the metrics focus on predictions that provide actionable information (correct magnitude and direction) rather than just correct classification.
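The arithmetic behind these metrics is the standard precision/recall computation over confusion-matrix counts. The sketch below illustrates it with plain Python; the counts are illustrative stand-ins, and the field names of the actual ConfusionMatrix in openstef_beam may differ.

```python
# Illustrative confusion-matrix counts (hypothetical values, not from the library):
tp = 8  # predicted peaks that were real peaks (true positives)
fp = 2  # predicted peaks that were not real (false positives)
fn = 4  # real peaks that were missed (false negatives)

# Precision: fraction of predicted peaks that were correct.
precision = tp / (tp + fp) if (tp + fp) > 0 else 0.0

# Recall: fraction of actual peaks that were detected.
recall = tp / (tp + fn) if (tp + fn) > 0 else 0.0

print(precision)  # 0.8
print(round(recall, 3))  # 0.667
```

With effective=True, the same formulas would apply, but with the true-positive count restricted to predictions whose direction and magnitude were actionable, which can only lower both values.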