snorkel.analysis.metric_score¶
snorkel.analysis.metric_score(golds=None, preds=None, probs=None, metric='accuracy', filter_dict=None, **kwargs)[source]¶

Evaluate a standard metric on a set of predictions/probabilities.
- Parameters
  - golds (Optional[ndarray]) – An array of gold (int) labels
  - preds (Optional[ndarray]) – An array of (int) predictions
  - probs (Optional[ndarray]) – An [n_datapoints, n_classes] array of probabilistic (float) predictions
  - metric (str) – The name of the metric to calculate
  - filter_dict (Optional[Dict[str, List[int]]]) – A mapping from label set name to the labels that should be filtered out for that label set
- Returns
  The value of the requested metric
- Return type
  float
- Raises
  - ValueError – The requested metric is not currently supported
  - ValueError – The user attempted to calculate roc_auc score for a non-binary problem
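The filtering behavior described for filter_dict can be illustrated with a minimal sketch. The function and the example data below are hypothetical, written in plain NumPy to show the semantics only (dropping rows whose label in the named label set matches a filtered value before computing the metric); they are not the library's implementation.

```python
import numpy as np

def metric_score_sketch(golds, preds, metric="accuracy", filter_dict=None):
    """Hypothetical sketch of metric_score's filtering semantics.

    filter_dict maps a label set name ("golds" or "preds") to the label
    values whose rows should be dropped before scoring.
    """
    golds = np.asarray(golds)
    preds = np.asarray(preds)
    if filter_dict:
        mask = np.ones(len(golds), dtype=bool)
        label_sets = {"golds": golds, "preds": preds}
        for name, labels_to_drop in filter_dict.items():
            # Keep only rows whose label is NOT in the filtered set
            mask &= ~np.isin(label_sets[name], labels_to_drop)
        golds, preds = golds[mask], preds[mask]
    if metric == "accuracy":
        return float(np.mean(golds == preds))
    raise ValueError(f"The metric {metric} is not currently supported.")

golds = np.array([1, 0, -1, 1, 0])
preds = np.array([1, 0, 1, 0, 0])
# The row with gold label -1 is filtered out; 3 of the remaining 4 match.
print(metric_score_sketch(golds, preds, filter_dict={"golds": [-1]}))  # 0.75
```

The same pattern is commonly used to exclude abstain labels (often encoded as -1) from evaluation, so that abstentions neither count for nor against the model.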