API Reference
Documentation for ClassificationMetrics.
ClassificationMetrics.AggregationSubset — Type

struct AggregationSubset{T1, T2, T3}

Aggregation subset to be used in classification_report.

aggregation: Aggregation function
predicate: Predicate that defines which subset to aggregate
name: Name to be displayed in the classification report
ClassificationMetrics.ConfusionMatrix — Type

struct ConfusionMatrix{T1, T2}

Confusion matrix struct that holds the matrix and the label names.

label_set: Class labels
matrix: Confusion matrix
ClassificationMetrics.PredictionResults — Type

struct PredictionResults{T1, T2}

Holds TP, FP, TN, and FN values for further metric calculation. Use prediction_results to create.

label_set: Class labels
TP: True positives
FN: False negatives
FP: False positives
TN: True negatives
ClassificationMetrics.F1_score — Method

F1_score(TP, FN, FP, TN) = safe_div(2TP, 2TP + FN + FP)
ClassificationMetrics.Fβ_score — Method

Fβ_score(TP, FN, FP, TN; β=1) = safe_div((1 + β^2)TP, (1 + β^2)TP + β^2 * FN + FP)
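The role of β can be seen by evaluating the formula directly. A minimal sketch, with a local safe_div stand-in (how the package's own safe_div handles a zero denominator is an assumption here):

```julia
# Local stand-in: return 0.0 when the denominator is zero.
safe_div(num, den) = iszero(den) ? 0.0 : num / den

Fβ_score(TP, FN, FP, TN; β=1) = safe_div((1 + β^2) * TP, (1 + β^2) * TP + β^2 * FN + FP)
F1_score(TP, FN, FP, TN) = safe_div(2TP, 2TP + FN + FP)

# With β = 1 the two definitions coincide:
Fβ_score(8, 2, 4, 10)        # == F1_score(8, 2, 4, 10) == 16/22
Fβ_score(8, 2, 4, 10; β=2)   # β > 1 weights recall (the FN term) more heavily
```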
ClassificationMetrics.IoU — Method

jaccard(TP, FN, FP, TN) = safe_div(TP, TP + FN + FP)
IoU(TP, FN, FP, TN) = safe_div(TP, TP + FN + FP)
ClassificationMetrics.aggregate_subset — Method

aggregate_subset(aggregation_function;
                 predicate=const_func(true),
                 name=get_print_name(aggregation_function))

Create an AggregationSubset to be used in classification_report.
ClassificationMetrics.apply_metric — Method

apply_metric(metric, TP, FN, FP, TN; aggregation=default_aggregation(), kws...)
apply_metric(metric, pr::PredictionResults; aggregation=default_aggregation(), kws...)
apply_metric(metric, confusion_matrix::ConfusionMatrix; aggregation=default_aggregation(), kws...)
apply_metric(metric, predicted, classes; aggregation=default_aggregation(), label_set=nothing, sort_labels=false, kws...)

Applies metric, defined as a function of TP, FN, FP, TN, with a given aggregation strategy. It is assumed that the metric is defined for a single class; apply_metric should be used in the multiclass setting. The kws keywords can be used to pass additional parameters to the metric, e.g. β for the Fβ-score.

Note: all metrics defined with the @metric macro are automatically extended, so they can be called in the multiclass setting without calling this function explicitly.
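The difference between the aggregation strategies can be sketched by hand for a two-class problem; safe_div, recall, and the counts below are local illustrations, not the package API:

```julia
# Local stand-ins for the metric machinery.
safe_div(num, den) = iszero(den) ? 0.0 : num / den
recall(TP, FN, FP, TN) = safe_div(TP, TP + FN)

# Per-class counts for two classes.
TP = [8, 1];  FN = [2, 3];  FP = [1, 2];  TN = [3, 8]

# no_aggregation: one metric value per class.
per_class = recall.(TP, FN, FP, TN)                        # [0.8, 0.25]

# macro aggregation: mean of the per-class values.
macro_recall = sum(per_class) / length(per_class)          # 0.525

# micro aggregation: metric applied to the summed counts.
micro_recall = recall(sum(TP), sum(FN), sum(FP), sum(TN))  # 9/14
```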
ClassificationMetrics.binary_accuracy — Method

binary_accuracy(TP, FN, FP, TN) = safe_div(TP + TN, TP + FP + TN + FN)
ClassificationMetrics.classification_report — Method

classification_report(predicted, actual; label_set=nothing, sort_labels=true, kws...)
classification_report(cf::ConfusionMatrix; kws...)
classification_report(prediction_results::PredictionResults; kws...)

Print a classification report.

Keywords

metrics=[precision, recall, F1_score]: metrics to compute
show_per_class=true: show results for each class or only the aggregated results
aggregations=[micro_aggregation, macro_aggregation, weighted_aggregation]: aggregations to be used. If an aggregation function is passed, it is applied to all classes. Alternatively, it is possible to aggregate only a subset of classes by passing an AggregationSubset created with aggregate_subset.
io::IO | String | HTML = stdout: io to print to. Refer to PrettyTables for more information.
include_support=true: whether to print the support column
default_format=val->@sprintf("%.4f", val): string format for the values
backend=Val(:text): the backend used to generate the table. Can be Val(:text), Val(:html), Val(:latex), or Val(:markdown). Refer to PrettyTables for more information.
optional_kws=DEFAULT_PARAMETERS[backend]: optional keywords to pass to pretty_table
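A possible usage sketch based on the signatures above, assuming the package is installed; the exact table layout depends on the chosen backend:

```julia
using ClassificationMetrics  # assumes the package is installed

predicted = ["cat", "dog", "cat", "bird", "dog"]
actual    = ["cat", "cat", "cat", "bird", "dog"]

# Default report: per-class rows plus the aggregated rows.
classification_report(predicted, actual)

# Restrict the metrics and render the table as markdown instead of text.
classification_report(predicted, actual;
                      metrics=[recall, F1_score],
                      backend=Val(:markdown))
```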
ClassificationMetrics.confusion_matrix — Function

confusion_matrix(predicted, actual[, label_set]; sort_labels=false)
confusion_matrix(predicted::OneHotLike, actual::OneHotLike, label_set)

Returns a ConfusionMatrix object that holds both the matrix and the label set. predicted and actual can be provided as vectors of labels or in the one-hot encoded form. Optionally, label_set can be provided explicitly.
ClassificationMetrics.default_aggregation — Method

default_aggregation()

Returns the aggregation used by default by all metrics. The default is weighted_aggregation. Can be changed with set_default_aggregation!.
ClassificationMetrics.fall_out — Method

fall_out(TP, FN, FP, TN) = safe_div(FP, FP + TN)
false_positive_rate(TP, FN, FP, TN) = safe_div(FP, FP + TN)
ClassificationMetrics.false_negative_rate — Method

miss_rate(TP, FN, FP, TN) = safe_div(FN, TP + FN)
false_negative_rate(TP, FN, FP, TN) = safe_div(FN, TP + FN)
ClassificationMetrics.false_positive_rate — Method

fall_out(TP, FN, FP, TN) = safe_div(FP, FP + TN)
false_positive_rate(TP, FN, FP, TN) = safe_div(FP, FP + TN)
ClassificationMetrics.get_label_set — Method

get_label_set(xs...)

Return all unique labels from one or several containers with labels.
ClassificationMetrics.get_print_name — Method

get_print_name(x)

Returns a "pretty" name to be used in the classification report table.
ClassificationMetrics.get_support — Method

get_support(pr::PredictionResults)

Calculate the support for each class.
ClassificationMetrics.jaccard — Method

jaccard(TP, FN, FP, TN) = safe_div(TP, TP + FN + FP)
IoU(TP, FN, FP, TN) = safe_div(TP, TP + FN + FP)
ClassificationMetrics.macro_aggregation — Method

macro_aggregation(func, TP, FN, FP, TN)

One of the possible aggregation options. Performs macro aggregation, i.e. applies the metric to each class and reports the mean.
ClassificationMetrics.micro_aggregation — Method

micro_aggregation(func, TP, FN, FP, TN)

One of the possible aggregation options. Performs micro aggregation, i.e. sums the TP, FN, FP, and TN values over all classes and applies the metric to the totals.
ClassificationMetrics.miss_rate — Method

miss_rate(TP, FN, FP, TN) = safe_div(FN, TP + FN)
false_negative_rate(TP, FN, FP, TN) = safe_div(FN, TP + FN)
ClassificationMetrics.no_aggregation — Method

no_aggregation(func, TP, FN, FP, TN)

One of the possible aggregation options. Does not perform aggregation and instead returns the metric value for each class.
ClassificationMetrics.precision — Method

precision(TP, FN, FP, TN) = safe_div(TP, TP + FP)
ClassificationMetrics.prediction_results — Method

prediction_results(predicted, actual; label_set=nothing, sort_labels=false)
prediction_results(cm::ConfusionMatrix)

Calculate TP, FN, FP, and TN and return a PredictionResults object. Supports indexing and vcat.
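A possible usage sketch, assuming the package is installed; the indexing and vcat behavior is as described in the docstring, and the exact indexing semantics (by position vs. by label) is an assumption:

```julia
using ClassificationMetrics  # assumes the package is installed

pr = prediction_results(["a", "b", "a", "c"], ["a", "a", "b", "c"])

pr[1:2]        # indexing: results restricted to a subset of classes
vcat(pr, pr)   # concatenation, e.g. to pool results across folds
```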
ClassificationMetrics.recall — Method

recall(TP, FN, FP, TN) = safe_div(TP, TP + FN)
sensitivity(TP, FN, FP, TN) = safe_div(TP, TP + FN)
true_positive_rate(TP, FN, FP, TN) = safe_div(TP, TP + FN)
ClassificationMetrics.rename_labels — Method

rename_labels(pr::PredictionResults, new_labels)
rename_labels(pr::ConfusionMatrix, new_labels)

A convenience function to batch-rename class labels.
ClassificationMetrics.sensitivity — Method

recall(TP, FN, FP, TN) = safe_div(TP, TP + FN)
sensitivity(TP, FN, FP, TN) = safe_div(TP, TP + FN)
true_positive_rate(TP, FN, FP, TN) = safe_div(TP, TP + FN)
ClassificationMetrics.set_default_aggregation! — Method

set_default_aggregation!(aggregation)

Changes the aggregation used by default by all metrics.
ClassificationMetrics.specificity — Method

specificity(TP, FN, FP, TN) = safe_div(TN, FP + TN)
ClassificationMetrics.support — Method

support(TP, FN, FP, TN) = TP + FN
ClassificationMetrics.true_false_ratio — Method

binary_accuracy(TP, FN, FP, TN) = safe_div(TP + TN, TP + FP + TN + FN)
ClassificationMetrics.true_positive_rate — Method

recall(TP, FN, FP, TN) = safe_div(TP, TP + FN)
sensitivity(TP, FN, FP, TN) = safe_div(TP, TP + FN)
true_positive_rate(TP, FN, FP, TN) = safe_div(TP, TP + FN)
ClassificationMetrics.unlabeled_confusion_matrix — Method

unlabeled_confusion_matrix(predicted, actual, label_set)

Returns an n × n confusion matrix, where n is the length of the label_set. The function confusion_matrix creates a ConfusionMatrix object that holds both the matrix and the label set.
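The construction can be sketched as follows; count_matrix is a local illustration that mimics the described output, and the orientation (rows = actual, columns = predicted) is an assumption, not necessarily the package's convention:

```julia
# Build an n × n count matrix from two label vectors and a label set.
function count_matrix(predicted, actual, label_set)
    index = Dict(label => i for (i, label) in enumerate(label_set))
    n = length(label_set)
    cm = zeros(Int, n, n)
    for (p, a) in zip(predicted, actual)
        cm[index[a], index[p]] += 1  # rows = actual, columns = predicted
    end
    cm
end

count_matrix(["a", "b", "a"], ["a", "a", "a"], ["a", "b"])
# actual "a" was predicted "a" twice and "b" once
```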
ClassificationMetrics.unlabeled_prediction_results — Method

unlabeled_prediction_results(confusion_matrix)

Calculate and return TP, FN, FP, and TN.
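The per-class counts can be recovered from a confusion matrix as follows (a local sketch assuming rows = actual and columns = predicted; the package's orientation may differ):

```julia
using LinearAlgebra  # for diag

cm = [5 1 0;
      2 3 1;
      0 0 4]

TP = diag(cm)                    # correct predictions per class
FN = vec(sum(cm, dims=2)) .- TP  # actual class, predicted as something else
FP = vec(sum(cm, dims=1)) .- TP  # predicted class, actually something else
TN = sum(cm) .- TP .- FN .- FP   # everything else
```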
ClassificationMetrics.weighted_aggregation — Method

weighted_aggregation(func, TP, FN, FP, TN; weights=TP .+ FN)
weighted_aggregation(weights; name="Weighted (Custom)")

One of the possible aggregation options. Performs weighted aggregation, i.e. applies the metric to each class and reports the weighted mean. By default, uses the support of each class as the weight, but allows passing custom weights. weighted_aggregation(weights; name="Weighted (Custom)") returns an aggregation function with fixed weights, useful for passing to classification_report.
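The default weighting can be sketched by hand; safe_div, recall, and the counts below are local stand-ins, not the package API:

```julia
safe_div(num, den) = iszero(den) ? 0.0 : num / den
recall(TP, FN, FP, TN) = safe_div(TP, TP + FN)

TP = [9, 2];  FN = [1, 2];  FP = [3, 1];  TN = [2, 9]

weights = TP .+ FN                                  # supports: [10, 4]
values  = recall.(TP, FN, FP, TN)                   # [0.9, 0.5]
weighted = sum(values .* weights) / sum(weights)    # (9 + 2) / 14
```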
ClassificationMetrics.@metric — Macro

@metric ["Metric print name"] metric_function(TP, FN, FP, TN) = ...
@metric ["Metric print name"] function metric_function(TP, FN, FP, TN)
    ...
end
@metric ["Metric 1 print name"] metric1_function ["Metric 2 print name"] metric2_function ...

A macro to automatically define aggregation calls for a metric defined for a single class. The metric should have the signature metric(TP, FN, FP, TN), where TP, FN, FP, and TN are all scalars.

It is also possible to define an optional print name for the metric, used when printing the classification report.
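A possible definition sketch, assuming the package is installed; balanced_accuracy, its print name, and its formula are illustrative, not part of the package:

```julia
using ClassificationMetrics  # assumes the package is installed

# Define a single-class metric; per the docstring, the macro extends it so
# it can also be called in the multiclass setting with an aggregation keyword.
@metric "Balanced accuracy" balanced_accuracy(TP, FN, FP, TN) =
    (TP / (TP + FN) + TN / (TN + FP)) / 2
```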