Can the Precision, Recall and F1 be the same value?
Yes, this is possible. Let's assume binary classification with

$$\text{Pr} = \frac{TP}{TP + FP}, \qquad \text{Re} = \frac{TP}{TP + FN}, \qquad F_1 = \frac{2 \cdot \text{Pr} \cdot \text{Re}}{\text{Pr} + \text{Re}}.$$

The trivial solution to $\text{Pr} = \text{Re} = F_1$ is $TP = 0$: precision and recall are then both 0, and $F_1$ is taken to be 0 as well (its formula gives 0/0, which is set to 0 by convention). So we know precision, recall and F1 can have the same value in general. Now, this does not apply to your specific result. If we solve the system of equations, we find another solution: $FP = FN$. In that case $\text{Pr} = \text{Re}$, and since $F_1$ is the harmonic mean of precision and recall, it takes the same value. So, if the number of false positives is the same as the number of false negatives, all three metrics have identical values.
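For instance, here is a quick check with scikit-learn; the labels are made up so that the confusion matrix has TP = 6, FP = 2, FN = 2, TN = 5 (i.e. FP = FN):

```python
from sklearn.metrics import precision_score, recall_score, f1_score

# Made-up binary labels giving TP=6, FP=2, FN=2, TN=5 (so FP == FN).
y_true = [1]*6 + [1]*2 + [0]*2 + [0]*5
y_pred = [1]*6 + [0]*2 + [1]*2 + [0]*5

print(precision_score(y_true, y_pred))  # 6 / (6 + 2) = 0.75
print(recall_score(y_true, y_pred))     # 6 / (6 + 2) = 0.75
print(f1_score(y_true, y_pred))         # 0.75
```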
For multiclass classification problems we have the same per-class definitions, with precision, recall and F1 then averaged over the classes. If precision and recall coincide for every class (for example when each class has $FP = FN$), again all three metrics are identical.
This seems to be because of the average='weighted' option.
Refer: https://scikit-learn.org/stable/modules/generated/sklearn.metrics.precision_recall_fscore_support.html
'weighted': Calculate metrics for each label, and find their average weighted by support (the number of true instances for each label). This alters ‘macro’ to account for label imbalance; it can result in an F-score that is not between precision and recall.
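As a sketch of that behaviour (the labels below are made up so that the confusion matrix is symmetric, i.e. every class has FP = FN; the per-class precision, recall and F1 are then equal, and so are their support-weighted averages):

```python
from sklearn.metrics import precision_recall_fscore_support

# Made-up multiclass labels whose confusion matrix is symmetric,
# so every class has FP == FN:
#            pred 0  1  2
# true 0        [5, 1, 0]
# true 1        [1, 4, 1]
# true 2        [0, 1, 3]
pairs = ([(0, 0)] * 5 + [(0, 1)]
         + [(1, 0)] + [(1, 1)] * 4 + [(1, 2)]
         + [(2, 1)] + [(2, 2)] * 3)
y_true, y_pred = zip(*pairs)

pr, re, f1, _ = precision_recall_fscore_support(
    y_true, y_pred, average='weighted')
print(pr, re, f1)  # all three come out as 0.75
```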