1 post tagged with evaluation.
Displaying 1 through 1 of 1.

mccf1: R package for the MCC-F1 curve

Many fields use the receiver operating characteristic (ROC) curve and the precision-recall (PR) curve as standard evaluations of binary classification methods. ROC and PR analysis, however, often gives misleading and inflated performance evaluations, especially with an imbalanced ground truth. In our preprint, "The MCC-F1 curve: a performance evaluation technique for binary classification", we propose the MCC-F1 curve to address these drawbacks. The MCC-F1 curve combines two informative single-threshold metrics: the Matthews correlation coefficient (MCC) and the F1 score. The MCC-F1 curve more clearly differentiates good and bad classifiers, even with imbalanced ground truths. We also introduce the MCC-F1 metric, a single value that integrates many aspects of classifier performance across the whole range of classification thresholds. This project is an R package that plots MCC-F1 curves and calculates related metrics.
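To illustrate the idea behind the curve (without depending on the package's actual API, which is not described here), the sketch below sweeps classification thresholds over a score vector and computes (MCC, F1) at each one; plotting those pairs gives the MCC-F1 curve's points. Function and variable names are hypothetical, and the sketch reports raw MCC in [-1, 1], whereas the preprint rescales MCC to [0, 1] before plotting.

```python
from math import sqrt

def mcc_f1_points(y_true, scores, thresholds):
    """Return one (MCC, F1) pair per threshold: the points of an MCC-F1 curve.

    y_true: 0/1 ground-truth labels; scores: classifier scores;
    a prediction is positive when its score is >= the threshold.
    """
    points = []
    for t in thresholds:
        # Confusion-matrix counts at this threshold.
        tp = sum(1 for y, s in zip(y_true, scores) if y == 1 and s >= t)
        fp = sum(1 for y, s in zip(y_true, scores) if y == 0 and s >= t)
        fn = sum(1 for y, s in zip(y_true, scores) if y == 1 and s < t)
        tn = sum(1 for y, s in zip(y_true, scores) if y == 0 and s < t)
        # MCC, with the conventional 0 when the denominator vanishes.
        denom = sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
        mcc = (tp * tn - fp * fn) / denom if denom else 0.0
        # F1 = harmonic mean of precision and recall.
        f1 = 2 * tp / (2 * tp + fp + fn) if (2 * tp + fp + fn) else 0.0
        points.append((mcc, f1))
    return points

# Toy imbalanced example: 2 positives, 6 negatives.
y = [1, 1, 0, 0, 0, 0, 0, 0]
s = [0.9, 0.8, 0.7, 0.3, 0.2, 0.2, 0.1, 0.1]
curve = mcc_f1_points(y, s, thresholds=[0.05, 0.5, 0.75])
# At the lowest threshold everything is called positive, so F1 is inflated
# by the imbalance while MCC is not; at 0.75 this toy classifier is perfect.
```

The imbalance is exactly where the two metrics disagree, which is why plotting them against each other is informative: at a permissive threshold F1 stays at 0.4 here while MCC drops to 0.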
posted by grouse on Aug 14, 2020 - 0 comments
